Apr 16 13:56:58.167791 ip-10-0-129-3 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 13:56:58.167802 ip-10-0-129-3 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 13:56:58.167809 ip-10-0-129-3 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 13:56:58.168061 ip-10-0-129-3 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 13:57:08.358090 ip-10-0-129-3 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 13:57:08.358115 ip-10-0-129-3 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 6915460e517540c6b4e64787bb4cf3f0 --
Apr 16 13:59:36.451605 ip-10-0-129-3 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 13:59:36.870290 ip-10-0-129-3 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:36.870290 ip-10-0-129-3 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 13:59:36.870290 ip-10-0-129-3 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:36.870290 ip-10-0-129-3 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 13:59:36.870290 ip-10-0-129-3 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:36.871874 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.871780    2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 13:59:36.878165 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878146    2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:36.878165 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878165    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:36.878247 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878169    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:36.878247 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878173    2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:36.878247 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878176    2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:36.878247 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878180    2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:36.878247 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878183    2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:36.878247 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878185    2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:36.878247 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878188    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:36.878247 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878191    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:36.878247 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878193    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:36.878247 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878196    2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:36.878247 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878199    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:36.878247 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878201    2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:36.878247 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878204    2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:36.878247 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878206    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:36.878247 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878209    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:36.878247 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878212    2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:36.878247 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878214    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:36.878247 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878217    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:36.878247 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878220    2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:36.878247 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878223    2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:36.878741 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878225    2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:36.878741 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878228    2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:36.878741 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878230    2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:36.878741 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878233    2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:36.878741 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878236    2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:36.878741 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878238    2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:36.878741 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878242    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:36.878741 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878244    2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:36.878741 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878247    2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:36.878741 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878249    2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:36.878741 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878252    2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:36.878741 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878255    2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:36.878741 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878258    2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:36.878741 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878260    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:36.878741 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878263    2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:36.878741 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878279    2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:36.878741 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878282    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:36.878741 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878284    2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:36.878741 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878287    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:36.878741 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878290    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:36.879217 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878293    2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:36.879217 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878295    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:36.879217 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878299    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:36.879217 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878301    2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:36.879217 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878303    2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:36.879217 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878306    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:36.879217 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878308    2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:36.879217 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878313    2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:36.879217 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878318    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:36.879217 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878320    2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:36.879217 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878323    2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:36.879217 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878325    2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:36.879217 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878329    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:36.879217 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878331    2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:36.879217 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878334    2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:36.879217 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878336    2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:36.879217 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878339    2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:36.879217 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878343    2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:36.879217 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878345    2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:36.879217 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878348    2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:36.879722 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878351    2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:36.879722 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878354    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:36.879722 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878356    2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:36.879722 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878359    2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:36.879722 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878361    2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:36.879722 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878364    2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:36.879722 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878366    2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:36.879722 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878369    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:36.879722 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878372    2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:36.879722 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878376    2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:36.879722 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878379    2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:36.879722 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878384    2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:36.879722 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878387    2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:36.879722 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878390    2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:36.879722 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878392    2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:36.879722 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878395    2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:36.879722 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878397    2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:36.879722 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878400    2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:36.879722 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878402    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:36.880177 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878405    2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:36.880177 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878408    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:36.880177 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878410    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:36.880177 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878413    2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:36.880177 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878415    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:36.880177 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878831    2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:36.880177 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878836    2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:36.880177 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878839    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:36.880177 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878842    2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:36.880177 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878845    2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:36.880177 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878848    2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:36.880177 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878851    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:36.880177 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878853    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:36.880177 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878856    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:36.880177 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878858    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:36.880177 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878861    2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:36.880177 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878864    2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:36.880177 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878866    2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:36.880177 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878869    2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:36.880177 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878871    2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:36.880666 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878874    2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:36.880666 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878877    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:36.880666 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878879    2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:36.880666 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878882    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:36.880666 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878884    2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:36.880666 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878887    2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:36.880666 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878890    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:36.880666 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878892    2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:36.880666 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878897    2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:36.880666 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878900    2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:36.880666 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878903    2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:36.880666 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878905    2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:36.880666 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878907    2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:36.880666 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878910    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:36.880666 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878912    2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:36.880666 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878915    2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:36.880666 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878917    2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:36.880666 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878920    2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:36.880666 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878923    2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:36.881140 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878925    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:36.881140 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878928    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:36.881140 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878930    2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:36.881140 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878933    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:36.881140 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878936    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:36.881140 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878939    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:36.881140 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878942    2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:36.881140 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878945    2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:36.881140 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878947    2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:36.881140 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878949    2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:36.881140 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878952    2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:36.881140 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878954    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:36.881140 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878957    2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:36.881140 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878959    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:36.881140 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878962    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:36.881140 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878965    2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:36.881140 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878967    2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:36.881140 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878969    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:36.881140 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878973    2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:36.881140 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878977    2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:36.881677 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878979    2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:36.881677 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878982    2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:36.881677 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878985    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:36.881677 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878987    2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:36.881677 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878990    2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:36.881677 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878992    2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:36.881677 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878995    2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:36.881677 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.878998    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:36.881677 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879001    2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:36.881677 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879004    2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:36.881677 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879006    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:36.881677 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879009    2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:36.881677 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879011    2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:36.881677 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879014    2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:36.881677 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879016    2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:36.881677 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879020    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:36.881677 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879022    2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:36.881677 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879025    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:36.881677 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879027    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879030    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879032    2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879035    2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879037    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879040    2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879042    2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879045    2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879047    2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879049    2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879052    2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879055    2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879057    2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.879060    2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879142    2580 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879150    2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879159    2580 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879166    2580 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879173    2580 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879178    2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879185    2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 13:59:36.882148 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879190    2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879194    2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879197    2580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879200    2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879203    2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879206    2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879210    2580 flags.go:64] FLAG: --cgroup-root=""
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879213    2580 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879216    2580 flags.go:64] FLAG: --client-ca-file=""
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879219    2580 flags.go:64] FLAG: --cloud-config=""
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879222    2580 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879225    2580 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879229    2580 flags.go:64] FLAG: --cluster-domain=""
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879232    2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879235    2580 flags.go:64] FLAG: --config-dir=""
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879238    2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879242    2580 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879246    2580 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879249    2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879252    2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879256    2580 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879259    2580 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879279    2580 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879283    2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879286    2580 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 13:59:36.882669 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879289    2580 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879293    2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879296    2580 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879299    2580 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879302    2580 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879305    2580 flags.go:64] FLAG: --enable-server="true"
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879317    2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879977    2580 flags.go:64] FLAG: --event-burst="100"
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879981    2580 flags.go:64] FLAG: --event-qps="50"
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879985    2580 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879989    2580 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879992    2580 flags.go:64] FLAG: --eviction-hard=""
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879996    2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.879999    2580 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880002    2580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880006    2580 flags.go:64] FLAG: --eviction-soft=""
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880010    2580 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880013    2580 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880016    2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880019    2580 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880022    2580 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880025    2580 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880028    2580 flags.go:64] FLAG: --feature-gates=""
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880032    2580 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880035    2580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 13:59:36.883332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880038    2580 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880042    2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880045    2580 flags.go:64] FLAG: --healthz-port="10248"
Apr 16
13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880048 2580 flags.go:64] FLAG: --help="false" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880053 2580 flags.go:64] FLAG: --hostname-override="ip-10-0-129-3.ec2.internal" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880059 2580 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880062 2580 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880065 2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880068 2580 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880072 2580 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880075 2580 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880078 2580 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880081 2580 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880084 2580 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880088 2580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880091 2580 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880094 2580 flags.go:64] FLAG: 
--kube-reserved="" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880097 2580 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880100 2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880103 2580 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880106 2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880108 2580 flags.go:64] FLAG: --lock-file="" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880111 2580 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880114 2580 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 13:59:36.883936 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880118 2580 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880124 2580 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880127 2580 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880130 2580 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880133 2580 flags.go:64] FLAG: --logging-format="text" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880135 2580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880139 2580 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 
13:59:36.880142 2580 flags.go:64] FLAG: --manifest-url="" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880145 2580 flags.go:64] FLAG: --manifest-url-header="" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880150 2580 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880153 2580 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880157 2580 flags.go:64] FLAG: --max-pods="110" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880161 2580 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880164 2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880168 2580 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880171 2580 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880174 2580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880177 2580 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880180 2580 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880189 2580 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880192 2580 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880195 2580 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 
13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880199 2580 flags.go:64] FLAG: --pod-cidr="" Apr 16 13:59:36.884584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880202 2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880209 2580 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880212 2580 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880215 2580 flags.go:64] FLAG: --pods-per-core="0" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880218 2580 flags.go:64] FLAG: --port="10250" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880221 2580 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880224 2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ec35d5c9327ce747" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880228 2580 flags.go:64] FLAG: --qos-reserved="" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880231 2580 flags.go:64] FLAG: --read-only-port="10255" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880233 2580 flags.go:64] FLAG: --register-node="true" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880236 2580 flags.go:64] FLAG: --register-schedulable="true" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880239 2580 flags.go:64] FLAG: --register-with-taints="" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880242 2580 flags.go:64] FLAG: --registry-burst="10" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 
13:59:36.880245 2580 flags.go:64] FLAG: --registry-qps="5" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880248 2580 flags.go:64] FLAG: --reserved-cpus="" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880251 2580 flags.go:64] FLAG: --reserved-memory="" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880255 2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880257 2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880260 2580 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880263 2580 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880280 2580 flags.go:64] FLAG: --runonce="false" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880283 2580 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880287 2580 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880290 2580 flags.go:64] FLAG: --seccomp-default="false" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880293 2580 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880296 2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 13:59:36.885131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880300 2580 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880303 2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 13:59:36.885774 ip-10-0-129-3 
kubenswrapper[2580]: I0416 13:59:36.880306 2580 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880309 2580 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880312 2580 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880315 2580 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880318 2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880321 2580 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880324 2580 flags.go:64] FLAG: --system-cgroups="" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880326 2580 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880332 2580 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880335 2580 flags.go:64] FLAG: --tls-cert-file="" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880338 2580 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880342 2580 flags.go:64] FLAG: --tls-min-version="" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880345 2580 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880347 2580 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880350 2580 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 
13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880353 2580 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880356 2580 flags.go:64] FLAG: --v="2" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880360 2580 flags.go:64] FLAG: --version="false" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880364 2580 flags.go:64] FLAG: --vmodule="" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880369 2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.880372 2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880486 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880491 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:59:36.885774 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880494 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:59:36.886389 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880497 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:59:36.886389 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880499 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:59:36.886389 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880502 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:59:36.886389 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880505 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:59:36.886389 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880507 2580 feature_gate.go:328] unrecognized feature gate: 
ShortCertRotation Apr 16 13:59:36.886389 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880511 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:59:36.886389 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880514 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:59:36.886389 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880517 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:59:36.886389 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880520 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:59:36.886389 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880522 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:59:36.886389 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880525 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:59:36.886389 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880527 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:59:36.886389 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880530 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:59:36.886389 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880533 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:59:36.886389 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880535 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:59:36.886389 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880537 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:59:36.886389 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880540 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:59:36.886389 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880543 2580 
feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:59:36.886389 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880545 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:59:36.886389 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880548 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:59:36.886886 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880550 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:59:36.886886 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880553 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:59:36.886886 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880556 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:59:36.886886 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880558 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:59:36.886886 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880561 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:59:36.886886 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880564 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:59:36.886886 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880566 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:59:36.886886 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880569 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:59:36.886886 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880572 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:59:36.886886 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880574 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:59:36.886886 ip-10-0-129-3 
kubenswrapper[2580]: W0416 13:59:36.880577 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:59:36.886886 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880579 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:59:36.886886 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880582 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:59:36.886886 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880584 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:59:36.886886 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880588 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:59:36.886886 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880590 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:59:36.886886 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880593 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:59:36.886886 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880595 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:59:36.886886 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880598 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:59:36.887399 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880601 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:59:36.887399 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880603 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:59:36.887399 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880606 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:59:36.887399 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880608 2580 feature_gate.go:328] unrecognized feature gate: 
VSphereMultiDisk Apr 16 13:59:36.887399 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880612 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 13:59:36.887399 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880616 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:59:36.887399 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880619 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:59:36.887399 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880621 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:59:36.887399 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880624 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:59:36.887399 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880627 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:59:36.887399 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880629 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:59:36.887399 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880632 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:59:36.887399 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880639 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:59:36.887399 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880643 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 13:59:36.887399 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880647 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:59:36.887399 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880649 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:59:36.887399 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880653 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:59:36.887399 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880655 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:59:36.887399 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880658 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:59:36.887861 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880661 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:59:36.887861 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880663 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:59:36.887861 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880666 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:59:36.887861 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880668 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:59:36.887861 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880671 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:59:36.887861 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880674 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:59:36.887861 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880677 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:59:36.887861 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880680 2580 feature_gate.go:328] unrecognized feature gate: 
NewOLMPreflightPermissionChecks Apr 16 13:59:36.887861 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880683 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:59:36.887861 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880685 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:59:36.887861 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880688 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:59:36.887861 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880691 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:59:36.887861 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880694 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:59:36.887861 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880697 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:59:36.887861 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880700 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:59:36.887861 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880702 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:59:36.887861 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880705 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:59:36.887861 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880707 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:59:36.887861 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880710 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:59:36.887861 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880712 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:59:36.888403 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880715 2580 
feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:36.888403 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880717    2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:36.888403 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880719    2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:36.888403 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880722    2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:36.888403 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.880724    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:36.888403 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.881458    2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:36.889486 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.889461    2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 13:59:36.889520 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.889488    2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 13:59:36.889560 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889551    2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:36.889560 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889560    2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:36.889623 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889564    2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:36.889623 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889567    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:36.889623 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889570    2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:36.889623 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889573    2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:36.889623 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889575    2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:36.889623 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889578    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:36.889623 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889581    2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:36.889623 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889583    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:36.889623 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889586    2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:36.889623 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889590    2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:36.889623 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889593    2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:36.889623 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889596    2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:36.889623 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889598    2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:36.889623 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889601    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:36.889623 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889603    2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:36.889623 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889606    2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:36.889623 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889610    2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:36.889623 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889612    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:36.889623 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889615    2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:36.890080 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889617    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:36.890080 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889620    2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:36.890080 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889623    2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:36.890080 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889625    2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:36.890080 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889628    2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:36.890080 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889631    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:36.890080 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889634    2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:36.890080 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889637    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:36.890080 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889639    2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:36.890080 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889642    2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:36.890080 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889645    2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:36.890080 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889648    2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:36.890080 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889651    2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:36.890080 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889653    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:36.890080 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889656    2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:36.890080 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889658    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:36.890080 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889661    2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:36.890080 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889663    2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:36.890080 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889666    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:36.890080 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889668    2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:36.890584 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889671    2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:36.890584 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889675    2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:36.890584 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889679    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:36.890584 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889683    2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:36.890584 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889686    2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:36.890584 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889688    2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:36.890584 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889691    2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:36.890584 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889693    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:36.890584 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889696    2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:36.890584 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889698    2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:36.890584 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889701    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:36.890584 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889705    2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:36.890584 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889708    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:36.890584 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889712    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:36.890584 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889715    2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:36.890584 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889718    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:36.890584 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889720    2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:36.890584 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889723    2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:36.890584 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889726    2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:36.891036 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889728    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:36.891036 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889731    2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:36.891036 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889734    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:36.891036 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889736    2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:36.891036 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889740    2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:36.891036 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889742    2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:36.891036 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889745    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:36.891036 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889747    2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:36.891036 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889750    2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:36.891036 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889752    2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:36.891036 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889755    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:36.891036 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889757    2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:36.891036 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889760    2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:36.891036 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889762    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:36.891036 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889765    2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:36.891036 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889768    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:36.891036 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889771    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:36.891036 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889773    2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:36.891036 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889775    2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:36.891036 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889778    2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:36.891561 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889781    2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:36.891561 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889783    2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:36.891561 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889786    2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:36.891561 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889788    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:36.891561 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889791    2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:36.891561 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889794    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:36.891561 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.889801    2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:36.891561 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889921    2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:36.891561 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889926    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:36.891561 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889930    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:36.891561 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889934    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:36.891561 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889936    2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:36.891561 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889939    2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:36.891561 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889942    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:36.891561 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889945    2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:36.891936 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889948    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:36.891936 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889951    2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:36.891936 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889954    2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:36.891936 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889956    2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:36.891936 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889959    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:36.891936 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889962    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:36.891936 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889965    2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:36.891936 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889968    2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:36.891936 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889970    2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:36.891936 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889972    2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:36.891936 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889975    2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:36.891936 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889978    2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:36.891936 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889980    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:36.891936 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889982    2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:36.891936 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889985    2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:36.891936 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889987    2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:36.891936 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889990    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:36.891936 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889992    2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:36.891936 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889995    2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:36.891936 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.889997    2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:36.892434 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890000    2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:36.892434 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890003    2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:36.892434 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890005    2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:36.892434 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890008    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:36.892434 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890011    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:36.892434 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890014    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:36.892434 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890017    2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:36.892434 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890020    2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:36.892434 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890022    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:36.892434 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890025    2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:36.892434 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890027    2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:36.892434 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890030    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:36.892434 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890033    2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:36.892434 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890035    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:36.892434 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890038    2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:36.892434 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890041    2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:36.892434 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890044    2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:36.892434 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890046    2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:36.892434 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890049    2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:36.892434 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890051    2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:36.892922 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890054    2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:36.892922 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890056    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:36.892922 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890059    2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:36.892922 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890061    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:36.892922 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890064    2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:36.892922 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890066    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:36.892922 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890069    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:36.892922 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890071    2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:36.892922 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890074    2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:36.892922 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890076    2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:36.892922 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890079    2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:36.892922 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890082    2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:36.892922 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890085    2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:36.892922 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890088    2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:36.892922 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890090    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:36.892922 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890093    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:36.892922 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890097    2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:36.892922 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890099    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:36.892922 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890102    2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:36.892922 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890104    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:36.893506 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890107    2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:36.893506 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890111    2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:36.893506 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890114    2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:36.893506 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890116    2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:36.893506 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890120    2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:36.893506 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890124    2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:36.893506 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890127    2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:36.893506 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890130    2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:36.893506 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890133    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:36.893506 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890135    2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:36.893506 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890138    2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:36.893506 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890141    2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:36.893506 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890143    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:36.893506 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890146    2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:36.893506 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890148    2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:36.893506 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890151    2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:36.893506 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890153    2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:36.893506 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:36.890156    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:36.893935 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.890161    2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:36.893935 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.891675    2580 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 13:59:36.895394 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.895378    2580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 13:59:36.896406 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.896394    2580 server.go:1019] "Starting client certificate rotation"
Apr 16 13:59:36.896519 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.896500    2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:59:36.896558 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.896549    2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:59:36.926874 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.926843    2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:59:36.931097 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.931080    2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:59:36.946309 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.946281    2580 log.go:25] "Validated CRI v1 runtime API"
Apr 16 13:59:36.951259 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.951244    2580 log.go:25] "Validated CRI v1 image API"
Apr 16 13:59:36.952519 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.952504    2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 13:59:36.956246 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.956225    2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:59:36.957748 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.957714    2580 fs.go:135] Filesystem UUIDs: map[05f75103-5a21-40ad-924a-e7518cb18957:/dev/nvme0n1p3 5ee151aa-4623-403b-aca2-7b5de9adeb86:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 16 13:59:36.957811 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.957749    2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 13:59:36.963589 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.963474    2580 manager.go:217] Machine: {Timestamp:2026-04-16 13:59:36.961501119 +0000 UTC m=+0.397666841 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100233 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec214977fddbd50a5b03c0173a8688c4 SystemUUID:ec214977-fddb-d50a-5b03-c0173a8688c4 BootID:6915460e-5175-40c6-b4e6-4787bb4cf3f0 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:5c:ef:ed:b5:93 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:5c:ef:ed:b5:93 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:16:33:84:88:6d:ae Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 13:59:36.963589 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.963584    2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 13:59:36.963700 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.963674    2580 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 13:59:36.964708 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.964682    2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 13:59:36.964860 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.964712    2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-3.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 13:59:36.964905 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.964870    2580 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 13:59:36.964905 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.964879    2580 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 13:59:36.964905 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.964892    2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 13:59:36.966341 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.966328    2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 13:59:36.967411 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.967400    2580 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 13:59:36.967700 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.967689    2580 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 13:59:36.970533 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.970522    2580 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 13:59:36.970581 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.970542    2580 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 13:59:36.970581 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.970556    2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 13:59:36.970581 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.970566    2580 kubelet.go:397] "Adding apiserver pod source"
Apr 16 13:59:36.970581 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.970575    2580 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 13:59:36.972967 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.972937    2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 13:59:36.973098 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.972980    2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 13:59:36.977870 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.977845    2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 13:59:36.979541 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.979524    2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 13:59:36.981373 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.981346    2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 13:59:36.981373 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.981353    2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zmw8w"
Apr 16 13:59:36.981373 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.981373    2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 13:59:36.981509 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.981384    2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 13:59:36.981509 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.981392    2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 13:59:36.981509 ip-10-0-129-3
kubenswrapper[2580]: I0416 13:59:36.981398 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 13:59:36.981509 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.981405 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 13:59:36.981509 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.981411 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 13:59:36.981509 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.981418 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 13:59:36.981509 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.981427 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 13:59:36.981509 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.981433 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 13:59:36.981509 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.981442 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 13:59:36.981509 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.981452 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 13:59:36.982318 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.982306 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 13:59:36.982318 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.982317 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 13:59:36.983261 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:36.983224 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 13:59:36.983405 ip-10-0-129-3 kubenswrapper[2580]: E0416 
13:59:36.983384 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-3.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 13:59:36.986096 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.986082 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 13:59:36.986144 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.986124 2580 server.go:1295] "Started kubelet" Apr 16 13:59:36.986229 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.986207 2580 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 13:59:36.986345 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.986306 2580 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 13:59:36.986394 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.986376 2580 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 13:59:36.986795 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.986775 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zmw8w" Apr 16 13:59:36.987098 ip-10-0-129-3 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 13:59:36.987472 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.987458 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 13:59:36.988967 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.988951 2580 server.go:317] "Adding debug handlers to kubelet server" Apr 16 13:59:36.992613 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.992594 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 13:59:36.993195 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.993176 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 13:59:36.995391 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:36.994323 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-3.ec2.internal\" not found" Apr 16 13:59:36.995391 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.994355 2580 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 13:59:36.995391 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.994365 2580 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 13:59:36.995391 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.994510 2580 reconstruct.go:97] "Volume reconstruction finished" Apr 16 13:59:36.995391 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.994518 2580 reconciler.go:26] "Reconciler: start to sync state" Apr 16 13:59:36.995391 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.994826 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 13:59:36.995724 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.995707 2580 factory.go:55] Registering systemd factory Apr 16 13:59:36.995768 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.995729 2580 factory.go:223] Registration of the systemd container factory successfully Apr 16 13:59:36.995923 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.995907 2580 
reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:36.995977 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.995945 2580 factory.go:153] Registering CRI-O factory Apr 16 13:59:36.995977 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.995961 2580 factory.go:223] Registration of the crio container factory successfully Apr 16 13:59:36.996047 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.996025 2580 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 13:59:36.996099 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.996064 2580 factory.go:103] Registering Raw factory Apr 16 13:59:36.996099 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.996079 2580 manager.go:1196] Started watching for new ooms in manager Apr 16 13:59:36.997087 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:36.997068 2580 manager.go:319] Starting recovery of all containers Apr 16 13:59:37.000241 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:37.000211 2580 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-3.ec2.internal\" not found" node="ip-10-0-129-3.ec2.internal" Apr 16 13:59:37.000374 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.000299 2580 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-3.ec2.internal" not found Apr 16 13:59:37.000374 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:37.000333 2580 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 13:59:37.007932 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.007803 2580 manager.go:324] Recovery completed Apr 16 13:59:37.012514 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.012500 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:37.014975 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.014955 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-3.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:37.015070 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.014994 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-3.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:37.015070 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.015010 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-3.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:37.015617 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.015598 2580 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 13:59:37.015617 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.015616 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 13:59:37.015726 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.015636 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:59:37.016608 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.016592 2580 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-3.ec2.internal" not found Apr 16 13:59:37.017677 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.017665 2580 policy_none.go:49] "None policy: Start" Apr 16 13:59:37.017721 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.017682 2580 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 13:59:37.017721 ip-10-0-129-3 kubenswrapper[2580]: I0416 
13:59:37.017692 2580 state_mem.go:35] "Initializing new in-memory state store" Apr 16 13:59:37.043972 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.043954 2580 manager.go:341] "Starting Device Plugin manager" Apr 16 13:59:37.044075 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:37.044001 2580 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 13:59:37.044075 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.044011 2580 server.go:85] "Starting device plugin registration server" Apr 16 13:59:37.044315 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.044299 2580 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 13:59:37.044423 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.044314 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 13:59:37.044484 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.044430 2580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 13:59:37.044576 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.044560 2580 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 13:59:37.044576 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.044573 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 13:59:37.045044 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:37.045027 2580 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 13:59:37.045121 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:37.045070 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-3.ec2.internal\" not found" Apr 16 13:59:37.073245 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.073222 2580 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-3.ec2.internal" not found Apr 16 13:59:37.145055 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.144987 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:37.145738 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.145710 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 13:59:37.146104 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.146085 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-3.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:37.146174 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.146144 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-3.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:37.146174 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.146156 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-3.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:37.146244 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.146184 2580 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-3.ec2.internal" Apr 16 13:59:37.147343 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.147325 2580 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 13:59:37.147436 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.147356 2580 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 13:59:37.147436 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.147377 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 13:59:37.147436 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.147388 2580 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 13:59:37.147436 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:37.147430 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 13:59:37.149638 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.149613 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:37.159963 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.159935 2580 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-3.ec2.internal" Apr 16 13:59:37.159963 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:37.159962 2580 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-3.ec2.internal\": node \"ip-10-0-129-3.ec2.internal\" not found" Apr 16 13:59:37.201567 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:37.201538 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-3.ec2.internal\" not found" Apr 16 13:59:37.247584 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.247535 2580 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-3.ec2.internal"] Apr 16 13:59:37.247691 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.247663 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller 
attach/detach" Apr 16 13:59:37.249567 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.249549 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-3.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:37.249674 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.249578 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-3.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:37.249674 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.249587 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-3.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:37.250719 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.250706 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:37.250863 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.250849 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal" Apr 16 13:59:37.250907 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.250882 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:37.251472 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.251450 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-3.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:37.251566 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.251481 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-3.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:37.251566 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.251491 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-3.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:37.251566 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.251454 2580 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-3.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:37.251566 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.251550 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-3.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:37.251566 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.251567 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-3.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:37.253303 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.253287 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-3.ec2.internal" Apr 16 13:59:37.253379 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.253320 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:37.254012 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.253994 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-3.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:37.254111 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.254023 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-3.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:37.254111 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.254036 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-3.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:37.273192 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:37.273168 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-3.ec2.internal\" not found" node="ip-10-0-129-3.ec2.internal" Apr 16 13:59:37.278118 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:37.278103 2580 kubelet.go:3336] "No need to create a mirror pod, 
since failed to get node info from the cluster" err="node \"ip-10-0-129-3.ec2.internal\" not found" node="ip-10-0-129-3.ec2.internal" Apr 16 13:59:37.296389 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.296365 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/43533fa9c9e65a453c1bf1eee9f41f6d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal\" (UID: \"43533fa9c9e65a453c1bf1eee9f41f6d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal" Apr 16 13:59:37.296495 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.296394 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/43533fa9c9e65a453c1bf1eee9f41f6d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal\" (UID: \"43533fa9c9e65a453c1bf1eee9f41f6d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal" Apr 16 13:59:37.296495 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.296416 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0a36747d36072480352ac0833ae0f93c-config\") pod \"kube-apiserver-proxy-ip-10-0-129-3.ec2.internal\" (UID: \"0a36747d36072480352ac0833ae0f93c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-3.ec2.internal" Apr 16 13:59:37.302462 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:37.302444 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-3.ec2.internal\" not found" Apr 16 13:59:37.397446 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.397364 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/43533fa9c9e65a453c1bf1eee9f41f6d-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal\" (UID: \"43533fa9c9e65a453c1bf1eee9f41f6d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal" Apr 16 13:59:37.397446 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.397406 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/43533fa9c9e65a453c1bf1eee9f41f6d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal\" (UID: \"43533fa9c9e65a453c1bf1eee9f41f6d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal" Apr 16 13:59:37.397446 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.397430 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0a36747d36072480352ac0833ae0f93c-config\") pod \"kube-apiserver-proxy-ip-10-0-129-3.ec2.internal\" (UID: \"0a36747d36072480352ac0833ae0f93c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-3.ec2.internal" Apr 16 13:59:37.397642 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.397473 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/43533fa9c9e65a453c1bf1eee9f41f6d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal\" (UID: \"43533fa9c9e65a453c1bf1eee9f41f6d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal" Apr 16 13:59:37.397642 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.397473 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/43533fa9c9e65a453c1bf1eee9f41f6d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal\" (UID: \"43533fa9c9e65a453c1bf1eee9f41f6d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal" Apr 16 13:59:37.397642 
ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.397477 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0a36747d36072480352ac0833ae0f93c-config\") pod \"kube-apiserver-proxy-ip-10-0-129-3.ec2.internal\" (UID: \"0a36747d36072480352ac0833ae0f93c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-3.ec2.internal" Apr 16 13:59:37.402923 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:37.402904 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-3.ec2.internal\" not found" Apr 16 13:59:37.503543 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:37.503510 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-3.ec2.internal\" not found" Apr 16 13:59:37.576766 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.576726 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal" Apr 16 13:59:37.581167 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.581149 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-3.ec2.internal" Apr 16 13:59:37.603623 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:37.603592 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-3.ec2.internal\" not found" Apr 16 13:59:37.704306 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:37.704196 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-3.ec2.internal\" not found" Apr 16 13:59:37.804820 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:37.804786 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-3.ec2.internal\" not found" Apr 16 13:59:37.896356 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.896318 2580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 13:59:37.896943 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.896502 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 13:59:37.896943 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.896513 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 13:59:37.905597 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:37.905574 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-3.ec2.internal\" not found" Apr 16 13:59:37.989249 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.989207 2580 certificate_manager.go:715] "Certificate rotation deadline 
determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 13:54:36 +0000 UTC" deadline="2027-12-24 23:25:08.301074128 +0000 UTC" Apr 16 13:59:37.989249 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.989242 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14817h25m30.311834846s" Apr 16 13:59:37.993362 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:37.993349 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 13:59:38.006182 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:38.006149 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-3.ec2.internal\" not found" Apr 16 13:59:38.021553 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.021524 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 13:59:38.041764 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.041735 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-4db22" Apr 16 13:59:38.051855 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.051837 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-4db22" Apr 16 13:59:38.085614 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:38.085582 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43533fa9c9e65a453c1bf1eee9f41f6d.slice/crio-a842c22fb3835e41d175d8e27db45d25a825b65294052dc46e4cd2cb5d8d713a WatchSource:0}: Error finding container a842c22fb3835e41d175d8e27db45d25a825b65294052dc46e4cd2cb5d8d713a: Status 404 returned error can't find the container with id 
a842c22fb3835e41d175d8e27db45d25a825b65294052dc46e4cd2cb5d8d713a Apr 16 13:59:38.085950 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:38.085936 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a36747d36072480352ac0833ae0f93c.slice/crio-6b82a444506844d486d2a1421bb58a099ce7f5f95a31adc20daeacd4165d2698 WatchSource:0}: Error finding container 6b82a444506844d486d2a1421bb58a099ce7f5f95a31adc20daeacd4165d2698: Status 404 returned error can't find the container with id 6b82a444506844d486d2a1421bb58a099ce7f5f95a31adc20daeacd4165d2698 Apr 16 13:59:38.089823 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.089808 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 13:59:38.106881 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:38.106854 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-3.ec2.internal\" not found" Apr 16 13:59:38.151053 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.151002 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-3.ec2.internal" event={"ID":"0a36747d36072480352ac0833ae0f93c","Type":"ContainerStarted","Data":"6b82a444506844d486d2a1421bb58a099ce7f5f95a31adc20daeacd4165d2698"} Apr 16 13:59:38.151955 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.151933 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal" event={"ID":"43533fa9c9e65a453c1bf1eee9f41f6d","Type":"ContainerStarted","Data":"a842c22fb3835e41d175d8e27db45d25a825b65294052dc46e4cd2cb5d8d713a"} Apr 16 13:59:38.207516 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:38.207487 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-3.ec2.internal\" not found" Apr 16 13:59:38.252714 ip-10-0-129-3 kubenswrapper[2580]: 
I0416 13:59:38.252643 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:38.259565 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.258316 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:38.293461 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.293424 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-3.ec2.internal" Apr 16 13:59:38.303919 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.303896 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 13:59:38.304699 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.304685 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal" Apr 16 13:59:38.314571 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.314547 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 13:59:38.703463 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.703369 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:38.971735 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.971662 2580 apiserver.go:52] "Watching apiserver" Apr 16 13:59:38.979779 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.979751 2580 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 13:59:38.980863 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.980835 2580 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/konnectivity-agent-69zrg","kube-system/kube-apiserver-proxy-ip-10-0-129-3.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg","openshift-cluster-node-tuning-operator/tuned-wk4ng","openshift-image-registry/node-ca-xwnfr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal","openshift-multus/multus-h7xrd","openshift-multus/network-metrics-daemon-cptr8","openshift-dns/node-resolver-thzkh","openshift-multus/multus-additional-cni-plugins-86xbr","openshift-network-diagnostics/network-check-target-zg9zc","openshift-network-operator/iptables-alerter-wvs7j","openshift-ovn-kubernetes/ovnkube-node-kgf8n"] Apr 16 13:59:38.985160 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.985135 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-h7xrd" Apr 16 13:59:38.988004 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.987456 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:38.988354 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.988200 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 13:59:38.988354 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.988200 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 13:59:38.992074 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.988705 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 13:59:38.992074 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.991768 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:38.992074 ip-10-0-129-3 kubenswrapper[2580]: 
I0416 13:59:38.991801 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:38.992074 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.991809 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 13:59:38.992074 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.991948 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-wnt4r\"" Apr 16 13:59:38.992074 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.992024 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-s8lpb\"" Apr 16 13:59:38.993664 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.993638 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xwnfr" Apr 16 13:59:38.996599 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.996570 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:38.996815 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.996767 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-69zrg" Apr 16 13:59:38.997312 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.997296 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 13:59:38.997312 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.997307 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 13:59:38.997445 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.997344 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-xbllr\"" Apr 16 13:59:38.997445 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.997418 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 13:59:38.998913 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:38.998890 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 13:59:38.999019 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:38.998962 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cptr8" podUID="aef30458-23ff-40ab-ad5a-ae58af58ca82" Apr 16 13:59:39.001081 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.001061 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-thzkh" Apr 16 13:59:39.001787 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.001765 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 13:59:39.002368 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.002347 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 13:59:39.002529 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.002508 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 13:59:39.002678 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.002664 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9mjv8\"" Apr 16 13:59:39.002818 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.002800 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 13:59:39.002953 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.002934 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 13:59:39.003108 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.003078 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mj4m8\"" Apr 16 13:59:39.003615 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.003594 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-25hf8\"" Apr 16 13:59:39.003835 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.003821 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 13:59:39.003946 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.003934 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 13:59:39.004659 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.004639 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-86xbr" Apr 16 13:59:39.006564 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005420 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-hostroot\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.006564 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005455 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-multus-conf-dir\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.006564 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005491 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/19aa0590-52b8-463f-a21b-db3a6833a0ca-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:39.006564 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005525 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/19aa0590-52b8-463f-a21b-db3a6833a0ca-registration-dir\") pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:39.006564 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005545 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-sysctl-d\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.006564 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005562 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-tuned\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.006564 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005602 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-multus-cni-dir\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.006564 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005671 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-multus-socket-dir-parent\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.006564 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005700 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-host-run-k8s-cni-cncf-io\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.006564 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005716 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-system-cni-dir\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.006564 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005744 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-multus-daemon-config\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.006564 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005759 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/19aa0590-52b8-463f-a21b-db3a6833a0ca-etc-selinux\") pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:39.006564 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005781 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/19aa0590-52b8-463f-a21b-db3a6833a0ca-sys-fs\") pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 
13:59:39.006564 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005805 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4ae873fc-9131-409f-a02c-21eb56f20fed-konnectivity-ca\") pod \"konnectivity-agent-69zrg\" (UID: \"4ae873fc-9131-409f-a02c-21eb56f20fed\") " pod="kube-system/konnectivity-agent-69zrg" Apr 16 13:59:39.006564 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005818 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-etc-kubernetes\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.006564 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005832 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-modprobe-d\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.006564 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005846 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-sys\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.007432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005870 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-var-lib-kubelet\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " 
pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.007432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005882 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-host\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.007432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005895 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-tmp\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.007432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005908 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd-serviceca\") pod \"node-ca-xwnfr\" (UID: \"e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd\") " pod="openshift-image-registry/node-ca-xwnfr" Apr 16 13:59:39.007432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005922 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgrv6\" (UniqueName: \"kubernetes.io/projected/e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd-kube-api-access-kgrv6\") pod \"node-ca-xwnfr\" (UID: \"e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd\") " pod="openshift-image-registry/node-ca-xwnfr" Apr 16 13:59:39.007432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005937 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-host-run-netns\") pod \"multus-h7xrd\" (UID: 
\"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.007432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005951 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb694\" (UniqueName: \"kubernetes.io/projected/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-kube-api-access-vb694\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.007432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005965 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/19aa0590-52b8-463f-a21b-db3a6833a0ca-socket-dir\") pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:39.007432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005983 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-sysctl-conf\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.007432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.005997 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-systemd\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.007432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.006021 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/4ae873fc-9131-409f-a02c-21eb56f20fed-agent-certs\") pod \"konnectivity-agent-69zrg\" (UID: \"4ae873fc-9131-409f-a02c-21eb56f20fed\") " pod="kube-system/konnectivity-agent-69zrg" Apr 16 13:59:39.007432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.006036 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-host-var-lib-cni-multus\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.007432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.006049 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-host-var-lib-kubelet\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.007432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.006081 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs7cj\" (UniqueName: \"kubernetes.io/projected/19aa0590-52b8-463f-a21b-db3a6833a0ca-kube-api-access-fs7cj\") pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:39.007432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.006094 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-sysconfig\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.007432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.006109 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-kubernetes\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.007432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.006122 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-run\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.008126 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.006139 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-cnibin\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.008126 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.006161 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-cni-binary-copy\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.008126 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.006183 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-host-var-lib-cni-bin\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.008126 ip-10-0-129-3 kubenswrapper[2580]: I0416 
13:59:39.006213 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/19aa0590-52b8-463f-a21b-db3a6833a0ca-device-dir\") pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:39.008126 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.006232 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-os-release\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.008126 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.006282 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-host-run-multus-certs\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.008126 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.006308 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-lib-modules\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.008126 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.006334 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk8kx\" (UniqueName: \"kubernetes.io/projected/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-kube-api-access-hk8kx\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " 
pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.008126 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.006379 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd-host\") pod \"node-ca-xwnfr\" (UID: \"e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd\") " pod="openshift-image-registry/node-ca-xwnfr" Apr 16 13:59:39.008126 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.006754 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-n5s5b\"" Apr 16 13:59:39.008126 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.007231 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 13:59:39.008126 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.007565 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 13:59:39.009511 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.009061 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 13:59:39.009511 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:39.009131 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zg9zc" podUID="a4cc786e-e069-4dfc-9be8-98f1a73b9bcb" Apr 16 13:59:39.017769 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.017742 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-wvs7j" Apr 16 13:59:39.018923 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.018903 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.022240 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.021915 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 13:59:39.022240 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.021990 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:39.022240 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.022235 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 13:59:39.022494 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.022472 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 13:59:39.022855 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.022779 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:39.023143 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.023015 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 13:59:39.023438 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.023193 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-vqjl5\"" Apr 16 13:59:39.023438 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.023245 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 13:59:39.023438 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.023348 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hgr6q\"" Apr 16 13:59:39.023438 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.023425 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 13:59:39.023438 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.023434 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 13:59:39.052559 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.052494 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:38 +0000 UTC" deadline="2027-10-31 20:20:55.207337042 +0000 UTC" Apr 16 13:59:39.052559 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.052517 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13518h21m16.154822276s" Apr 16 13:59:39.096986 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.096959 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 13:59:39.107240 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107202 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-multus-socket-dir-parent\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.107432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107246 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-ovn-node-metrics-cert\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.107432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107292 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/acddcee2-ab55-4a6b-8b63-9793ffc842d3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr" Apr 16 13:59:39.107432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107320 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs\") pod \"network-metrics-daemon-cptr8\" (UID: \"aef30458-23ff-40ab-ad5a-ae58af58ca82\") " pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 13:59:39.107432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107347 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-system-cni-dir\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.107432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107373 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4ae873fc-9131-409f-a02c-21eb56f20fed-konnectivity-ca\") pod \"konnectivity-agent-69zrg\" (UID: \"4ae873fc-9131-409f-a02c-21eb56f20fed\") " pod="kube-system/konnectivity-agent-69zrg" Apr 16 13:59:39.107432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107398 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9pgl\" (UniqueName: \"kubernetes.io/projected/aef30458-23ff-40ab-ad5a-ae58af58ca82-kube-api-access-f9pgl\") pod \"network-metrics-daemon-cptr8\" (UID: \"aef30458-23ff-40ab-ad5a-ae58af58ca82\") " pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 13:59:39.107432 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107419 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-run-netns\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.107768 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107443 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd-serviceca\") pod \"node-ca-xwnfr\" (UID: \"e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd\") " pod="openshift-image-registry/node-ca-xwnfr" Apr 16 13:59:39.107768 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107468 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgrv6\" (UniqueName: \"kubernetes.io/projected/e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd-kube-api-access-kgrv6\") pod \"node-ca-xwnfr\" (UID: \"e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd\") " pod="openshift-image-registry/node-ca-xwnfr" Apr 16 13:59:39.107768 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107494 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-kubelet\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 
13:59:39.107768 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107519 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-cni-netd\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.107768 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107544 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/acddcee2-ab55-4a6b-8b63-9793ffc842d3-os-release\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr" Apr 16 13:59:39.107768 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107569 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/acddcee2-ab55-4a6b-8b63-9793ffc842d3-cni-binary-copy\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr" Apr 16 13:59:39.107768 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107608 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-host-run-netns\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.107768 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107635 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vb694\" (UniqueName: \"kubernetes.io/projected/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-kube-api-access-vb694\") pod \"multus-h7xrd\" (UID: 
\"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.107768 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107660 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-systemd\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.107768 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107686 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-systemd-units\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.107768 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107726 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-sysconfig\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.107768 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107750 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-run\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.108349 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107775 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/acddcee2-ab55-4a6b-8b63-9793ffc842d3-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr" Apr 16 13:59:39.108349 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107801 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-cni-binary-copy\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.108349 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107850 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-host-var-lib-cni-bin\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.108349 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107877 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f34922d5-7862-4337-9408-0036909c6059-host-slash\") pod \"iptables-alerter-wvs7j\" (UID: \"f34922d5-7862-4337-9408-0036909c6059\") " pod="openshift-network-operator/iptables-alerter-wvs7j" Apr 16 13:59:39.108349 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107903 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-node-log\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.108349 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107928 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-lib-modules\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.108349 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107955 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hk8kx\" (UniqueName: \"kubernetes.io/projected/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-kube-api-access-hk8kx\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.108349 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.107985 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-env-overrides\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.108349 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108024 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-multus-conf-dir\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.108349 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108050 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/19aa0590-52b8-463f-a21b-db3a6833a0ca-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:39.108349 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108080 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/19aa0590-52b8-463f-a21b-db3a6833a0ca-registration-dir\") pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:39.108349 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108105 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-sysctl-d\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.108349 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108132 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxfh5\" (UniqueName: \"kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5\") pod \"network-check-target-zg9zc\" (UID: \"a4cc786e-e069-4dfc-9be8-98f1a73b9bcb\") " pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 13:59:39.108349 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108162 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-etc-openvswitch\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.108349 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108189 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-ovnkube-config\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.108349 
ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108214 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-ovnkube-script-lib\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.108349 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108240 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-multus-cni-dir\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.109098 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108288 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-host-run-k8s-cni-cncf-io\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.109098 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108328 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f34922d5-7862-4337-9408-0036909c6059-iptables-alerter-script\") pod \"iptables-alerter-wvs7j\" (UID: \"f34922d5-7862-4337-9408-0036909c6059\") " pod="openshift-network-operator/iptables-alerter-wvs7j" Apr 16 13:59:39.109098 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108353 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-cni-bin\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.109098 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108378 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/acddcee2-ab55-4a6b-8b63-9793ffc842d3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr" Apr 16 13:59:39.109098 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108410 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f2a769c2-0080-45a0-983a-5c1bcf200faf-tmp-dir\") pod \"node-resolver-thzkh\" (UID: \"f2a769c2-0080-45a0-983a-5c1bcf200faf\") " pod="openshift-dns/node-resolver-thzkh" Apr 16 13:59:39.109098 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108438 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-multus-daemon-config\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.109098 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108466 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/19aa0590-52b8-463f-a21b-db3a6833a0ca-etc-selinux\") pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:39.109098 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108507 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/19aa0590-52b8-463f-a21b-db3a6833a0ca-sys-fs\") 
pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:39.109098 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108535 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-run-openvswitch\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.109098 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108563 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.109098 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108590 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f2a769c2-0080-45a0-983a-5c1bcf200faf-hosts-file\") pod \"node-resolver-thzkh\" (UID: \"f2a769c2-0080-45a0-983a-5c1bcf200faf\") " pod="openshift-dns/node-resolver-thzkh" Apr 16 13:59:39.109098 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108618 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-etc-kubernetes\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.109098 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108649 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-modprobe-d\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.109098 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108673 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-sys\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.109098 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108698 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-var-lib-kubelet\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.109098 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108723 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-host\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.109098 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108748 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-tmp\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.109861 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108773 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/4ae873fc-9131-409f-a02c-21eb56f20fed-agent-certs\") pod \"konnectivity-agent-69zrg\" (UID: \"4ae873fc-9131-409f-a02c-21eb56f20fed\") " pod="kube-system/konnectivity-agent-69zrg" Apr 16 13:59:39.109861 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108803 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4xwr\" (UniqueName: \"kubernetes.io/projected/f34922d5-7862-4337-9408-0036909c6059-kube-api-access-h4xwr\") pod \"iptables-alerter-wvs7j\" (UID: \"f34922d5-7862-4337-9408-0036909c6059\") " pod="openshift-network-operator/iptables-alerter-wvs7j" Apr 16 13:59:39.109861 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108832 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/19aa0590-52b8-463f-a21b-db3a6833a0ca-socket-dir\") pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:39.109861 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108860 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-sysctl-conf\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.109861 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108887 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-var-lib-openvswitch\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.109861 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108912 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-run-ovn\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.109861 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108936 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-run-ovn-kubernetes\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.109861 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108964 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfvqn\" (UniqueName: \"kubernetes.io/projected/acddcee2-ab55-4a6b-8b63-9793ffc842d3-kube-api-access-zfvqn\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr" Apr 16 13:59:39.109861 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.108997 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-host-var-lib-cni-multus\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.109861 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.109023 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-host-var-lib-kubelet\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " 
pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.109861 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.109051 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fs7cj\" (UniqueName: \"kubernetes.io/projected/19aa0590-52b8-463f-a21b-db3a6833a0ca-kube-api-access-fs7cj\") pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:39.109861 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.109081 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-kubernetes\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.109861 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.109117 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq9mk\" (UniqueName: \"kubernetes.io/projected/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-kube-api-access-qq9mk\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.109861 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.109145 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h62x8\" (UniqueName: \"kubernetes.io/projected/f2a769c2-0080-45a0-983a-5c1bcf200faf-kube-api-access-h62x8\") pod \"node-resolver-thzkh\" (UID: \"f2a769c2-0080-45a0-983a-5c1bcf200faf\") " pod="openshift-dns/node-resolver-thzkh" Apr 16 13:59:39.109861 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.109176 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-cnibin\") pod 
\"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.109861 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.109203 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/19aa0590-52b8-463f-a21b-db3a6833a0ca-device-dir\") pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:39.109861 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.109230 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-run-systemd\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.110643 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.109285 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-os-release\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.110643 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.109313 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-host-run-multus-certs\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.110643 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.109367 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd-host\") pod \"node-ca-xwnfr\" (UID: 
\"e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd\") " pod="openshift-image-registry/node-ca-xwnfr" Apr 16 13:59:39.110643 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.109420 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-slash\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.110643 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.109448 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-log-socket\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.110643 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.109477 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-hostroot\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.110643 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.109505 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-tuned\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.110643 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.109533 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/acddcee2-ab55-4a6b-8b63-9793ffc842d3-system-cni-dir\") pod 
\"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr" Apr 16 13:59:39.110643 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.109561 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/acddcee2-ab55-4a6b-8b63-9793ffc842d3-cnibin\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr" Apr 16 13:59:39.110643 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.109676 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-multus-socket-dir-parent\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.110643 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.109749 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-system-cni-dir\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.110643 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.110086 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/19aa0590-52b8-463f-a21b-db3a6833a0ca-etc-selinux\") pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:39.110643 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.110101 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-lib-modules\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.110643 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.110156 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/19aa0590-52b8-463f-a21b-db3a6833a0ca-sys-fs\") pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:39.110643 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.110171 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-etc-kubernetes\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.110643 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.110287 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-modprobe-d\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.110643 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.110342 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-sys\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.110643 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.110387 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-var-lib-kubelet\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.112066 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.110412 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-multus-conf-dir\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.112066 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.110443 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/19aa0590-52b8-463f-a21b-db3a6833a0ca-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:39.112066 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.110446 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-host\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.112066 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.110512 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/19aa0590-52b8-463f-a21b-db3a6833a0ca-registration-dir\") pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:39.112066 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.110563 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-host-run-k8s-cni-cncf-io\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.112066 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.110728 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-sysctl-d\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.112066 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.110766 2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 13:59:39.112066 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.110778 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-host-run-netns\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.112066 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.111092 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-sysconfig\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.112066 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.111195 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd-serviceca\") pod \"node-ca-xwnfr\" (UID: \"e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd\") " 
pod="openshift-image-registry/node-ca-xwnfr" Apr 16 13:59:39.112066 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.111368 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-systemd\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.112066 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.111439 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/19aa0590-52b8-463f-a21b-db3a6833a0ca-socket-dir\") pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:39.112066 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.111515 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-cnibin\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.112066 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.111523 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-sysctl-conf\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.112066 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.111562 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/19aa0590-52b8-463f-a21b-db3a6833a0ca-device-dir\") pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:39.112066 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.111588 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-host-var-lib-cni-multus\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.112066 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.111625 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-os-release\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.112066 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.111628 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-host-var-lib-kubelet\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.112910 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.111660 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-host-run-multus-certs\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.112910 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.111693 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd-host\") pod \"node-ca-xwnfr\" (UID: \"e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd\") " pod="openshift-image-registry/node-ca-xwnfr" Apr 16 13:59:39.112910 ip-10-0-129-3 
kubenswrapper[2580]: I0416 13:59:39.111737 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-hostroot\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.112910 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.111864 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-kubernetes\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.112910 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.111907 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-run\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.112910 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.111951 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-host-var-lib-cni-bin\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.112910 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.112405 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-cni-binary-copy\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.112910 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.112496 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4ae873fc-9131-409f-a02c-21eb56f20fed-konnectivity-ca\") pod \"konnectivity-agent-69zrg\" (UID: \"4ae873fc-9131-409f-a02c-21eb56f20fed\") " pod="kube-system/konnectivity-agent-69zrg" Apr 16 13:59:39.112910 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.112597 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-multus-cni-dir\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.112910 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.112640 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-multus-daemon-config\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.115635 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.115536 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-tmp\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.115854 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.115832 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4ae873fc-9131-409f-a02c-21eb56f20fed-agent-certs\") pod \"konnectivity-agent-69zrg\" (UID: \"4ae873fc-9131-409f-a02c-21eb56f20fed\") " pod="kube-system/konnectivity-agent-69zrg" Apr 16 13:59:39.116466 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.116435 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-etc-tuned\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.122500 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.122476 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk8kx\" (UniqueName: \"kubernetes.io/projected/5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb-kube-api-access-hk8kx\") pod \"tuned-wk4ng\" (UID: \"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb\") " pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" Apr 16 13:59:39.123448 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.123158 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb694\" (UniqueName: \"kubernetes.io/projected/5fa66a35-a4c8-4e4b-a65c-58bfea71f741-kube-api-access-vb694\") pod \"multus-h7xrd\" (UID: \"5fa66a35-a4c8-4e4b-a65c-58bfea71f741\") " pod="openshift-multus/multus-h7xrd" Apr 16 13:59:39.123448 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.123237 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs7cj\" (UniqueName: \"kubernetes.io/projected/19aa0590-52b8-463f-a21b-db3a6833a0ca-kube-api-access-fs7cj\") pod \"aws-ebs-csi-driver-node-xjqgg\" (UID: \"19aa0590-52b8-463f-a21b-db3a6833a0ca\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" Apr 16 13:59:39.124387 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.124353 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgrv6\" (UniqueName: \"kubernetes.io/projected/e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd-kube-api-access-kgrv6\") pod \"node-ca-xwnfr\" (UID: \"e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd\") " pod="openshift-image-registry/node-ca-xwnfr" Apr 16 13:59:39.211972 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.209887 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-env-overrides\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.211972 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.209948 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxfh5\" (UniqueName: \"kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5\") pod \"network-check-target-zg9zc\" (UID: \"a4cc786e-e069-4dfc-9be8-98f1a73b9bcb\") " pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 13:59:39.211972 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.209974 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-etc-openvswitch\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.211972 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.209999 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-ovnkube-config\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.211972 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210016 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-ovnkube-script-lib\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.211972 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210031 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f34922d5-7862-4337-9408-0036909c6059-iptables-alerter-script\") pod \"iptables-alerter-wvs7j\" (UID: \"f34922d5-7862-4337-9408-0036909c6059\") " pod="openshift-network-operator/iptables-alerter-wvs7j" Apr 16 13:59:39.211972 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210052 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-cni-bin\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.211972 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210075 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/acddcee2-ab55-4a6b-8b63-9793ffc842d3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr" Apr 16 13:59:39.211972 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210094 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f2a769c2-0080-45a0-983a-5c1bcf200faf-tmp-dir\") pod \"node-resolver-thzkh\" (UID: \"f2a769c2-0080-45a0-983a-5c1bcf200faf\") " pod="openshift-dns/node-resolver-thzkh" Apr 16 13:59:39.211972 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210110 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-run-openvswitch\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.211972 ip-10-0-129-3 
kubenswrapper[2580]: I0416 13:59:39.210130 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.211972 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210167 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f2a769c2-0080-45a0-983a-5c1bcf200faf-hosts-file\") pod \"node-resolver-thzkh\" (UID: \"f2a769c2-0080-45a0-983a-5c1bcf200faf\") " pod="openshift-dns/node-resolver-thzkh" Apr 16 13:59:39.211972 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210196 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4xwr\" (UniqueName: \"kubernetes.io/projected/f34922d5-7862-4337-9408-0036909c6059-kube-api-access-h4xwr\") pod \"iptables-alerter-wvs7j\" (UID: \"f34922d5-7862-4337-9408-0036909c6059\") " pod="openshift-network-operator/iptables-alerter-wvs7j" Apr 16 13:59:39.211972 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210219 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-var-lib-openvswitch\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.211972 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210238 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-run-ovn\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.211972 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210284 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-run-ovn-kubernetes\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.211972 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210311 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfvqn\" (UniqueName: \"kubernetes.io/projected/acddcee2-ab55-4a6b-8b63-9793ffc842d3-kube-api-access-zfvqn\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr" Apr 16 13:59:39.212880 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210330 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qq9mk\" (UniqueName: \"kubernetes.io/projected/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-kube-api-access-qq9mk\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.212880 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210345 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h62x8\" (UniqueName: \"kubernetes.io/projected/f2a769c2-0080-45a0-983a-5c1bcf200faf-kube-api-access-h62x8\") pod \"node-resolver-thzkh\" (UID: \"f2a769c2-0080-45a0-983a-5c1bcf200faf\") " pod="openshift-dns/node-resolver-thzkh" Apr 16 13:59:39.212880 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210369 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-run-systemd\") 
pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.212880 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210396 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-slash\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.212880 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210417 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-log-socket\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.212880 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210436 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/acddcee2-ab55-4a6b-8b63-9793ffc842d3-system-cni-dir\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr" Apr 16 13:59:39.212880 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210451 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/acddcee2-ab55-4a6b-8b63-9793ffc842d3-cnibin\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr" Apr 16 13:59:39.212880 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210467 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-ovn-node-metrics-cert\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.212880 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210483 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/acddcee2-ab55-4a6b-8b63-9793ffc842d3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr" Apr 16 13:59:39.212880 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210499 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs\") pod \"network-metrics-daemon-cptr8\" (UID: \"aef30458-23ff-40ab-ad5a-ae58af58ca82\") " pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 13:59:39.212880 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210516 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9pgl\" (UniqueName: \"kubernetes.io/projected/aef30458-23ff-40ab-ad5a-ae58af58ca82-kube-api-access-f9pgl\") pod \"network-metrics-daemon-cptr8\" (UID: \"aef30458-23ff-40ab-ad5a-ae58af58ca82\") " pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 13:59:39.212880 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210540 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-run-netns\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.212880 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210575 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-kubelet\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.212880 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210601 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-cni-netd\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.212880 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210626 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/acddcee2-ab55-4a6b-8b63-9793ffc842d3-os-release\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr" Apr 16 13:59:39.212880 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210649 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/acddcee2-ab55-4a6b-8b63-9793ffc842d3-cni-binary-copy\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr" Apr 16 13:59:39.212880 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210673 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-systemd-units\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 13:59:39.213649 ip-10-0-129-3 kubenswrapper[2580]: I0416 
13:59:39.210703 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/acddcee2-ab55-4a6b-8b63-9793ffc842d3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr"
Apr 16 13:59:39.213649 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210757 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f34922d5-7862-4337-9408-0036909c6059-host-slash\") pod \"iptables-alerter-wvs7j\" (UID: \"f34922d5-7862-4337-9408-0036909c6059\") " pod="openshift-network-operator/iptables-alerter-wvs7j"
Apr 16 13:59:39.213649 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210784 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-node-log\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.213649 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.210870 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-node-log\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.213649 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.211404 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-env-overrides\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.213649 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.211473 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-run-systemd\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.213649 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.211518 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-slash\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.213649 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.211554 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-etc-openvswitch\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.213649 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.211560 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-log-socket\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.213649 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.211597 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/acddcee2-ab55-4a6b-8b63-9793ffc842d3-system-cni-dir\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr"
Apr 16 13:59:39.213649 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.211631 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/acddcee2-ab55-4a6b-8b63-9793ffc842d3-cnibin\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr"
Apr 16 13:59:39.213649 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.212536 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-ovnkube-config\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.213649 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.212547 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-ovnkube-script-lib\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.213649 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:39.212599 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:39.213649 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:39.212678 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs podName:aef30458-23ff-40ab-ad5a-ae58af58ca82 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:39.712655652 +0000 UTC m=+3.148821381 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs") pod "network-metrics-daemon-cptr8" (UID: "aef30458-23ff-40ab-ad5a-ae58af58ca82") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:39.213649 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.212995 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-run-netns\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.213649 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.213028 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-kubelet\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.214420 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.213058 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-cni-netd\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.214420 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.213078 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f34922d5-7862-4337-9408-0036909c6059-iptables-alerter-script\") pod \"iptables-alerter-wvs7j\" (UID: \"f34922d5-7862-4337-9408-0036909c6059\") " pod="openshift-network-operator/iptables-alerter-wvs7j"
Apr 16 13:59:39.214420 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.213089 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/acddcee2-ab55-4a6b-8b63-9793ffc842d3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr"
Apr 16 13:59:39.214420 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.213102 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/acddcee2-ab55-4a6b-8b63-9793ffc842d3-os-release\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr"
Apr 16 13:59:39.214420 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.213283 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f2a769c2-0080-45a0-983a-5c1bcf200faf-tmp-dir\") pod \"node-resolver-thzkh\" (UID: \"f2a769c2-0080-45a0-983a-5c1bcf200faf\") " pod="openshift-dns/node-resolver-thzkh"
Apr 16 13:59:39.214420 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.213320 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-run-openvswitch\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.214420 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.213347 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.214420 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.213379 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f2a769c2-0080-45a0-983a-5c1bcf200faf-hosts-file\") pod \"node-resolver-thzkh\" (UID: \"f2a769c2-0080-45a0-983a-5c1bcf200faf\") " pod="openshift-dns/node-resolver-thzkh"
Apr 16 13:59:39.214420 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.213457 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/acddcee2-ab55-4a6b-8b63-9793ffc842d3-cni-binary-copy\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr"
Apr 16 13:59:39.214420 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.213489 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-run-ovn\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.214420 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.213497 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-var-lib-openvswitch\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.214420 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.213534 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-systemd-units\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.214420 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.213738 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-run-ovn-kubernetes\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.214420 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.212605 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-host-cni-bin\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.214420 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.213775 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f34922d5-7862-4337-9408-0036909c6059-host-slash\") pod \"iptables-alerter-wvs7j\" (UID: \"f34922d5-7862-4337-9408-0036909c6059\") " pod="openshift-network-operator/iptables-alerter-wvs7j"
Apr 16 13:59:39.214420 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.213814 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/acddcee2-ab55-4a6b-8b63-9793ffc842d3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr"
Apr 16 13:59:39.214420 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.213871 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/acddcee2-ab55-4a6b-8b63-9793ffc842d3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr"
Apr 16 13:59:39.215998 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.215971 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-ovn-node-metrics-cert\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.220796 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:39.220774 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:39.220897 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:39.220800 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:39.220897 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:39.220813 2580 projected.go:194] Error preparing data for projected volume kube-api-access-pxfh5 for pod openshift-network-diagnostics/network-check-target-zg9zc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:39.220897 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:39.220877 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5 podName:a4cc786e-e069-4dfc-9be8-98f1a73b9bcb nodeName:}" failed. No retries permitted until 2026-04-16 13:59:39.720858694 +0000 UTC m=+3.157024405 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-pxfh5" (UniqueName: "kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5") pod "network-check-target-zg9zc" (UID: "a4cc786e-e069-4dfc-9be8-98f1a73b9bcb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:39.224658 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.224593 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h62x8\" (UniqueName: \"kubernetes.io/projected/f2a769c2-0080-45a0-983a-5c1bcf200faf-kube-api-access-h62x8\") pod \"node-resolver-thzkh\" (UID: \"f2a769c2-0080-45a0-983a-5c1bcf200faf\") " pod="openshift-dns/node-resolver-thzkh"
Apr 16 13:59:39.225775 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.225735 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq9mk\" (UniqueName: \"kubernetes.io/projected/c6aa762b-ffdd-496f-8282-ff45ebe8c26c-kube-api-access-qq9mk\") pod \"ovnkube-node-kgf8n\" (UID: \"c6aa762b-ffdd-496f-8282-ff45ebe8c26c\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.226979 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.226950 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9pgl\" (UniqueName: \"kubernetes.io/projected/aef30458-23ff-40ab-ad5a-ae58af58ca82-kube-api-access-f9pgl\") pod \"network-metrics-daemon-cptr8\" (UID: \"aef30458-23ff-40ab-ad5a-ae58af58ca82\") " pod="openshift-multus/network-metrics-daemon-cptr8"
Apr 16 13:59:39.227072 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.226981 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfvqn\" (UniqueName: \"kubernetes.io/projected/acddcee2-ab55-4a6b-8b63-9793ffc842d3-kube-api-access-zfvqn\") pod \"multus-additional-cni-plugins-86xbr\" (UID: \"acddcee2-ab55-4a6b-8b63-9793ffc842d3\") " pod="openshift-multus/multus-additional-cni-plugins-86xbr"
Apr 16 13:59:39.229097 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.229057 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4xwr\" (UniqueName: \"kubernetes.io/projected/f34922d5-7862-4337-9408-0036909c6059-kube-api-access-h4xwr\") pod \"iptables-alerter-wvs7j\" (UID: \"f34922d5-7862-4337-9408-0036909c6059\") " pod="openshift-network-operator/iptables-alerter-wvs7j"
Apr 16 13:59:39.302693 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.302653 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-h7xrd"
Apr 16 13:59:39.324166 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.324135 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wk4ng"
Apr 16 13:59:39.333917 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.333886 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xwnfr"
Apr 16 13:59:39.339516 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.339487 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg"
Apr 16 13:59:39.348084 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.348059 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-69zrg"
Apr 16 13:59:39.355854 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.355834 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-thzkh"
Apr 16 13:59:39.357019 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.356998 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:39.364317 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.364292 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-86xbr"
Apr 16 13:59:39.374018 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.373996 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wvs7j"
Apr 16 13:59:39.380779 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.380758 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n"
Apr 16 13:59:39.713234 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.713144 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs\") pod \"network-metrics-daemon-cptr8\" (UID: \"aef30458-23ff-40ab-ad5a-ae58af58ca82\") " pod="openshift-multus/network-metrics-daemon-cptr8"
Apr 16 13:59:39.713414 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:39.713308 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:39.713414 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:39.713386 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs podName:aef30458-23ff-40ab-ad5a-ae58af58ca82 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:40.713363236 +0000 UTC m=+4.149528955 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs") pod "network-metrics-daemon-cptr8" (UID: "aef30458-23ff-40ab-ad5a-ae58af58ca82") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:39.814311 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:39.814251 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxfh5\" (UniqueName: \"kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5\") pod \"network-check-target-zg9zc\" (UID: \"a4cc786e-e069-4dfc-9be8-98f1a73b9bcb\") " pod="openshift-network-diagnostics/network-check-target-zg9zc"
Apr 16 13:59:39.814482 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:39.814439 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:39.814482 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:39.814460 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:39.814482 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:39.814477 2580 projected.go:194] Error preparing data for projected volume kube-api-access-pxfh5 for pod openshift-network-diagnostics/network-check-target-zg9zc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:39.814607 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:39.814541 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5 podName:a4cc786e-e069-4dfc-9be8-98f1a73b9bcb nodeName:}" failed. No retries permitted until 2026-04-16 13:59:40.814525776 +0000 UTC m=+4.250691482 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-pxfh5" (UniqueName: "kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5") pod "network-check-target-zg9zc" (UID: "a4cc786e-e069-4dfc-9be8-98f1a73b9bcb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:39.963410 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:39.963337 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fa66a35_a4c8_4e4b_a65c_58bfea71f741.slice/crio-ff481c231942501ebd37c7340fdd42a774dc21ea37771db8fb36876a3f7c8045 WatchSource:0}: Error finding container ff481c231942501ebd37c7340fdd42a774dc21ea37771db8fb36876a3f7c8045: Status 404 returned error can't find the container with id ff481c231942501ebd37c7340fdd42a774dc21ea37771db8fb36876a3f7c8045
Apr 16 13:59:40.014585 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:40.014546 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2dce2aa_0a8d_4795_a48e_c9cb5cf26cfd.slice/crio-260f2c5e650e67e7c250c2c4c817fa681ef4c989d4268582a7ac37a3fd41c4a1 WatchSource:0}: Error finding container 260f2c5e650e67e7c250c2c4c817fa681ef4c989d4268582a7ac37a3fd41c4a1: Status 404 returned error can't find the container with id 260f2c5e650e67e7c250c2c4c817fa681ef4c989d4268582a7ac37a3fd41c4a1
Apr 16 13:59:40.015646 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:40.015609 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2a769c2_0080_45a0_983a_5c1bcf200faf.slice/crio-615a217a99f820befd171ee3e1efdabbbd3e9371cdd2e4c53caac4e423fe3eff WatchSource:0}: Error finding container 615a217a99f820befd171ee3e1efdabbbd3e9371cdd2e4c53caac4e423fe3eff: Status 404 returned error can't find the container with id 615a217a99f820befd171ee3e1efdabbbd3e9371cdd2e4c53caac4e423fe3eff
Apr 16 13:59:40.018918 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:40.018875 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf34922d5_7862_4337_9408_0036909c6059.slice/crio-4393559d01889527062317b174eea1dd894d9fbbe8ddf8106b093616688a1799 WatchSource:0}: Error finding container 4393559d01889527062317b174eea1dd894d9fbbe8ddf8106b093616688a1799: Status 404 returned error can't find the container with id 4393559d01889527062317b174eea1dd894d9fbbe8ddf8106b093616688a1799
Apr 16 13:59:40.019482 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:40.019462 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacddcee2_ab55_4a6b_8b63_9793ffc842d3.slice/crio-c746170888bf9748e17d7ebc675db919b3149352dfd69248f3743307062756fd WatchSource:0}: Error finding container c746170888bf9748e17d7ebc675db919b3149352dfd69248f3743307062756fd: Status 404 returned error can't find the container with id c746170888bf9748e17d7ebc675db919b3149352dfd69248f3743307062756fd
Apr 16 13:59:40.041105 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:40.041074 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bc4aec8_4f86_44bd_9c9f_67ea41c4dcdb.slice/crio-219f6641f359fa16bc479c36359b9e25be4c98d08f5818ddbb7b76bda331dce3 WatchSource:0}: Error finding container 219f6641f359fa16bc479c36359b9e25be4c98d08f5818ddbb7b76bda331dce3: Status 404 returned error can't find the container with id 219f6641f359fa16bc479c36359b9e25be4c98d08f5818ddbb7b76bda331dce3
Apr 16 13:59:40.042593 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:40.042568 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ae873fc_9131_409f_a02c_21eb56f20fed.slice/crio-f364c3aaa69dd27f0f2b3b0b838d863cad9dd46decf0bc818fd1819223ff4a88 WatchSource:0}: Error finding container f364c3aaa69dd27f0f2b3b0b838d863cad9dd46decf0bc818fd1819223ff4a88: Status 404 returned error can't find the container with id f364c3aaa69dd27f0f2b3b0b838d863cad9dd46decf0bc818fd1819223ff4a88
Apr 16 13:59:40.043423 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:40.043320 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19aa0590_52b8_463f_a21b_db3a6833a0ca.slice/crio-1d021e0da0fe648eccde8c6ab9203613914d21be70c4e70915431991e9763c85 WatchSource:0}: Error finding container 1d021e0da0fe648eccde8c6ab9203613914d21be70c4e70915431991e9763c85: Status 404 returned error can't find the container with id 1d021e0da0fe648eccde8c6ab9203613914d21be70c4e70915431991e9763c85
Apr 16 13:59:40.043782 ip-10-0-129-3 kubenswrapper[2580]: W0416 13:59:40.043759 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6aa762b_ffdd_496f_8282_ff45ebe8c26c.slice/crio-2e6c8ec5d463d882670771eedb4668accf0193c8d71dd26a10cf42869c433c05 WatchSource:0}: Error finding container 2e6c8ec5d463d882670771eedb4668accf0193c8d71dd26a10cf42869c433c05: Status 404 returned error can't find the container with id 2e6c8ec5d463d882670771eedb4668accf0193c8d71dd26a10cf42869c433c05
Apr 16 13:59:40.053435 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:40.053404 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:38 +0000 UTC" deadline="2027-09-28 06:20:02.532185376 +0000 UTC"
Apr 16 13:59:40.053435 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:40.053435 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12712h20m22.478753535s"
Apr 16 13:59:40.156723 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:40.156685 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-3.ec2.internal" event={"ID":"0a36747d36072480352ac0833ae0f93c","Type":"ContainerStarted","Data":"a02c72f883d201bc6038e33a1f86333fad767b053ceffe1193db0c219eb65578"}
Apr 16 13:59:40.157608 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:40.157584 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xwnfr" event={"ID":"e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd","Type":"ContainerStarted","Data":"260f2c5e650e67e7c250c2c4c817fa681ef4c989d4268582a7ac37a3fd41c4a1"}
Apr 16 13:59:40.158490 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:40.158473 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h7xrd" event={"ID":"5fa66a35-a4c8-4e4b-a65c-58bfea71f741","Type":"ContainerStarted","Data":"ff481c231942501ebd37c7340fdd42a774dc21ea37771db8fb36876a3f7c8045"}
Apr 16 13:59:40.160198 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:40.160179 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" event={"ID":"c6aa762b-ffdd-496f-8282-ff45ebe8c26c","Type":"ContainerStarted","Data":"2e6c8ec5d463d882670771eedb4668accf0193c8d71dd26a10cf42869c433c05"}
Apr 16 13:59:40.160982 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:40.160962 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-69zrg" event={"ID":"4ae873fc-9131-409f-a02c-21eb56f20fed","Type":"ContainerStarted","Data":"f364c3aaa69dd27f0f2b3b0b838d863cad9dd46decf0bc818fd1819223ff4a88"}
Apr 16 13:59:40.161783 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:40.161757 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" event={"ID":"19aa0590-52b8-463f-a21b-db3a6833a0ca","Type":"ContainerStarted","Data":"1d021e0da0fe648eccde8c6ab9203613914d21be70c4e70915431991e9763c85"}
Apr 16 13:59:40.162708 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:40.162687 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" event={"ID":"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb","Type":"ContainerStarted","Data":"219f6641f359fa16bc479c36359b9e25be4c98d08f5818ddbb7b76bda331dce3"}
Apr 16 13:59:40.163603 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:40.163584 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-86xbr" event={"ID":"acddcee2-ab55-4a6b-8b63-9793ffc842d3","Type":"ContainerStarted","Data":"c746170888bf9748e17d7ebc675db919b3149352dfd69248f3743307062756fd"}
Apr 16 13:59:40.164435 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:40.164416 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wvs7j" event={"ID":"f34922d5-7862-4337-9408-0036909c6059","Type":"ContainerStarted","Data":"4393559d01889527062317b174eea1dd894d9fbbe8ddf8106b093616688a1799"}
Apr 16 13:59:40.165360 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:40.165340 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-thzkh" event={"ID":"f2a769c2-0080-45a0-983a-5c1bcf200faf","Type":"ContainerStarted","Data":"615a217a99f820befd171ee3e1efdabbbd3e9371cdd2e4c53caac4e423fe3eff"}
Apr 16 13:59:40.174488 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:40.174447 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-3.ec2.internal" podStartSLOduration=2.17443475 podStartE2EDuration="2.17443475s" podCreationTimestamp="2026-04-16 13:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:40.173975224 +0000 UTC m=+3.610140952" watchObservedRunningTime="2026-04-16 13:59:40.17443475 +0000 UTC m=+3.610600478"
Apr 16 13:59:40.721433 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:40.721384 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs\") pod \"network-metrics-daemon-cptr8\" (UID: \"aef30458-23ff-40ab-ad5a-ae58af58ca82\") " pod="openshift-multus/network-metrics-daemon-cptr8"
Apr 16 13:59:40.721607 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:40.721563 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:40.721674 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:40.721626 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs podName:aef30458-23ff-40ab-ad5a-ae58af58ca82 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:42.721608282 +0000 UTC m=+6.157774002 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs") pod "network-metrics-daemon-cptr8" (UID: "aef30458-23ff-40ab-ad5a-ae58af58ca82") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:40.822470 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:40.822379 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxfh5\" (UniqueName: \"kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5\") pod \"network-check-target-zg9zc\" (UID: \"a4cc786e-e069-4dfc-9be8-98f1a73b9bcb\") " pod="openshift-network-diagnostics/network-check-target-zg9zc"
Apr 16 13:59:40.822649 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:40.822632 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:40.822729 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:40.822656 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:40.822729 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:40.822669 2580 projected.go:194] Error preparing data for projected volume kube-api-access-pxfh5 for pod openshift-network-diagnostics/network-check-target-zg9zc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:40.822834 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:40.822730 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5 podName:a4cc786e-e069-4dfc-9be8-98f1a73b9bcb nodeName:}" failed. No retries permitted until 2026-04-16 13:59:42.822710584 +0000 UTC m=+6.258876304 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-pxfh5" (UniqueName: "kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5") pod "network-check-target-zg9zc" (UID: "a4cc786e-e069-4dfc-9be8-98f1a73b9bcb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:41.150368 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:41.150339 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cptr8"
Apr 16 13:59:41.150815 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:41.150440 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cptr8" podUID="aef30458-23ff-40ab-ad5a-ae58af58ca82"
Apr 16 13:59:41.150815 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:41.150728 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zg9zc"
Apr 16 13:59:41.150815 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:41.150783 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zg9zc" podUID="a4cc786e-e069-4dfc-9be8-98f1a73b9bcb"
Apr 16 13:59:41.177201 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:41.177169 2580 generic.go:358] "Generic (PLEG): container finished" podID="43533fa9c9e65a453c1bf1eee9f41f6d" containerID="c2085fcf19ad511f4c8d44f784e592564d3e5bc1859c19cb9bf940f3635086aa" exitCode=0
Apr 16 13:59:41.178150 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:41.178123 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal" event={"ID":"43533fa9c9e65a453c1bf1eee9f41f6d","Type":"ContainerDied","Data":"c2085fcf19ad511f4c8d44f784e592564d3e5bc1859c19cb9bf940f3635086aa"}
Apr 16 13:59:42.202239 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:42.202207 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal" event={"ID":"43533fa9c9e65a453c1bf1eee9f41f6d","Type":"ContainerStarted","Data":"e396f1cdfb4cce436ed9973dcfd7dbf59b46214c05fee08559412fa65fdf2c30"}
Apr 16 13:59:42.736720 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:42.736675 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs\") pod \"network-metrics-daemon-cptr8\" (UID: \"aef30458-23ff-40ab-ad5a-ae58af58ca82\") " pod="openshift-multus/network-metrics-daemon-cptr8"
Apr 16 13:59:42.736893 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:42.736843 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:42.737008 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:42.736927 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs podName:aef30458-23ff-40ab-ad5a-ae58af58ca82 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:46.736907276 +0000 UTC m=+10.173072996 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs") pod "network-metrics-daemon-cptr8" (UID: "aef30458-23ff-40ab-ad5a-ae58af58ca82") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:42.837853 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:42.837785 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxfh5\" (UniqueName: \"kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5\") pod \"network-check-target-zg9zc\" (UID: \"a4cc786e-e069-4dfc-9be8-98f1a73b9bcb\") " pod="openshift-network-diagnostics/network-check-target-zg9zc"
Apr 16 13:59:42.838027 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:42.837986 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:42.838027 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:42.838006 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:42.838027 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:42.838028 2580 projected.go:194] Error preparing data for projected volume kube-api-access-pxfh5 for pod openshift-network-diagnostics/network-check-target-zg9zc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:42.838181 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:42.838085 2580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5 podName:a4cc786e-e069-4dfc-9be8-98f1a73b9bcb nodeName:}" failed. No retries permitted until 2026-04-16 13:59:46.838066573 +0000 UTC m=+10.274232293 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-pxfh5" (UniqueName: "kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5") pod "network-check-target-zg9zc" (UID: "a4cc786e-e069-4dfc-9be8-98f1a73b9bcb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:43.150235 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:43.150200 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 13:59:43.150439 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:43.150362 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cptr8" podUID="aef30458-23ff-40ab-ad5a-ae58af58ca82" Apr 16 13:59:43.150752 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:43.150732 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 13:59:43.150848 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:43.150827 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zg9zc" podUID="a4cc786e-e069-4dfc-9be8-98f1a73b9bcb" Apr 16 13:59:45.148731 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:45.148306 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 13:59:45.148731 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:45.148462 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cptr8" podUID="aef30458-23ff-40ab-ad5a-ae58af58ca82" Apr 16 13:59:45.148731 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:45.148488 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 13:59:45.148731 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:45.148558 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zg9zc" podUID="a4cc786e-e069-4dfc-9be8-98f1a73b9bcb" Apr 16 13:59:46.778262 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:46.778210 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs\") pod \"network-metrics-daemon-cptr8\" (UID: \"aef30458-23ff-40ab-ad5a-ae58af58ca82\") " pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 13:59:46.778706 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:46.778421 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:46.778706 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:46.778499 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs podName:aef30458-23ff-40ab-ad5a-ae58af58ca82 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:54.778478004 +0000 UTC m=+18.214643728 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs") pod "network-metrics-daemon-cptr8" (UID: "aef30458-23ff-40ab-ad5a-ae58af58ca82") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:46.879294 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:46.879236 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxfh5\" (UniqueName: \"kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5\") pod \"network-check-target-zg9zc\" (UID: \"a4cc786e-e069-4dfc-9be8-98f1a73b9bcb\") " pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 13:59:46.879495 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:46.879426 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:46.879495 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:46.879444 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:46.879495 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:46.879456 2580 projected.go:194] Error preparing data for projected volume kube-api-access-pxfh5 for pod openshift-network-diagnostics/network-check-target-zg9zc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:46.879608 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:46.879519 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5 podName:a4cc786e-e069-4dfc-9be8-98f1a73b9bcb nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:54.879498019 +0000 UTC m=+18.315663736 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-pxfh5" (UniqueName: "kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5") pod "network-check-target-zg9zc" (UID: "a4cc786e-e069-4dfc-9be8-98f1a73b9bcb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:47.149152 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:47.149123 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 13:59:47.149384 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:47.149233 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cptr8" podUID="aef30458-23ff-40ab-ad5a-ae58af58ca82" Apr 16 13:59:47.149700 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:47.149675 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 13:59:47.149803 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:47.149774 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zg9zc" podUID="a4cc786e-e069-4dfc-9be8-98f1a73b9bcb" Apr 16 13:59:49.148665 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:49.148393 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 13:59:49.148665 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:49.148542 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cptr8" podUID="aef30458-23ff-40ab-ad5a-ae58af58ca82" Apr 16 13:59:49.149166 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:49.149017 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 13:59:49.149166 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:49.149113 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zg9zc" podUID="a4cc786e-e069-4dfc-9be8-98f1a73b9bcb" Apr 16 13:59:51.148332 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:51.148263 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 13:59:51.148768 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:51.148435 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cptr8" podUID="aef30458-23ff-40ab-ad5a-ae58af58ca82" Apr 16 13:59:51.148840 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:51.148776 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 13:59:51.148890 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:51.148845 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zg9zc" podUID="a4cc786e-e069-4dfc-9be8-98f1a73b9bcb" Apr 16 13:59:53.148179 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:53.148146 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 13:59:53.148677 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:53.148152 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 13:59:53.148677 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:53.148302 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cptr8" podUID="aef30458-23ff-40ab-ad5a-ae58af58ca82" Apr 16 13:59:53.148677 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:53.148391 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zg9zc" podUID="a4cc786e-e069-4dfc-9be8-98f1a73b9bcb" Apr 16 13:59:54.832309 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:54.832250 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs\") pod \"network-metrics-daemon-cptr8\" (UID: \"aef30458-23ff-40ab-ad5a-ae58af58ca82\") " pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 13:59:54.832788 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:54.832448 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:54.832788 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:54.832504 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs podName:aef30458-23ff-40ab-ad5a-ae58af58ca82 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:00:10.832490182 +0000 UTC m=+34.268655887 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs") pod "network-metrics-daemon-cptr8" (UID: "aef30458-23ff-40ab-ad5a-ae58af58ca82") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:54.933512 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:54.933468 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxfh5\" (UniqueName: \"kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5\") pod \"network-check-target-zg9zc\" (UID: \"a4cc786e-e069-4dfc-9be8-98f1a73b9bcb\") " pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 13:59:54.933699 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:54.933613 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:54.933699 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:54.933633 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:54.933699 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:54.933642 2580 projected.go:194] Error preparing data for projected volume kube-api-access-pxfh5 for pod openshift-network-diagnostics/network-check-target-zg9zc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:54.933699 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:54.933699 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5 
podName:a4cc786e-e069-4dfc-9be8-98f1a73b9bcb nodeName:}" failed. No retries permitted until 2026-04-16 14:00:10.933685204 +0000 UTC m=+34.369850910 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-pxfh5" (UniqueName: "kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5") pod "network-check-target-zg9zc" (UID: "a4cc786e-e069-4dfc-9be8-98f1a73b9bcb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:55.147947 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:55.147907 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 13:59:55.148236 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:55.148066 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cptr8" podUID="aef30458-23ff-40ab-ad5a-ae58af58ca82" Apr 16 13:59:55.148542 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:55.148518 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 13:59:55.148633 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:55.148611 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zg9zc" podUID="a4cc786e-e069-4dfc-9be8-98f1a73b9bcb" Apr 16 13:59:57.148992 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:57.148954 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 13:59:57.149458 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:57.149075 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cptr8" podUID="aef30458-23ff-40ab-ad5a-ae58af58ca82" Apr 16 13:59:57.149458 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:57.149125 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 13:59:57.149458 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:57.149206 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zg9zc" podUID="a4cc786e-e069-4dfc-9be8-98f1a73b9bcb" Apr 16 13:59:58.232437 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.231923 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xwnfr" event={"ID":"e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd","Type":"ContainerStarted","Data":"5e896392e753e4a8b7dd6d8fa4d32ec78df1047f9ea981c980ed24305f29363e"} Apr 16 13:59:58.233458 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.233431 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h7xrd" event={"ID":"5fa66a35-a4c8-4e4b-a65c-58bfea71f741","Type":"ContainerStarted","Data":"10091a40cfe6082887f34c2c57da6d9e27f0fa4a6e30b0947f1623f1804b63c9"} Apr 16 13:59:58.236386 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.236360 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" event={"ID":"c6aa762b-ffdd-496f-8282-ff45ebe8c26c","Type":"ContainerStarted","Data":"e40ed5230fb505dc66b131b44a3e62e62c96fe21194ad8220eef6b09201c4163"} Apr 16 13:59:58.236501 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.236390 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" event={"ID":"c6aa762b-ffdd-496f-8282-ff45ebe8c26c","Type":"ContainerStarted","Data":"96c5ee088319576a430912d0c666ef16d67bb17b23bb487bcecd53dbd6598253"} Apr 16 13:59:58.236501 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.236406 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" event={"ID":"c6aa762b-ffdd-496f-8282-ff45ebe8c26c","Type":"ContainerStarted","Data":"9d02b0933fec3f9fd250160ea8e3d0abdb8a438cb9767d0f069e7180596dd9bc"} Apr 16 13:59:58.236501 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.236419 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" 
event={"ID":"c6aa762b-ffdd-496f-8282-ff45ebe8c26c","Type":"ContainerStarted","Data":"97dea2f6a6ba8bcdfa0aa90e7bd4a85bcc5d9db33024c711ac8cec3aafabf7e0"} Apr 16 13:59:58.236501 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.236432 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" event={"ID":"c6aa762b-ffdd-496f-8282-ff45ebe8c26c","Type":"ContainerStarted","Data":"e64d7311923d52a118a25339e2a4f334b8f406492c59cd5ab6b005816f8609f1"} Apr 16 13:59:58.236501 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.236443 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" event={"ID":"c6aa762b-ffdd-496f-8282-ff45ebe8c26c","Type":"ContainerStarted","Data":"7e25df24a19564c1c324171e7336fb6c539354dd57f9533be41989fae06b7eb2"} Apr 16 13:59:58.237766 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.237740 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-69zrg" event={"ID":"4ae873fc-9131-409f-a02c-21eb56f20fed","Type":"ContainerStarted","Data":"cef4cb7c6b0dcf788aa912fd65b24cb791da601f4fdd1be54b1cd5320c11985b"} Apr 16 13:59:58.239126 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.239099 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" event={"ID":"19aa0590-52b8-463f-a21b-db3a6833a0ca","Type":"ContainerStarted","Data":"fd78c34646f0bea3f31ba18f2d4122da0bf0212feb324a3dd98a227e2cf84867"} Apr 16 13:59:58.240381 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.240353 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" event={"ID":"5bc4aec8-4f86-44bd-9c9f-67ea41c4dcdb","Type":"ContainerStarted","Data":"e536f18a90c87f5e679b8bc539740bea994e6af1d57ede7658367104e4037682"} Apr 16 13:59:58.241707 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.241687 2580 generic.go:358] "Generic (PLEG): container finished" 
podID="acddcee2-ab55-4a6b-8b63-9793ffc842d3" containerID="a082ffec1ecdf01391c4051a8800892065d4b46631ef5a9000f0f81c65646ada" exitCode=0 Apr 16 13:59:58.241821 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.241747 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-86xbr" event={"ID":"acddcee2-ab55-4a6b-8b63-9793ffc842d3","Type":"ContainerDied","Data":"a082ffec1ecdf01391c4051a8800892065d4b46631ef5a9000f0f81c65646ada"} Apr 16 13:59:58.242968 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.242943 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-thzkh" event={"ID":"f2a769c2-0080-45a0-983a-5c1bcf200faf","Type":"ContainerStarted","Data":"bfb7a1e90d8147d0e68bf35a445bfb00fcae4dfd4d5d00328795e7bd27431a33"} Apr 16 13:59:58.247597 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.247088 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xwnfr" podStartSLOduration=3.991034958 podStartE2EDuration="21.247074032s" podCreationTimestamp="2026-04-16 13:59:37 +0000 UTC" firstStartedPulling="2026-04-16 13:59:40.017372758 +0000 UTC m=+3.453538468" lastFinishedPulling="2026-04-16 13:59:57.273411828 +0000 UTC m=+20.709577542" observedRunningTime="2026-04-16 13:59:58.246643581 +0000 UTC m=+21.682809310" watchObservedRunningTime="2026-04-16 13:59:58.247074032 +0000 UTC m=+21.683239760" Apr 16 13:59:58.247597 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.247179 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-3.ec2.internal" podStartSLOduration=20.247172565 podStartE2EDuration="20.247172565s" podCreationTimestamp="2026-04-16 13:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:42.222949941 +0000 UTC m=+5.659115672" 
watchObservedRunningTime="2026-04-16 13:59:58.247172565 +0000 UTC m=+21.683338295" Apr 16 13:59:58.280193 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.280145 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-69zrg" podStartSLOduration=4.054695792 podStartE2EDuration="21.280132479s" podCreationTimestamp="2026-04-16 13:59:37 +0000 UTC" firstStartedPulling="2026-04-16 13:59:40.046972828 +0000 UTC m=+3.483138547" lastFinishedPulling="2026-04-16 13:59:57.272409514 +0000 UTC m=+20.708575234" observedRunningTime="2026-04-16 13:59:58.279814994 +0000 UTC m=+21.715980722" watchObservedRunningTime="2026-04-16 13:59:58.280132479 +0000 UTC m=+21.716298206" Apr 16 13:59:58.280437 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.280411 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-h7xrd" podStartSLOduration=3.931754737 podStartE2EDuration="21.280405981s" podCreationTimestamp="2026-04-16 13:59:37 +0000 UTC" firstStartedPulling="2026-04-16 13:59:39.965435656 +0000 UTC m=+3.401601369" lastFinishedPulling="2026-04-16 13:59:57.314086907 +0000 UTC m=+20.750252613" observedRunningTime="2026-04-16 13:59:58.264304538 +0000 UTC m=+21.700470269" watchObservedRunningTime="2026-04-16 13:59:58.280405981 +0000 UTC m=+21.716571709" Apr 16 13:59:58.295678 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.295629 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-thzkh" podStartSLOduration=4.040578115 podStartE2EDuration="21.295615158s" podCreationTimestamp="2026-04-16 13:59:37 +0000 UTC" firstStartedPulling="2026-04-16 13:59:40.017405676 +0000 UTC m=+3.453571384" lastFinishedPulling="2026-04-16 13:59:57.272442717 +0000 UTC m=+20.708608427" observedRunningTime="2026-04-16 13:59:58.295469536 +0000 UTC m=+21.731635265" watchObservedRunningTime="2026-04-16 13:59:58.295615158 +0000 UTC m=+21.731780887" Apr 16 13:59:58.343096 
ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.343046 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-wk4ng" podStartSLOduration=4.11057811 podStartE2EDuration="21.343030634s" podCreationTimestamp="2026-04-16 13:59:37 +0000 UTC" firstStartedPulling="2026-04-16 13:59:40.042741516 +0000 UTC m=+3.478907229" lastFinishedPulling="2026-04-16 13:59:57.275194035 +0000 UTC m=+20.711359753" observedRunningTime="2026-04-16 13:59:58.342972969 +0000 UTC m=+21.779138698" watchObservedRunningTime="2026-04-16 13:59:58.343030634 +0000 UTC m=+21.779196361" Apr 16 13:59:58.714434 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:58.714412 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 13:59:59.055754 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:59.055661 2580 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T13:59:58.714429311Z","UUID":"f2cab93f-5ed3-498b-908d-cbe86ba6fac6","Handler":null,"Name":"","Endpoint":""} Apr 16 13:59:59.057230 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:59.057212 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 13:59:59.057230 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:59.057236 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 13:59:59.148131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:59.148090 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 13:59:59.148131 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:59.148116 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 13:59:59.148381 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:59.148213 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cptr8" podUID="aef30458-23ff-40ab-ad5a-ae58af58ca82" Apr 16 13:59:59.148381 ip-10-0-129-3 kubenswrapper[2580]: E0416 13:59:59.148344 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zg9zc" podUID="a4cc786e-e069-4dfc-9be8-98f1a73b9bcb" Apr 16 13:59:59.246058 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:59.246018 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" event={"ID":"19aa0590-52b8-463f-a21b-db3a6833a0ca","Type":"ContainerStarted","Data":"65dd913bb962ca01bc5d82e116c20c6bdc1f9c5e16b34468fdccadb6100d1450"} Apr 16 13:59:59.247606 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:59.247384 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wvs7j" event={"ID":"f34922d5-7862-4337-9408-0036909c6059","Type":"ContainerStarted","Data":"ecfe82a06371a90ac09aea12419f6b959c254232d8bc0c61085548cf68a0844b"} Apr 16 13:59:59.262010 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:59.261956 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-wvs7j" podStartSLOduration=5.029330245 podStartE2EDuration="22.261941224s" podCreationTimestamp="2026-04-16 13:59:37 +0000 UTC" firstStartedPulling="2026-04-16 13:59:40.039829079 +0000 UTC m=+3.475994785" lastFinishedPulling="2026-04-16 13:59:57.272440047 +0000 UTC m=+20.708605764" observedRunningTime="2026-04-16 13:59:59.261480014 +0000 UTC m=+22.697645742" watchObservedRunningTime="2026-04-16 13:59:59.261941224 +0000 UTC m=+22.698106951" Apr 16 13:59:59.632230 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:59.632191 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-69zrg" Apr 16 13:59:59.632801 ip-10-0-129-3 kubenswrapper[2580]: I0416 13:59:59.632782 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-69zrg" Apr 16 14:00:00.253419 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:00.253324 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" event={"ID":"c6aa762b-ffdd-496f-8282-ff45ebe8c26c","Type":"ContainerStarted","Data":"06b9f123073874b6438efd32e3fb87216885d1490e8385f86efbea4fe041e549"} Apr 16 14:00:00.256075 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:00.256044 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" event={"ID":"19aa0590-52b8-463f-a21b-db3a6833a0ca","Type":"ContainerStarted","Data":"eb5e3a6ff6c8f848fa269616baa2aa8a352ec1f509123c80725015ee81c4747d"} Apr 16 14:00:00.256441 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:00.256414 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-69zrg" Apr 16 14:00:00.256861 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:00.256842 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-69zrg" Apr 16 14:00:00.277057 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:00.276989 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xjqgg" podStartSLOduration=3.871669972 podStartE2EDuration="23.276961595s" podCreationTimestamp="2026-04-16 13:59:37 +0000 UTC" firstStartedPulling="2026-04-16 13:59:40.046909821 +0000 UTC m=+3.483075543" lastFinishedPulling="2026-04-16 13:59:59.452201457 +0000 UTC m=+22.888367166" observedRunningTime="2026-04-16 14:00:00.275664272 +0000 UTC m=+23.711830001" watchObservedRunningTime="2026-04-16 14:00:00.276961595 +0000 UTC m=+23.713127323" Apr 16 14:00:01.148347 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:01.148316 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 14:00:01.148509 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:01.148330 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 14:00:01.148509 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:01.148459 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cptr8" podUID="aef30458-23ff-40ab-ad5a-ae58af58ca82" Apr 16 14:00:01.148629 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:01.148533 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zg9zc" podUID="a4cc786e-e069-4dfc-9be8-98f1a73b9bcb" Apr 16 14:00:03.148428 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:03.148191 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 14:00:03.149008 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:03.148191 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 14:00:03.149008 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:03.148524 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cptr8" podUID="aef30458-23ff-40ab-ad5a-ae58af58ca82" Apr 16 14:00:03.149008 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:03.148617 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zg9zc" podUID="a4cc786e-e069-4dfc-9be8-98f1a73b9bcb" Apr 16 14:00:04.267103 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:04.266811 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" event={"ID":"c6aa762b-ffdd-496f-8282-ff45ebe8c26c","Type":"ContainerStarted","Data":"fb4601dd183853ffd6bd6e644fc1dcd8cc5f7adb01d04645438ccd1faadbbe8f"} Apr 16 14:00:04.267103 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:04.267048 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 14:00:04.268651 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:04.268626 2580 generic.go:358] "Generic (PLEG): container finished" podID="acddcee2-ab55-4a6b-8b63-9793ffc842d3" containerID="2b7523fa5ae27d7ee3130b66063e2023ce6d16ade5dced601568fad24662f3b6" exitCode=0 Apr 16 14:00:04.268765 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:04.268668 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-86xbr" event={"ID":"acddcee2-ab55-4a6b-8b63-9793ffc842d3","Type":"ContainerDied","Data":"2b7523fa5ae27d7ee3130b66063e2023ce6d16ade5dced601568fad24662f3b6"} Apr 16 14:00:04.282896 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:04.282871 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 14:00:04.297870 ip-10-0-129-3 
kubenswrapper[2580]: I0416 14:00:04.297806 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" podStartSLOduration=9.868575364 podStartE2EDuration="27.297780593s" podCreationTimestamp="2026-04-16 13:59:37 +0000 UTC" firstStartedPulling="2026-04-16 13:59:40.046855124 +0000 UTC m=+3.483020833" lastFinishedPulling="2026-04-16 13:59:57.476060337 +0000 UTC m=+20.912226062" observedRunningTime="2026-04-16 14:00:04.29425796 +0000 UTC m=+27.730423678" watchObservedRunningTime="2026-04-16 14:00:04.297780593 +0000 UTC m=+27.733946337" Apr 16 14:00:05.149138 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:05.149105 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 14:00:05.149348 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:05.149247 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zg9zc" podUID="a4cc786e-e069-4dfc-9be8-98f1a73b9bcb" Apr 16 14:00:05.149658 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:05.149632 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 14:00:05.149767 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:05.149746 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cptr8" podUID="aef30458-23ff-40ab-ad5a-ae58af58ca82" Apr 16 14:00:05.271252 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:05.271214 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 14:00:05.271702 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:05.271295 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 14:00:05.285912 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:05.285888 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 14:00:05.427648 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:05.427572 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zg9zc"] Apr 16 14:00:05.427858 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:05.427684 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 14:00:05.427858 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:05.427794 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zg9zc" podUID="a4cc786e-e069-4dfc-9be8-98f1a73b9bcb" Apr 16 14:00:05.430375 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:05.430292 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cptr8"] Apr 16 14:00:05.430492 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:05.430386 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 14:00:05.430554 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:05.430498 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cptr8" podUID="aef30458-23ff-40ab-ad5a-ae58af58ca82" Apr 16 14:00:06.274513 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:06.274481 2580 generic.go:358] "Generic (PLEG): container finished" podID="acddcee2-ab55-4a6b-8b63-9793ffc842d3" containerID="8abc0977a05f96e20e65d6aad64085260cdbd05235bbacddd369bd1a2f7eec9d" exitCode=0 Apr 16 14:00:06.274919 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:06.274563 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-86xbr" event={"ID":"acddcee2-ab55-4a6b-8b63-9793ffc842d3","Type":"ContainerDied","Data":"8abc0977a05f96e20e65d6aad64085260cdbd05235bbacddd369bd1a2f7eec9d"} Apr 16 14:00:07.149089 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:07.149051 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 14:00:07.149243 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:07.149147 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cptr8" podUID="aef30458-23ff-40ab-ad5a-ae58af58ca82" Apr 16 14:00:07.149243 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:07.149232 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 14:00:07.149377 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:07.149356 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zg9zc" podUID="a4cc786e-e069-4dfc-9be8-98f1a73b9bcb" Apr 16 14:00:09.149681 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:09.148715 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 14:00:09.149681 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:09.149092 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cptr8" podUID="aef30458-23ff-40ab-ad5a-ae58af58ca82" Apr 16 14:00:09.149681 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:09.149549 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 14:00:09.149681 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:09.149642 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zg9zc" podUID="a4cc786e-e069-4dfc-9be8-98f1a73b9bcb" Apr 16 14:00:09.283704 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:09.283670 2580 generic.go:358] "Generic (PLEG): container finished" podID="acddcee2-ab55-4a6b-8b63-9793ffc842d3" containerID="56e67d065aef5cf878aec01acb8dd0801382b2e1aa43871d6481b0bdaafd43b5" exitCode=0 Apr 16 14:00:09.283868 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:09.283728 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-86xbr" event={"ID":"acddcee2-ab55-4a6b-8b63-9793ffc842d3","Type":"ContainerDied","Data":"56e67d065aef5cf878aec01acb8dd0801382b2e1aa43871d6481b0bdaafd43b5"} Apr 16 14:00:10.411603 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.411575 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-3.ec2.internal" event="NodeReady" Apr 16 14:00:10.412154 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.411710 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 14:00:10.461901 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.461855 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-s7ltv"] Apr 16 14:00:10.490443 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.490410 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bnz8h"] Apr 16 14:00:10.490796 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.490596 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-s7ltv" Apr 16 14:00:10.493597 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.493428 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-r5ngv\"" Apr 16 14:00:10.493709 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.493698 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 14:00:10.493769 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.493751 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 14:00:10.511226 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.511188 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-s7ltv"] Apr 16 14:00:10.511226 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.511215 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bnz8h"] Apr 16 14:00:10.511442 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.511334 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bnz8h" Apr 16 14:00:10.514290 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.514233 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 14:00:10.514413 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.514240 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-r2bfh\"" Apr 16 14:00:10.514413 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.514313 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 14:00:10.514413 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.514242 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 14:00:10.645447 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.645409 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert\") pod \"ingress-canary-bnz8h\" (UID: \"4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8\") " pod="openshift-ingress-canary/ingress-canary-bnz8h" Apr 16 14:00:10.645447 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.645445 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22mcq\" (UniqueName: \"kubernetes.io/projected/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-kube-api-access-22mcq\") pod \"ingress-canary-bnz8h\" (UID: \"4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8\") " pod="openshift-ingress-canary/ingress-canary-bnz8h" Apr 16 14:00:10.645670 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.645470 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/6da34735-0aa6-4efc-88b2-81738c442f3f-config-volume\") pod \"dns-default-s7ltv\" (UID: \"6da34735-0aa6-4efc-88b2-81738c442f3f\") " pod="openshift-dns/dns-default-s7ltv" Apr 16 14:00:10.645670 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.645532 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6da34735-0aa6-4efc-88b2-81738c442f3f-tmp-dir\") pod \"dns-default-s7ltv\" (UID: \"6da34735-0aa6-4efc-88b2-81738c442f3f\") " pod="openshift-dns/dns-default-s7ltv" Apr 16 14:00:10.645670 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.645563 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls\") pod \"dns-default-s7ltv\" (UID: \"6da34735-0aa6-4efc-88b2-81738c442f3f\") " pod="openshift-dns/dns-default-s7ltv" Apr 16 14:00:10.645670 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.645586 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmrvj\" (UniqueName: \"kubernetes.io/projected/6da34735-0aa6-4efc-88b2-81738c442f3f-kube-api-access-hmrvj\") pod \"dns-default-s7ltv\" (UID: \"6da34735-0aa6-4efc-88b2-81738c442f3f\") " pod="openshift-dns/dns-default-s7ltv" Apr 16 14:00:10.746793 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.746707 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert\") pod \"ingress-canary-bnz8h\" (UID: \"4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8\") " pod="openshift-ingress-canary/ingress-canary-bnz8h" Apr 16 14:00:10.746793 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.746746 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22mcq\" (UniqueName: 
\"kubernetes.io/projected/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-kube-api-access-22mcq\") pod \"ingress-canary-bnz8h\" (UID: \"4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8\") " pod="openshift-ingress-canary/ingress-canary-bnz8h" Apr 16 14:00:10.746793 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.746771 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6da34735-0aa6-4efc-88b2-81738c442f3f-config-volume\") pod \"dns-default-s7ltv\" (UID: \"6da34735-0aa6-4efc-88b2-81738c442f3f\") " pod="openshift-dns/dns-default-s7ltv" Apr 16 14:00:10.747067 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.746797 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6da34735-0aa6-4efc-88b2-81738c442f3f-tmp-dir\") pod \"dns-default-s7ltv\" (UID: \"6da34735-0aa6-4efc-88b2-81738c442f3f\") " pod="openshift-dns/dns-default-s7ltv" Apr 16 14:00:10.747067 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:10.746869 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:10.747067 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:10.746940 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert podName:4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:11.246917882 +0000 UTC m=+34.683083588 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert") pod "ingress-canary-bnz8h" (UID: "4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8") : secret "canary-serving-cert" not found Apr 16 14:00:10.747067 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.747052 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls\") pod \"dns-default-s7ltv\" (UID: \"6da34735-0aa6-4efc-88b2-81738c442f3f\") " pod="openshift-dns/dns-default-s7ltv" Apr 16 14:00:10.747289 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.747083 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmrvj\" (UniqueName: \"kubernetes.io/projected/6da34735-0aa6-4efc-88b2-81738c442f3f-kube-api-access-hmrvj\") pod \"dns-default-s7ltv\" (UID: \"6da34735-0aa6-4efc-88b2-81738c442f3f\") " pod="openshift-dns/dns-default-s7ltv" Apr 16 14:00:10.747289 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.747161 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6da34735-0aa6-4efc-88b2-81738c442f3f-tmp-dir\") pod \"dns-default-s7ltv\" (UID: \"6da34735-0aa6-4efc-88b2-81738c442f3f\") " pod="openshift-dns/dns-default-s7ltv" Apr 16 14:00:10.747289 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:10.747168 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:10.747289 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:10.747241 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls podName:6da34735-0aa6-4efc-88b2-81738c442f3f nodeName:}" failed. No retries permitted until 2026-04-16 14:00:11.2472226 +0000 UTC m=+34.683388307 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls") pod "dns-default-s7ltv" (UID: "6da34735-0aa6-4efc-88b2-81738c442f3f") : secret "dns-default-metrics-tls" not found Apr 16 14:00:10.747454 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.747434 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6da34735-0aa6-4efc-88b2-81738c442f3f-config-volume\") pod \"dns-default-s7ltv\" (UID: \"6da34735-0aa6-4efc-88b2-81738c442f3f\") " pod="openshift-dns/dns-default-s7ltv" Apr 16 14:00:10.759805 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.759773 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmrvj\" (UniqueName: \"kubernetes.io/projected/6da34735-0aa6-4efc-88b2-81738c442f3f-kube-api-access-hmrvj\") pod \"dns-default-s7ltv\" (UID: \"6da34735-0aa6-4efc-88b2-81738c442f3f\") " pod="openshift-dns/dns-default-s7ltv" Apr 16 14:00:10.759971 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.759937 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22mcq\" (UniqueName: \"kubernetes.io/projected/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-kube-api-access-22mcq\") pod \"ingress-canary-bnz8h\" (UID: \"4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8\") " pod="openshift-ingress-canary/ingress-canary-bnz8h" Apr 16 14:00:10.848142 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.848102 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs\") pod \"network-metrics-daemon-cptr8\" (UID: \"aef30458-23ff-40ab-ad5a-ae58af58ca82\") " pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 14:00:10.848371 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:10.848288 2580 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:00:10.848371 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:10.848357 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs podName:aef30458-23ff-40ab-ad5a-ae58af58ca82 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:42.84834224 +0000 UTC m=+66.284507950 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs") pod "network-metrics-daemon-cptr8" (UID: "aef30458-23ff-40ab-ad5a-ae58af58ca82") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:00:10.949446 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:10.949404 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxfh5\" (UniqueName: \"kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5\") pod \"network-check-target-zg9zc\" (UID: \"a4cc786e-e069-4dfc-9be8-98f1a73b9bcb\") " pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 14:00:10.949613 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:10.949549 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:00:10.949613 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:10.949574 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:00:10.949613 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:10.949587 2580 projected.go:194] Error preparing data for projected volume kube-api-access-pxfh5 for pod openshift-network-diagnostics/network-check-target-zg9zc: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:00:10.949730 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:10.949647 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5 podName:a4cc786e-e069-4dfc-9be8-98f1a73b9bcb nodeName:}" failed. No retries permitted until 2026-04-16 14:00:42.949629307 +0000 UTC m=+66.385795029 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-pxfh5" (UniqueName: "kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5") pod "network-check-target-zg9zc" (UID: "a4cc786e-e069-4dfc-9be8-98f1a73b9bcb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:00:11.147994 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:11.147963 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 14:00:11.148188 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:11.147957 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 14:00:11.153786 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:11.153750 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:00:11.154101 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:11.154074 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:00:11.155077 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:11.155054 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-m8ds6\"" Apr 16 14:00:11.155197 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:11.155095 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2qtwk\"" Apr 16 14:00:11.155197 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:11.155116 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:00:11.251895 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:11.251848 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert\") pod \"ingress-canary-bnz8h\" (UID: \"4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8\") " pod="openshift-ingress-canary/ingress-canary-bnz8h" Apr 16 14:00:11.252067 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:11.251900 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls\") pod \"dns-default-s7ltv\" (UID: \"6da34735-0aa6-4efc-88b2-81738c442f3f\") " pod="openshift-dns/dns-default-s7ltv" Apr 16 14:00:11.252067 ip-10-0-129-3 kubenswrapper[2580]: E0416 
14:00:11.252047 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:11.252146 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:11.252069 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:11.252146 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:11.252121 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls podName:6da34735-0aa6-4efc-88b2-81738c442f3f nodeName:}" failed. No retries permitted until 2026-04-16 14:00:12.252100998 +0000 UTC m=+35.688266712 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls") pod "dns-default-s7ltv" (UID: "6da34735-0aa6-4efc-88b2-81738c442f3f") : secret "dns-default-metrics-tls" not found Apr 16 14:00:11.252146 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:11.252139 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert podName:4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:12.252129929 +0000 UTC m=+35.688295634 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert") pod "ingress-canary-bnz8h" (UID: "4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8") : secret "canary-serving-cert" not found Apr 16 14:00:12.259731 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:12.259693 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert\") pod \"ingress-canary-bnz8h\" (UID: \"4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8\") " pod="openshift-ingress-canary/ingress-canary-bnz8h" Apr 16 14:00:12.259731 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:12.259738 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls\") pod \"dns-default-s7ltv\" (UID: \"6da34735-0aa6-4efc-88b2-81738c442f3f\") " pod="openshift-dns/dns-default-s7ltv" Apr 16 14:00:12.260429 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:12.259829 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:12.260429 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:12.259850 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:12.260429 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:12.259884 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls podName:6da34735-0aa6-4efc-88b2-81738c442f3f nodeName:}" failed. No retries permitted until 2026-04-16 14:00:14.259869783 +0000 UTC m=+37.696035491 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls") pod "dns-default-s7ltv" (UID: "6da34735-0aa6-4efc-88b2-81738c442f3f") : secret "dns-default-metrics-tls" not found Apr 16 14:00:12.260429 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:12.259902 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert podName:4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:14.25989383 +0000 UTC m=+37.696059536 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert") pod "ingress-canary-bnz8h" (UID: "4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8") : secret "canary-serving-cert" not found Apr 16 14:00:14.275521 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:14.275485 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert\") pod \"ingress-canary-bnz8h\" (UID: \"4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8\") " pod="openshift-ingress-canary/ingress-canary-bnz8h" Apr 16 14:00:14.276109 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:14.275538 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls\") pod \"dns-default-s7ltv\" (UID: \"6da34735-0aa6-4efc-88b2-81738c442f3f\") " pod="openshift-dns/dns-default-s7ltv" Apr 16 14:00:14.276109 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:14.275642 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:14.276109 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:14.275715 2580 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:14.276109 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:14.275743 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert podName:4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:18.275720391 +0000 UTC m=+41.711886112 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert") pod "ingress-canary-bnz8h" (UID: "4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8") : secret "canary-serving-cert" not found Apr 16 14:00:14.276109 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:14.275768 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls podName:6da34735-0aa6-4efc-88b2-81738c442f3f nodeName:}" failed. No retries permitted until 2026-04-16 14:00:18.275752621 +0000 UTC m=+41.711918343 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls") pod "dns-default-s7ltv" (UID: "6da34735-0aa6-4efc-88b2-81738c442f3f") : secret "dns-default-metrics-tls" not found Apr 16 14:00:15.298618 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:15.298588 2580 generic.go:358] "Generic (PLEG): container finished" podID="acddcee2-ab55-4a6b-8b63-9793ffc842d3" containerID="c167171d616baf7c930941cdf33653582dab3e9777643f07377e06d948c833bf" exitCode=0 Apr 16 14:00:15.299043 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:15.298635 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-86xbr" event={"ID":"acddcee2-ab55-4a6b-8b63-9793ffc842d3","Type":"ContainerDied","Data":"c167171d616baf7c930941cdf33653582dab3e9777643f07377e06d948c833bf"} Apr 16 14:00:16.302731 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:16.302699 2580 generic.go:358] "Generic (PLEG): container finished" podID="acddcee2-ab55-4a6b-8b63-9793ffc842d3" containerID="92fabb0797ac450a46e1eb10b7a12175946cabb9304c04a2386fb18e7972d4c5" exitCode=0 Apr 16 14:00:16.303131 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:16.302766 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-86xbr" event={"ID":"acddcee2-ab55-4a6b-8b63-9793ffc842d3","Type":"ContainerDied","Data":"92fabb0797ac450a46e1eb10b7a12175946cabb9304c04a2386fb18e7972d4c5"} Apr 16 14:00:17.306908 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:17.306873 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-86xbr" event={"ID":"acddcee2-ab55-4a6b-8b63-9793ffc842d3","Type":"ContainerStarted","Data":"8fd2651feeb784728c0d590d22a334db8f25d9ea6528447917f18101f98622d8"} Apr 16 14:00:17.333998 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:17.333942 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-86xbr" podStartSLOduration=5.441049629 podStartE2EDuration="40.333927638s" podCreationTimestamp="2026-04-16 13:59:37 +0000 UTC" firstStartedPulling="2026-04-16 13:59:40.039783049 +0000 UTC m=+3.475948759" lastFinishedPulling="2026-04-16 14:00:14.932661048 +0000 UTC m=+38.368826768" observedRunningTime="2026-04-16 14:00:17.332470667 +0000 UTC m=+40.768636395" watchObservedRunningTime="2026-04-16 14:00:17.333927638 +0000 UTC m=+40.770093363" Apr 16 14:00:18.305087 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:18.305037 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls\") pod \"dns-default-s7ltv\" (UID: \"6da34735-0aa6-4efc-88b2-81738c442f3f\") " pod="openshift-dns/dns-default-s7ltv" Apr 16 14:00:18.305317 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:18.305137 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert\") pod \"ingress-canary-bnz8h\" (UID: \"4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8\") " pod="openshift-ingress-canary/ingress-canary-bnz8h" Apr 16 14:00:18.305317 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:18.305196 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:18.305317 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:18.305225 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:18.305317 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:18.305263 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls podName:6da34735-0aa6-4efc-88b2-81738c442f3f nodeName:}" failed. 
No retries permitted until 2026-04-16 14:00:26.305248455 +0000 UTC m=+49.741414166 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls") pod "dns-default-s7ltv" (UID: "6da34735-0aa6-4efc-88b2-81738c442f3f") : secret "dns-default-metrics-tls" not found Apr 16 14:00:18.305317 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:18.305301 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert podName:4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:26.305294777 +0000 UTC m=+49.741460486 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert") pod "ingress-canary-bnz8h" (UID: "4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8") : secret "canary-serving-cert" not found Apr 16 14:00:26.362946 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:26.362905 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert\") pod \"ingress-canary-bnz8h\" (UID: \"4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8\") " pod="openshift-ingress-canary/ingress-canary-bnz8h" Apr 16 14:00:26.362946 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:26.362951 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls\") pod \"dns-default-s7ltv\" (UID: \"6da34735-0aa6-4efc-88b2-81738c442f3f\") " pod="openshift-dns/dns-default-s7ltv" Apr 16 14:00:26.363609 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:26.363094 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:26.363609 ip-10-0-129-3 
kubenswrapper[2580]: E0416 14:00:26.363145 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:26.363609 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:26.363194 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls podName:6da34735-0aa6-4efc-88b2-81738c442f3f nodeName:}" failed. No retries permitted until 2026-04-16 14:00:42.363179529 +0000 UTC m=+65.799345234 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls") pod "dns-default-s7ltv" (UID: "6da34735-0aa6-4efc-88b2-81738c442f3f") : secret "dns-default-metrics-tls" not found Apr 16 14:00:26.363609 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:26.363219 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert podName:4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:42.363200398 +0000 UTC m=+65.799366118 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert") pod "ingress-canary-bnz8h" (UID: "4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8") : secret "canary-serving-cert" not found Apr 16 14:00:37.287502 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:37.287476 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kgf8n" Apr 16 14:00:42.373583 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:42.373549 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert\") pod \"ingress-canary-bnz8h\" (UID: \"4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8\") " pod="openshift-ingress-canary/ingress-canary-bnz8h" Apr 16 14:00:42.373583 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:42.373586 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls\") pod \"dns-default-s7ltv\" (UID: \"6da34735-0aa6-4efc-88b2-81738c442f3f\") " pod="openshift-dns/dns-default-s7ltv" Apr 16 14:00:42.374018 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:42.373679 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:42.374018 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:42.373684 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:42.374018 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:42.373736 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls podName:6da34735-0aa6-4efc-88b2-81738c442f3f nodeName:}" failed. 
No retries permitted until 2026-04-16 14:01:14.37372188 +0000 UTC m=+97.809887591 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls") pod "dns-default-s7ltv" (UID: "6da34735-0aa6-4efc-88b2-81738c442f3f") : secret "dns-default-metrics-tls" not found Apr 16 14:00:42.374018 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:42.373748 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert podName:4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:14.373743076 +0000 UTC m=+97.809908785 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert") pod "ingress-canary-bnz8h" (UID: "4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8") : secret "canary-serving-cert" not found Apr 16 14:00:42.876397 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:42.876363 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs\") pod \"network-metrics-daemon-cptr8\" (UID: \"aef30458-23ff-40ab-ad5a-ae58af58ca82\") " pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 14:00:42.879077 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:42.879061 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:00:42.887167 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:42.887147 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:00:42.887224 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:00:42.887211 2580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs podName:aef30458-23ff-40ab-ad5a-ae58af58ca82 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:46.887195706 +0000 UTC m=+130.323361412 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs") pod "network-metrics-daemon-cptr8" (UID: "aef30458-23ff-40ab-ad5a-ae58af58ca82") : secret "metrics-daemon-secret" not found Apr 16 14:00:42.977510 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:42.977456 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxfh5\" (UniqueName: \"kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5\") pod \"network-check-target-zg9zc\" (UID: \"a4cc786e-e069-4dfc-9be8-98f1a73b9bcb\") " pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 14:00:42.980728 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:42.980704 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:00:42.990705 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:42.990677 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:00:43.001383 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:43.001350 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxfh5\" (UniqueName: \"kubernetes.io/projected/a4cc786e-e069-4dfc-9be8-98f1a73b9bcb-kube-api-access-pxfh5\") pod \"network-check-target-zg9zc\" (UID: \"a4cc786e-e069-4dfc-9be8-98f1a73b9bcb\") " pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 14:00:43.265951 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:43.265870 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-m8ds6\"" Apr 16 14:00:43.274141 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:43.274109 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 14:00:43.399927 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:43.399895 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zg9zc"] Apr 16 14:00:43.403841 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:00:43.403813 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4cc786e_e069_4dfc_9be8_98f1a73b9bcb.slice/crio-3250e3b5f202c1c8d3387dd7a662534f495c6da4634450e043cbfe18db628717 WatchSource:0}: Error finding container 3250e3b5f202c1c8d3387dd7a662534f495c6da4634450e043cbfe18db628717: Status 404 returned error can't find the container with id 3250e3b5f202c1c8d3387dd7a662534f495c6da4634450e043cbfe18db628717 Apr 16 14:00:44.355763 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:44.355717 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zg9zc" event={"ID":"a4cc786e-e069-4dfc-9be8-98f1a73b9bcb","Type":"ContainerStarted","Data":"3250e3b5f202c1c8d3387dd7a662534f495c6da4634450e043cbfe18db628717"} Apr 16 14:00:47.362503 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:47.362463 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zg9zc" event={"ID":"a4cc786e-e069-4dfc-9be8-98f1a73b9bcb","Type":"ContainerStarted","Data":"4b9bebbd25b65a6cd8069d675904732af5ca49d5d58b0cb6ce7edeca01cb5292"} Apr 16 14:00:47.362974 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:47.362604 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 14:00:47.378754 
ip-10-0-129-3 kubenswrapper[2580]: I0416 14:00:47.378698 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-zg9zc" podStartSLOduration=67.341756651 podStartE2EDuration="1m10.378683996s" podCreationTimestamp="2026-04-16 13:59:37 +0000 UTC" firstStartedPulling="2026-04-16 14:00:43.406119882 +0000 UTC m=+66.842285591" lastFinishedPulling="2026-04-16 14:00:46.443047231 +0000 UTC m=+69.879212936" observedRunningTime="2026-04-16 14:00:47.377633291 +0000 UTC m=+70.813799018" watchObservedRunningTime="2026-04-16 14:00:47.378683996 +0000 UTC m=+70.814849723" Apr 16 14:01:14.389608 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:14.389450 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert\") pod \"ingress-canary-bnz8h\" (UID: \"4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8\") " pod="openshift-ingress-canary/ingress-canary-bnz8h" Apr 16 14:01:14.389608 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:14.389513 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls\") pod \"dns-default-s7ltv\" (UID: \"6da34735-0aa6-4efc-88b2-81738c442f3f\") " pod="openshift-dns/dns-default-s7ltv" Apr 16 14:01:14.389608 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:14.389594 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:01:14.390137 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:14.389635 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:01:14.390137 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:14.389678 2580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert podName:4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8 nodeName:}" failed. No retries permitted until 2026-04-16 14:02:18.389662905 +0000 UTC m=+161.825828610 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert") pod "ingress-canary-bnz8h" (UID: "4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8") : secret "canary-serving-cert" not found Apr 16 14:01:14.390137 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:14.389701 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls podName:6da34735-0aa6-4efc-88b2-81738c442f3f nodeName:}" failed. No retries permitted until 2026-04-16 14:02:18.389684462 +0000 UTC m=+161.825850173 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls") pod "dns-default-s7ltv" (UID: "6da34735-0aa6-4efc-88b2-81738c442f3f") : secret "dns-default-metrics-tls" not found Apr 16 14:01:15.704807 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.704770 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-24gbs"] Apr 16 14:01:15.708853 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.708828 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" Apr 16 14:01:15.711484 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.711461 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 14:01:15.711606 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.711461 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:01:15.711606 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.711470 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 14:01:15.712410 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.712385 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-jq72r\"" Apr 16 14:01:15.712615 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.712601 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:01:15.716603 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.716582 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 14:01:15.719178 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.719160 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-24gbs"] Apr 16 14:01:15.798104 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.798067 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-799pg\" (UniqueName: \"kubernetes.io/projected/bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38-kube-api-access-799pg\") pod \"insights-operator-5785d4fcdd-24gbs\" (UID: \"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38\") " 
pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" Apr 16 14:01:15.798104 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.798108 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38-serving-cert\") pod \"insights-operator-5785d4fcdd-24gbs\" (UID: \"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38\") " pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" Apr 16 14:01:15.798376 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.798139 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38-snapshots\") pod \"insights-operator-5785d4fcdd-24gbs\" (UID: \"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38\") " pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" Apr 16 14:01:15.798376 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.798222 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38-tmp\") pod \"insights-operator-5785d4fcdd-24gbs\" (UID: \"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38\") " pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" Apr 16 14:01:15.798376 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.798283 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-24gbs\" (UID: \"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38\") " pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" Apr 16 14:01:15.798376 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.798318 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-24gbs\" (UID: \"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38\") " pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" Apr 16 14:01:15.810939 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.810891 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wv92n"] Apr 16 14:01:15.813724 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.813704 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-mqmmd"] Apr 16 14:01:15.813893 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.813877 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wv92n" Apr 16 14:01:15.816050 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.816031 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx"] Apr 16 14:01:15.816226 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.816186 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" Apr 16 14:01:15.816629 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.816606 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-mljr8\"" Apr 16 14:01:15.816771 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.816754 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:01:15.816911 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.816891 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 14:01:15.817024 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.816980 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 14:01:15.817098 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.817082 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 14:01:15.818501 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.818484 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx" Apr 16 14:01:15.818650 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.818629 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:01:15.818940 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.818924 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 14:01:15.819425 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.819404 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 14:01:15.819883 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.819857 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 14:01:15.821011 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.820990 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 14:01:15.821240 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.821223 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-mrxdq\"" Apr 16 14:01:15.821340 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.821226 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-bhxnh\"" Apr 16 14:01:15.821521 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.821502 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:01:15.821607 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.821523 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:01:15.823163 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.823129 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 14:01:15.827382 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.827359 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wv92n"] Apr 16 14:01:15.827488 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.827397 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-mqmmd"] Apr 16 14:01:15.827488 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.827417 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 14:01:15.840125 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.840102 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx"] Apr 16 14:01:15.898909 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.898869 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlrbz\" (UniqueName: \"kubernetes.io/projected/93bf1779-6f22-4509-a332-64a1d071a5a0-kube-api-access-mlrbz\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wv92n\" (UID: \"93bf1779-6f22-4509-a332-64a1d071a5a0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wv92n" Apr 16 14:01:15.899075 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.898920 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38-snapshots\") pod \"insights-operator-5785d4fcdd-24gbs\" (UID: 
\"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38\") " pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" Apr 16 14:01:15.899075 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.898950 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93bf1779-6f22-4509-a332-64a1d071a5a0-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wv92n\" (UID: \"93bf1779-6f22-4509-a332-64a1d071a5a0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wv92n" Apr 16 14:01:15.899075 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.898981 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-ppndx\" (UID: \"d9bf088f-26e6-41c0-bdc3-ea00c62c1255\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx" Apr 16 14:01:15.899075 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.899005 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqzjs\" (UniqueName: \"kubernetes.io/projected/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-kube-api-access-sqzjs\") pod \"cluster-monitoring-operator-6667474d89-ppndx\" (UID: \"d9bf088f-26e6-41c0-bdc3-ea00c62c1255\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx" Apr 16 14:01:15.899075 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.899036 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38-tmp\") pod \"insights-operator-5785d4fcdd-24gbs\" (UID: \"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38\") " pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" Apr 16 14:01:15.899075 
ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.899060 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ppndx\" (UID: \"d9bf088f-26e6-41c0-bdc3-ea00c62c1255\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx" Apr 16 14:01:15.899442 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.899105 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-24gbs\" (UID: \"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38\") " pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" Apr 16 14:01:15.899442 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.899207 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-24gbs\" (UID: \"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38\") " pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" Apr 16 14:01:15.899442 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.899304 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49030659-7d98-49ee-844f-41ff4d22d449-config\") pod \"console-operator-d87b8d5fc-mqmmd\" (UID: \"49030659-7d98-49ee-844f-41ff4d22d449\") " pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" Apr 16 14:01:15.899442 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.899348 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-799pg\" (UniqueName: 
\"kubernetes.io/projected/bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38-kube-api-access-799pg\") pod \"insights-operator-5785d4fcdd-24gbs\" (UID: \"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38\") " pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" Apr 16 14:01:15.899442 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.899403 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93bf1779-6f22-4509-a332-64a1d071a5a0-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wv92n\" (UID: \"93bf1779-6f22-4509-a332-64a1d071a5a0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wv92n" Apr 16 14:01:15.899655 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.899448 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49030659-7d98-49ee-844f-41ff4d22d449-serving-cert\") pod \"console-operator-d87b8d5fc-mqmmd\" (UID: \"49030659-7d98-49ee-844f-41ff4d22d449\") " pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" Apr 16 14:01:15.899655 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.899476 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcfjs\" (UniqueName: \"kubernetes.io/projected/49030659-7d98-49ee-844f-41ff4d22d449-kube-api-access-pcfjs\") pod \"console-operator-d87b8d5fc-mqmmd\" (UID: \"49030659-7d98-49ee-844f-41ff4d22d449\") " pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" Apr 16 14:01:15.899655 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.899509 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38-serving-cert\") pod \"insights-operator-5785d4fcdd-24gbs\" (UID: 
\"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38\") " pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" Apr 16 14:01:15.899655 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.899537 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49030659-7d98-49ee-844f-41ff4d22d449-trusted-ca\") pod \"console-operator-d87b8d5fc-mqmmd\" (UID: \"49030659-7d98-49ee-844f-41ff4d22d449\") " pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" Apr 16 14:01:15.899655 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.899639 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38-tmp\") pod \"insights-operator-5785d4fcdd-24gbs\" (UID: \"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38\") " pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" Apr 16 14:01:15.899819 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.899698 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38-snapshots\") pod \"insights-operator-5785d4fcdd-24gbs\" (UID: \"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38\") " pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" Apr 16 14:01:15.899819 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.899724 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-24gbs\" (UID: \"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38\") " pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" Apr 16 14:01:15.900058 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.900037 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-24gbs\" (UID: \"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38\") " pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" Apr 16 14:01:15.901970 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.901952 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38-serving-cert\") pod \"insights-operator-5785d4fcdd-24gbs\" (UID: \"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38\") " pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" Apr 16 14:01:15.909996 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.909952 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-cwrlm"] Apr 16 14:01:15.919970 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.919939 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-799pg\" (UniqueName: \"kubernetes.io/projected/bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38-kube-api-access-799pg\") pod \"insights-operator-5785d4fcdd-24gbs\" (UID: \"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38\") " pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" Apr 16 14:01:15.922600 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.922577 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gpwrf"] Apr 16 14:01:15.922759 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.922730 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-cwrlm" Apr 16 14:01:15.925855 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.925832 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-qzx84\"" Apr 16 14:01:15.931694 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.931666 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-69db645876-4xdsr"] Apr 16 14:01:15.931831 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.931812 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gpwrf" Apr 16 14:01:15.934763 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.934733 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 14:01:15.934903 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.934740 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-2dv8z\"" Apr 16 14:01:15.934903 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.934740 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:01:15.934903 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.934865 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 14:01:15.941948 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.941927 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-cwrlm"] Apr 16 14:01:15.941948 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.941949 2580 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gpwrf"] Apr 16 14:01:15.942098 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.941959 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69db645876-4xdsr"] Apr 16 14:01:15.942098 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.942057 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:15.944657 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.944633 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jfbd2\"" Apr 16 14:01:15.944771 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.944632 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 14:01:15.944771 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.944633 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 14:01:15.944771 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.944734 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 14:01:15.950844 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:15.950823 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 14:01:16.000213 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000115 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93bf1779-6f22-4509-a332-64a1d071a5a0-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wv92n\" (UID: \"93bf1779-6f22-4509-a332-64a1d071a5a0\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wv92n" Apr 16 14:01:16.000213 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000155 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-ppndx\" (UID: \"d9bf088f-26e6-41c0-bdc3-ea00c62c1255\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx" Apr 16 14:01:16.000213 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000181 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/60b6666f-de32-4f0e-a9b0-cf858767b237-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-gpwrf\" (UID: \"60b6666f-de32-4f0e-a9b0-cf858767b237\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gpwrf" Apr 16 14:01:16.000213 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000209 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-certificates\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.000558 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000245 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93bf1779-6f22-4509-a332-64a1d071a5a0-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wv92n\" (UID: \"93bf1779-6f22-4509-a332-64a1d071a5a0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wv92n" Apr 16 
14:01:16.000558 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000384 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49030659-7d98-49ee-844f-41ff4d22d449-config\") pod \"console-operator-d87b8d5fc-mqmmd\" (UID: \"49030659-7d98-49ee-844f-41ff4d22d449\") " pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" Apr 16 14:01:16.000558 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000423 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.000558 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000443 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g86x6\" (UniqueName: \"kubernetes.io/projected/60b6666f-de32-4f0e-a9b0-cf858767b237-kube-api-access-g86x6\") pod \"cluster-samples-operator-667775844f-gpwrf\" (UID: \"60b6666f-de32-4f0e-a9b0-cf858767b237\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gpwrf" Apr 16 14:01:16.000558 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000476 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrl8f\" (UniqueName: \"kubernetes.io/projected/63d5ca74-da50-429d-abfb-de1dfd0f7646-kube-api-access-lrl8f\") pod \"network-check-source-7b678d77c7-cwrlm\" (UID: \"63d5ca74-da50-429d-abfb-de1dfd0f7646\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-cwrlm" Apr 16 14:01:16.000558 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000498 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/94f41185-ca08-41f9-bc9f-f22802de6d09-installation-pull-secrets\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.000558 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000528 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqzjs\" (UniqueName: \"kubernetes.io/projected/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-kube-api-access-sqzjs\") pod \"cluster-monitoring-operator-6667474d89-ppndx\" (UID: \"d9bf088f-26e6-41c0-bdc3-ea00c62c1255\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx" Apr 16 14:01:16.000885 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000562 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ppndx\" (UID: \"d9bf088f-26e6-41c0-bdc3-ea00c62c1255\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx" Apr 16 14:01:16.000885 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000618 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49030659-7d98-49ee-844f-41ff4d22d449-serving-cert\") pod \"console-operator-d87b8d5fc-mqmmd\" (UID: \"49030659-7d98-49ee-844f-41ff4d22d449\") " pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" Apr 16 14:01:16.000885 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000643 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcfjs\" (UniqueName: \"kubernetes.io/projected/49030659-7d98-49ee-844f-41ff4d22d449-kube-api-access-pcfjs\") pod \"console-operator-d87b8d5fc-mqmmd\" (UID: 
\"49030659-7d98-49ee-844f-41ff4d22d449\") " pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" Apr 16 14:01:16.000885 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000682 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49030659-7d98-49ee-844f-41ff4d22d449-trusted-ca\") pod \"console-operator-d87b8d5fc-mqmmd\" (UID: \"49030659-7d98-49ee-844f-41ff4d22d449\") " pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" Apr 16 14:01:16.000885 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000711 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrsv7\" (UniqueName: \"kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-kube-api-access-hrsv7\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.000885 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:16.000730 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:01:16.000885 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000743 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/94f41185-ca08-41f9-bc9f-f22802de6d09-ca-trust-extracted\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.000885 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000766 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-bound-sa-token\") pod 
\"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.000885 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000793 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlrbz\" (UniqueName: \"kubernetes.io/projected/93bf1779-6f22-4509-a332-64a1d071a5a0-kube-api-access-mlrbz\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wv92n\" (UID: \"93bf1779-6f22-4509-a332-64a1d071a5a0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wv92n" Apr 16 14:01:16.000885 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000816 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/94f41185-ca08-41f9-bc9f-f22802de6d09-image-registry-private-configuration\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.000885 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000850 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94f41185-ca08-41f9-bc9f-f22802de6d09-trusted-ca\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.001404 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:16.000929 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-cluster-monitoring-operator-tls podName:d9bf088f-26e6-41c0-bdc3-ea00c62c1255 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:16.500911105 +0000 UTC m=+99.937076817 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-ppndx" (UID: "d9bf088f-26e6-41c0-bdc3-ea00c62c1255") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:01:16.001404 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.000941 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93bf1779-6f22-4509-a332-64a1d071a5a0-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wv92n\" (UID: \"93bf1779-6f22-4509-a332-64a1d071a5a0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wv92n" Apr 16 14:01:16.001404 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.001041 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-ppndx\" (UID: \"d9bf088f-26e6-41c0-bdc3-ea00c62c1255\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx" Apr 16 14:01:16.001404 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.001242 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49030659-7d98-49ee-844f-41ff4d22d449-config\") pod \"console-operator-d87b8d5fc-mqmmd\" (UID: \"49030659-7d98-49ee-844f-41ff4d22d449\") " pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" Apr 16 14:01:16.001684 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.001667 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49030659-7d98-49ee-844f-41ff4d22d449-trusted-ca\") pod \"console-operator-d87b8d5fc-mqmmd\" (UID: 
\"49030659-7d98-49ee-844f-41ff4d22d449\") " pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" Apr 16 14:01:16.002716 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.002696 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93bf1779-6f22-4509-a332-64a1d071a5a0-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wv92n\" (UID: \"93bf1779-6f22-4509-a332-64a1d071a5a0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wv92n" Apr 16 14:01:16.003293 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.003255 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49030659-7d98-49ee-844f-41ff4d22d449-serving-cert\") pod \"console-operator-d87b8d5fc-mqmmd\" (UID: \"49030659-7d98-49ee-844f-41ff4d22d449\") " pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" Apr 16 14:01:16.016079 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.016042 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqzjs\" (UniqueName: \"kubernetes.io/projected/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-kube-api-access-sqzjs\") pod \"cluster-monitoring-operator-6667474d89-ppndx\" (UID: \"d9bf088f-26e6-41c0-bdc3-ea00c62c1255\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx" Apr 16 14:01:16.016398 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.016376 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlrbz\" (UniqueName: \"kubernetes.io/projected/93bf1779-6f22-4509-a332-64a1d071a5a0-kube-api-access-mlrbz\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wv92n\" (UID: \"93bf1779-6f22-4509-a332-64a1d071a5a0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wv92n" Apr 16 14:01:16.016476 
ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.016394 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcfjs\" (UniqueName: \"kubernetes.io/projected/49030659-7d98-49ee-844f-41ff4d22d449-kube-api-access-pcfjs\") pod \"console-operator-d87b8d5fc-mqmmd\" (UID: \"49030659-7d98-49ee-844f-41ff4d22d449\") " pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" Apr 16 14:01:16.018221 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.018206 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" Apr 16 14:01:16.101835 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.101792 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/94f41185-ca08-41f9-bc9f-f22802de6d09-ca-trust-extracted\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.101993 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.101841 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-bound-sa-token\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.101993 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.101871 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/94f41185-ca08-41f9-bc9f-f22802de6d09-image-registry-private-configuration\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.101993 ip-10-0-129-3 
kubenswrapper[2580]: I0416 14:01:16.101905 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94f41185-ca08-41f9-bc9f-f22802de6d09-trusted-ca\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.101993 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.101938 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/60b6666f-de32-4f0e-a9b0-cf858767b237-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-gpwrf\" (UID: \"60b6666f-de32-4f0e-a9b0-cf858767b237\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gpwrf" Apr 16 14:01:16.102203 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:16.102036 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:01:16.102203 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:16.102108 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60b6666f-de32-4f0e-a9b0-cf858767b237-samples-operator-tls podName:60b6666f-de32-4f0e-a9b0-cf858767b237 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:16.60208793 +0000 UTC m=+100.038253642 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/60b6666f-de32-4f0e-a9b0-cf858767b237-samples-operator-tls") pod "cluster-samples-operator-667775844f-gpwrf" (UID: "60b6666f-de32-4f0e-a9b0-cf858767b237") : secret "samples-operator-tls" not found Apr 16 14:01:16.102372 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.102262 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/94f41185-ca08-41f9-bc9f-f22802de6d09-ca-trust-extracted\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.102372 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.102302 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-certificates\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.102372 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.102373 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.102558 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.102405 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g86x6\" (UniqueName: \"kubernetes.io/projected/60b6666f-de32-4f0e-a9b0-cf858767b237-kube-api-access-g86x6\") pod \"cluster-samples-operator-667775844f-gpwrf\" (UID: \"60b6666f-de32-4f0e-a9b0-cf858767b237\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gpwrf" Apr 16 14:01:16.102558 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.102448 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrl8f\" (UniqueName: \"kubernetes.io/projected/63d5ca74-da50-429d-abfb-de1dfd0f7646-kube-api-access-lrl8f\") pod \"network-check-source-7b678d77c7-cwrlm\" (UID: \"63d5ca74-da50-429d-abfb-de1dfd0f7646\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-cwrlm" Apr 16 14:01:16.102558 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:16.102474 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:01:16.102558 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:16.102494 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69db645876-4xdsr: secret "image-registry-tls" not found Apr 16 14:01:16.102558 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:16.102546 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls podName:94f41185-ca08-41f9-bc9f-f22802de6d09 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:16.602529334 +0000 UTC m=+100.038695042 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls") pod "image-registry-69db645876-4xdsr" (UID: "94f41185-ca08-41f9-bc9f-f22802de6d09") : secret "image-registry-tls" not found Apr 16 14:01:16.102837 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.102478 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/94f41185-ca08-41f9-bc9f-f22802de6d09-installation-pull-secrets\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.102837 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.102676 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrsv7\" (UniqueName: \"kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-kube-api-access-hrsv7\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.103479 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.103453 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-certificates\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.103657 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.103638 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94f41185-ca08-41f9-bc9f-f22802de6d09-trusted-ca\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " 
pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.104795 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.104776 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/94f41185-ca08-41f9-bc9f-f22802de6d09-installation-pull-secrets\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.104904 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.104867 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/94f41185-ca08-41f9-bc9f-f22802de6d09-image-registry-private-configuration\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.114439 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.114407 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-bound-sa-token\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.114869 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.114824 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g86x6\" (UniqueName: \"kubernetes.io/projected/60b6666f-de32-4f0e-a9b0-cf858767b237-kube-api-access-g86x6\") pod \"cluster-samples-operator-667775844f-gpwrf\" (UID: \"60b6666f-de32-4f0e-a9b0-cf858767b237\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gpwrf" Apr 16 14:01:16.115402 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.115382 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hrsv7\" (UniqueName: \"kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-kube-api-access-hrsv7\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.115497 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.115402 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrl8f\" (UniqueName: \"kubernetes.io/projected/63d5ca74-da50-429d-abfb-de1dfd0f7646-kube-api-access-lrl8f\") pod \"network-check-source-7b678d77c7-cwrlm\" (UID: \"63d5ca74-da50-429d-abfb-de1dfd0f7646\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-cwrlm" Apr 16 14:01:16.124393 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.124369 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wv92n" Apr 16 14:01:16.133096 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.133074 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" Apr 16 14:01:16.140216 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.140183 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-24gbs"] Apr 16 14:01:16.143703 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:01:16.143663 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb1e49f0_eb8d_46d6_a5c8_7bfad3071e38.slice/crio-a423fe2560c63fc939a768c554fc760a1b1d5e89d77f9e9e5da3f82250de8ded WatchSource:0}: Error finding container a423fe2560c63fc939a768c554fc760a1b1d5e89d77f9e9e5da3f82250de8ded: Status 404 returned error can't find the container with id a423fe2560c63fc939a768c554fc760a1b1d5e89d77f9e9e5da3f82250de8ded Apr 16 14:01:16.233549 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.233513 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-cwrlm" Apr 16 14:01:16.255885 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.255844 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wv92n"] Apr 16 14:01:16.259112 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:01:16.259084 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93bf1779_6f22_4509_a332_64a1d071a5a0.slice/crio-4062a55e712757db6f6b76b043093b9bf00548b3b9f0e83fd6e60a3ac1dbef63 WatchSource:0}: Error finding container 4062a55e712757db6f6b76b043093b9bf00548b3b9f0e83fd6e60a3ac1dbef63: Status 404 returned error can't find the container with id 4062a55e712757db6f6b76b043093b9bf00548b3b9f0e83fd6e60a3ac1dbef63 Apr 16 14:01:16.273595 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.273560 2580 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-mqmmd"] Apr 16 14:01:16.277618 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:01:16.277589 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49030659_7d98_49ee_844f_41ff4d22d449.slice/crio-0abb5b79eb11a8862ce55e4103a3631eb082e6533798fdd35bf5de7cecb8fb7e WatchSource:0}: Error finding container 0abb5b79eb11a8862ce55e4103a3631eb082e6533798fdd35bf5de7cecb8fb7e: Status 404 returned error can't find the container with id 0abb5b79eb11a8862ce55e4103a3631eb082e6533798fdd35bf5de7cecb8fb7e Apr 16 14:01:16.361879 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.361842 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-cwrlm"] Apr 16 14:01:16.370438 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:01:16.370393 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63d5ca74_da50_429d_abfb_de1dfd0f7646.slice/crio-328c1d30239aff9921f6b568977fee5f3ed49bc47c2b46bb62ba515b8212f019 WatchSource:0}: Error finding container 328c1d30239aff9921f6b568977fee5f3ed49bc47c2b46bb62ba515b8212f019: Status 404 returned error can't find the container with id 328c1d30239aff9921f6b568977fee5f3ed49bc47c2b46bb62ba515b8212f019 Apr 16 14:01:16.418548 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.418518 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" event={"ID":"49030659-7d98-49ee-844f-41ff4d22d449","Type":"ContainerStarted","Data":"0abb5b79eb11a8862ce55e4103a3631eb082e6533798fdd35bf5de7cecb8fb7e"} Apr 16 14:01:16.419512 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.419485 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" 
event={"ID":"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38","Type":"ContainerStarted","Data":"a423fe2560c63fc939a768c554fc760a1b1d5e89d77f9e9e5da3f82250de8ded"} Apr 16 14:01:16.420414 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.420391 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-cwrlm" event={"ID":"63d5ca74-da50-429d-abfb-de1dfd0f7646","Type":"ContainerStarted","Data":"328c1d30239aff9921f6b568977fee5f3ed49bc47c2b46bb62ba515b8212f019"} Apr 16 14:01:16.421325 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.421305 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wv92n" event={"ID":"93bf1779-6f22-4509-a332-64a1d071a5a0","Type":"ContainerStarted","Data":"4062a55e712757db6f6b76b043093b9bf00548b3b9f0e83fd6e60a3ac1dbef63"} Apr 16 14:01:16.506307 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.506184 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ppndx\" (UID: \"d9bf088f-26e6-41c0-bdc3-ea00c62c1255\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx" Apr 16 14:01:16.506447 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:16.506366 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:01:16.506447 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:16.506434 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-cluster-monitoring-operator-tls podName:d9bf088f-26e6-41c0-bdc3-ea00c62c1255 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:01:17.506418281 +0000 UTC m=+100.942583991 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-ppndx" (UID: "d9bf088f-26e6-41c0-bdc3-ea00c62c1255") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:01:16.607682 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.607639 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/60b6666f-de32-4f0e-a9b0-cf858767b237-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-gpwrf\" (UID: \"60b6666f-de32-4f0e-a9b0-cf858767b237\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gpwrf" Apr 16 14:01:16.607682 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:16.607688 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:16.607933 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:16.607781 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:01:16.607933 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:16.607781 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:01:16.607933 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:16.607792 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69db645876-4xdsr: secret 
"image-registry-tls" not found Apr 16 14:01:16.607933 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:16.607846 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls podName:94f41185-ca08-41f9-bc9f-f22802de6d09 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:17.60783378 +0000 UTC m=+101.043999486 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls") pod "image-registry-69db645876-4xdsr" (UID: "94f41185-ca08-41f9-bc9f-f22802de6d09") : secret "image-registry-tls" not found Apr 16 14:01:16.607933 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:16.607860 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60b6666f-de32-4f0e-a9b0-cf858767b237-samples-operator-tls podName:60b6666f-de32-4f0e-a9b0-cf858767b237 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:17.607854098 +0000 UTC m=+101.044019804 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/60b6666f-de32-4f0e-a9b0-cf858767b237-samples-operator-tls") pod "cluster-samples-operator-667775844f-gpwrf" (UID: "60b6666f-de32-4f0e-a9b0-cf858767b237") : secret "samples-operator-tls" not found Apr 16 14:01:17.427429 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:17.427389 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-cwrlm" event={"ID":"63d5ca74-da50-429d-abfb-de1dfd0f7646","Type":"ContainerStarted","Data":"bb0ec34cf7c10b4b1b9cf874abfb89ff76c1a1a724c2e02d59cb08579ef65003"} Apr 16 14:01:17.444832 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:17.444778 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-cwrlm" podStartSLOduration=2.444760404 podStartE2EDuration="2.444760404s" podCreationTimestamp="2026-04-16 14:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:01:17.443816119 +0000 UTC m=+100.879981847" watchObservedRunningTime="2026-04-16 14:01:17.444760404 +0000 UTC m=+100.880926134" Apr 16 14:01:17.521311 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:17.521247 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ppndx\" (UID: \"d9bf088f-26e6-41c0-bdc3-ea00c62c1255\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx" Apr 16 14:01:17.521486 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:17.521444 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:01:17.521541 
ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:17.521506 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-cluster-monitoring-operator-tls podName:d9bf088f-26e6-41c0-bdc3-ea00c62c1255 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:19.521491426 +0000 UTC m=+102.957657136 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-ppndx" (UID: "d9bf088f-26e6-41c0-bdc3-ea00c62c1255") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:01:17.622399 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:17.622354 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/60b6666f-de32-4f0e-a9b0-cf858767b237-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-gpwrf\" (UID: \"60b6666f-de32-4f0e-a9b0-cf858767b237\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gpwrf" Apr 16 14:01:17.622586 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:17.622444 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:17.622586 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:17.622525 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:01:17.622586 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:17.622545 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret 
"image-registry-tls" not found Apr 16 14:01:17.622586 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:17.622558 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69db645876-4xdsr: secret "image-registry-tls" not found Apr 16 14:01:17.622863 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:17.622606 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60b6666f-de32-4f0e-a9b0-cf858767b237-samples-operator-tls podName:60b6666f-de32-4f0e-a9b0-cf858767b237 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:19.622585796 +0000 UTC m=+103.058751506 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/60b6666f-de32-4f0e-a9b0-cf858767b237-samples-operator-tls") pod "cluster-samples-operator-667775844f-gpwrf" (UID: "60b6666f-de32-4f0e-a9b0-cf858767b237") : secret "samples-operator-tls" not found Apr 16 14:01:17.622863 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:17.622626 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls podName:94f41185-ca08-41f9-bc9f-f22802de6d09 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:19.622618793 +0000 UTC m=+103.058784500 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls") pod "image-registry-69db645876-4xdsr" (UID: "94f41185-ca08-41f9-bc9f-f22802de6d09") : secret "image-registry-tls" not found Apr 16 14:01:18.368023 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:18.367995 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zg9zc" Apr 16 14:01:19.434802 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:19.434769 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-mqmmd_49030659-7d98-49ee-844f-41ff4d22d449/console-operator/0.log" Apr 16 14:01:19.435223 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:19.434819 2580 generic.go:358] "Generic (PLEG): container finished" podID="49030659-7d98-49ee-844f-41ff4d22d449" containerID="a598556869b465782224a745e28f23374b760b2ae6a6137dd515978b18c6241f" exitCode=255 Apr 16 14:01:19.435223 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:19.434938 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" event={"ID":"49030659-7d98-49ee-844f-41ff4d22d449","Type":"ContainerDied","Data":"a598556869b465782224a745e28f23374b760b2ae6a6137dd515978b18c6241f"} Apr 16 14:01:19.435223 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:19.435195 2580 scope.go:117] "RemoveContainer" containerID="a598556869b465782224a745e28f23374b760b2ae6a6137dd515978b18c6241f" Apr 16 14:01:19.436489 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:19.436458 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wv92n" event={"ID":"93bf1779-6f22-4509-a332-64a1d071a5a0","Type":"ContainerStarted","Data":"5181e2dd55d88588bdd619d0622580d42cddcfcdf80e9d2a7b0148c531f2de52"} Apr 16 14:01:19.473108 
ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:19.473057 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wv92n" podStartSLOduration=1.98253432 podStartE2EDuration="4.473042944s" podCreationTimestamp="2026-04-16 14:01:15 +0000 UTC" firstStartedPulling="2026-04-16 14:01:16.264419585 +0000 UTC m=+99.700585294" lastFinishedPulling="2026-04-16 14:01:18.754928212 +0000 UTC m=+102.191093918" observedRunningTime="2026-04-16 14:01:19.472139623 +0000 UTC m=+102.908305351" watchObservedRunningTime="2026-04-16 14:01:19.473042944 +0000 UTC m=+102.909208699" Apr 16 14:01:19.539456 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:19.539412 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ppndx\" (UID: \"d9bf088f-26e6-41c0-bdc3-ea00c62c1255\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx" Apr 16 14:01:19.539734 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:19.539691 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:01:19.539802 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:19.539752 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-cluster-monitoring-operator-tls podName:d9bf088f-26e6-41c0-bdc3-ea00c62c1255 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:23.539732509 +0000 UTC m=+106.975898224 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-ppndx" (UID: "d9bf088f-26e6-41c0-bdc3-ea00c62c1255") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:01:19.640679 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:19.640642 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/60b6666f-de32-4f0e-a9b0-cf858767b237-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-gpwrf\" (UID: \"60b6666f-de32-4f0e-a9b0-cf858767b237\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gpwrf" Apr 16 14:01:19.640880 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:19.640703 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:19.640880 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:19.640788 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:01:19.640880 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:19.640854 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:01:19.640880 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:19.640864 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60b6666f-de32-4f0e-a9b0-cf858767b237-samples-operator-tls podName:60b6666f-de32-4f0e-a9b0-cf858767b237 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:01:23.640843032 +0000 UTC m=+107.077008755 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/60b6666f-de32-4f0e-a9b0-cf858767b237-samples-operator-tls") pod "cluster-samples-operator-667775844f-gpwrf" (UID: "60b6666f-de32-4f0e-a9b0-cf858767b237") : secret "samples-operator-tls" not found Apr 16 14:01:19.640880 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:19.640869 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69db645876-4xdsr: secret "image-registry-tls" not found Apr 16 14:01:19.641200 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:19.640900 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls podName:94f41185-ca08-41f9-bc9f-f22802de6d09 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:23.640891774 +0000 UTC m=+107.077057480 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls") pod "image-registry-69db645876-4xdsr" (UID: "94f41185-ca08-41f9-bc9f-f22802de6d09") : secret "image-registry-tls" not found Apr 16 14:01:20.439926 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:20.439899 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-mqmmd_49030659-7d98-49ee-844f-41ff4d22d449/console-operator/1.log" Apr 16 14:01:20.440383 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:20.440291 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-mqmmd_49030659-7d98-49ee-844f-41ff4d22d449/console-operator/0.log" Apr 16 14:01:20.440383 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:20.440332 2580 generic.go:358] "Generic (PLEG): container finished" podID="49030659-7d98-49ee-844f-41ff4d22d449" containerID="239a51c3f711212802f16e7dc59e044220668a5a9a44310279503a3ba3ebe839" exitCode=255 Apr 16 14:01:20.440502 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:20.440413 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" event={"ID":"49030659-7d98-49ee-844f-41ff4d22d449","Type":"ContainerDied","Data":"239a51c3f711212802f16e7dc59e044220668a5a9a44310279503a3ba3ebe839"} Apr 16 14:01:20.440502 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:20.440460 2580 scope.go:117] "RemoveContainer" containerID="a598556869b465782224a745e28f23374b760b2ae6a6137dd515978b18c6241f" Apr 16 14:01:20.440707 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:20.440686 2580 scope.go:117] "RemoveContainer" containerID="239a51c3f711212802f16e7dc59e044220668a5a9a44310279503a3ba3ebe839" Apr 16 14:01:20.440980 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:20.440960 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-mqmmd_openshift-console-operator(49030659-7d98-49ee-844f-41ff4d22d449)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" podUID="49030659-7d98-49ee-844f-41ff4d22d449" Apr 16 14:01:20.441775 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:20.441753 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" event={"ID":"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38","Type":"ContainerStarted","Data":"64ee35c5da680c46692375cb2aedd3b4961ebab998a71187935e73c2aecc0e21"} Apr 16 14:01:20.501196 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:20.501141 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" podStartSLOduration=2.07766246 podStartE2EDuration="5.501127166s" podCreationTimestamp="2026-04-16 14:01:15 +0000 UTC" firstStartedPulling="2026-04-16 14:01:16.1459118 +0000 UTC m=+99.582077509" lastFinishedPulling="2026-04-16 14:01:19.569376495 +0000 UTC m=+103.005542215" observedRunningTime="2026-04-16 14:01:20.500452628 +0000 UTC m=+103.936618379" watchObservedRunningTime="2026-04-16 14:01:20.501127166 +0000 UTC m=+103.937292893" Apr 16 14:01:21.445107 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:21.445080 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-mqmmd_49030659-7d98-49ee-844f-41ff4d22d449/console-operator/1.log" Apr 16 14:01:21.445569 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:21.445552 2580 scope.go:117] "RemoveContainer" containerID="239a51c3f711212802f16e7dc59e044220668a5a9a44310279503a3ba3ebe839" Apr 16 14:01:21.445748 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:21.445731 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: 
\"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-mqmmd_openshift-console-operator(49030659-7d98-49ee-844f-41ff4d22d449)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" podUID="49030659-7d98-49ee-844f-41ff4d22d449" Apr 16 14:01:23.570235 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:23.570196 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ppndx\" (UID: \"d9bf088f-26e6-41c0-bdc3-ea00c62c1255\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx" Apr 16 14:01:23.570659 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:23.570362 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:01:23.570659 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:23.570431 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-cluster-monitoring-operator-tls podName:d9bf088f-26e6-41c0-bdc3-ea00c62c1255 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:31.570413901 +0000 UTC m=+115.006579611 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-ppndx" (UID: "d9bf088f-26e6-41c0-bdc3-ea00c62c1255") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:01:23.670828 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:23.670786 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/60b6666f-de32-4f0e-a9b0-cf858767b237-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-gpwrf\" (UID: \"60b6666f-de32-4f0e-a9b0-cf858767b237\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gpwrf" Apr 16 14:01:23.670828 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:23.670834 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:23.671034 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:23.670930 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:01:23.671034 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:23.670992 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60b6666f-de32-4f0e-a9b0-cf858767b237-samples-operator-tls podName:60b6666f-de32-4f0e-a9b0-cf858767b237 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:31.670976777 +0000 UTC m=+115.107142482 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/60b6666f-de32-4f0e-a9b0-cf858767b237-samples-operator-tls") pod "cluster-samples-operator-667775844f-gpwrf" (UID: "60b6666f-de32-4f0e-a9b0-cf858767b237") : secret "samples-operator-tls" not found Apr 16 14:01:23.671034 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:23.670933 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:01:23.671034 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:23.671035 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69db645876-4xdsr: secret "image-registry-tls" not found Apr 16 14:01:23.671172 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:23.671082 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls podName:94f41185-ca08-41f9-bc9f-f22802de6d09 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:31.671070036 +0000 UTC m=+115.107235742 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls") pod "image-registry-69db645876-4xdsr" (UID: "94f41185-ca08-41f9-bc9f-f22802de6d09") : secret "image-registry-tls" not found Apr 16 14:01:24.318694 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:24.318670 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-thzkh_f2a769c2-0080-45a0-983a-5c1bcf200faf/dns-node-resolver/0.log" Apr 16 14:01:25.312740 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:25.312716 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xwnfr_e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd/node-ca/0.log" Apr 16 14:01:26.134055 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:26.134022 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" Apr 16 14:01:26.134055 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:26.134058 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" Apr 16 14:01:26.134417 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:26.134405 2580 scope.go:117] "RemoveContainer" containerID="239a51c3f711212802f16e7dc59e044220668a5a9a44310279503a3ba3ebe839" Apr 16 14:01:26.134584 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:01:26.134570 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-mqmmd_openshift-console-operator(49030659-7d98-49ee-844f-41ff4d22d449)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" podUID="49030659-7d98-49ee-844f-41ff4d22d449" Apr 16 14:01:26.714955 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:26.714928 2580 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-wv92n_93bf1779-6f22-4509-a332-64a1d071a5a0/kube-storage-version-migrator-operator/0.log" Apr 16 14:01:31.633451 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:31.633409 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ppndx\" (UID: \"d9bf088f-26e6-41c0-bdc3-ea00c62c1255\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx" Apr 16 14:01:31.635988 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:31.635963 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9bf088f-26e6-41c0-bdc3-ea00c62c1255-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ppndx\" (UID: \"d9bf088f-26e6-41c0-bdc3-ea00c62c1255\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx" Apr 16 14:01:31.734184 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:31.734148 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/60b6666f-de32-4f0e-a9b0-cf858767b237-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-gpwrf\" (UID: \"60b6666f-de32-4f0e-a9b0-cf858767b237\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gpwrf" Apr 16 14:01:31.734359 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:31.734193 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " 
pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:31.736707 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:31.736675 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/60b6666f-de32-4f0e-a9b0-cf858767b237-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-gpwrf\" (UID: \"60b6666f-de32-4f0e-a9b0-cf858767b237\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gpwrf" Apr 16 14:01:31.736822 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:31.736726 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls\") pod \"image-registry-69db645876-4xdsr\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:31.738578 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:31.738563 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx" Apr 16 14:01:31.842000 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:31.841969 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gpwrf" Apr 16 14:01:31.851817 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:31.851790 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:31.856365 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:31.856331 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx"] Apr 16 14:01:31.858960 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:01:31.858839 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9bf088f_26e6_41c0_bdc3_ea00c62c1255.slice/crio-42813017b2987402daff1b446a9c32bfd9a8b546ea5f9f22be1fd2fe1c01d181 WatchSource:0}: Error finding container 42813017b2987402daff1b446a9c32bfd9a8b546ea5f9f22be1fd2fe1c01d181: Status 404 returned error can't find the container with id 42813017b2987402daff1b446a9c32bfd9a8b546ea5f9f22be1fd2fe1c01d181 Apr 16 14:01:31.990254 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:31.990223 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gpwrf"] Apr 16 14:01:32.010122 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:32.010085 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69db645876-4xdsr"] Apr 16 14:01:32.013406 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:01:32.013372 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94f41185_ca08_41f9_bc9f_f22802de6d09.slice/crio-4c9c4f0b0c785341946ba453c7112c4173d02a0a8392416a54ed5a9fc6bfbe0d WatchSource:0}: Error finding container 4c9c4f0b0c785341946ba453c7112c4173d02a0a8392416a54ed5a9fc6bfbe0d: Status 404 returned error can't find the container with id 4c9c4f0b0c785341946ba453c7112c4173d02a0a8392416a54ed5a9fc6bfbe0d Apr 16 14:01:32.469909 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:32.469877 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gpwrf" event={"ID":"60b6666f-de32-4f0e-a9b0-cf858767b237","Type":"ContainerStarted","Data":"3eee06e13d4a26f050af7d55b0d5648f63aaffb482d5408a7b8e786bef26a8b4"} Apr 16 14:01:32.471071 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:32.471044 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69db645876-4xdsr" event={"ID":"94f41185-ca08-41f9-bc9f-f22802de6d09","Type":"ContainerStarted","Data":"be78499cb63ab9406203668e614e801d38f8a80bc4bd43ba5ba5747d3589b90e"} Apr 16 14:01:32.471197 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:32.471080 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69db645876-4xdsr" event={"ID":"94f41185-ca08-41f9-bc9f-f22802de6d09","Type":"ContainerStarted","Data":"4c9c4f0b0c785341946ba453c7112c4173d02a0a8392416a54ed5a9fc6bfbe0d"} Apr 16 14:01:32.471253 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:32.471242 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:01:32.472052 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:32.472032 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx" event={"ID":"d9bf088f-26e6-41c0-bdc3-ea00c62c1255","Type":"ContainerStarted","Data":"42813017b2987402daff1b446a9c32bfd9a8b546ea5f9f22be1fd2fe1c01d181"} Apr 16 14:01:32.493979 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:32.493931 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-69db645876-4xdsr" podStartSLOduration=17.493919457 podStartE2EDuration="17.493919457s" podCreationTimestamp="2026-04-16 14:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 
14:01:32.493205537 +0000 UTC m=+115.929371265" watchObservedRunningTime="2026-04-16 14:01:32.493919457 +0000 UTC m=+115.930085182" Apr 16 14:01:34.478701 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:34.478656 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx" event={"ID":"d9bf088f-26e6-41c0-bdc3-ea00c62c1255","Type":"ContainerStarted","Data":"97dc65984cd36be66b63dfce90968bd721168b6f03de75705656c5aaf439bce2"} Apr 16 14:01:34.496220 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:34.496158 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ppndx" podStartSLOduration=17.428150806 podStartE2EDuration="19.496140409s" podCreationTimestamp="2026-04-16 14:01:15 +0000 UTC" firstStartedPulling="2026-04-16 14:01:31.860679941 +0000 UTC m=+115.296845650" lastFinishedPulling="2026-04-16 14:01:33.928669547 +0000 UTC m=+117.364835253" observedRunningTime="2026-04-16 14:01:34.494617882 +0000 UTC m=+117.930783609" watchObservedRunningTime="2026-04-16 14:01:34.496140409 +0000 UTC m=+117.932306140" Apr 16 14:01:35.483197 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:35.483107 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gpwrf" event={"ID":"60b6666f-de32-4f0e-a9b0-cf858767b237","Type":"ContainerStarted","Data":"eac7056a2b2ad8caf65347e35f91b513c00e07330ba6ad9849594076315a3812"} Apr 16 14:01:35.483197 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:35.483152 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gpwrf" event={"ID":"60b6666f-de32-4f0e-a9b0-cf858767b237","Type":"ContainerStarted","Data":"cb1274f74293744a6d27bcb1003ab0ae771a97b31aa88625f8c4c3357a8ee5ca"} Apr 16 14:01:35.501096 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:35.501045 2580 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gpwrf" podStartSLOduration=17.387950604 podStartE2EDuration="20.501029362s" podCreationTimestamp="2026-04-16 14:01:15 +0000 UTC" firstStartedPulling="2026-04-16 14:01:32.024318929 +0000 UTC m=+115.460484635" lastFinishedPulling="2026-04-16 14:01:35.137397673 +0000 UTC m=+118.573563393" observedRunningTime="2026-04-16 14:01:35.500168021 +0000 UTC m=+118.936333750" watchObservedRunningTime="2026-04-16 14:01:35.501029362 +0000 UTC m=+118.937195130" Apr 16 14:01:40.148415 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:40.148383 2580 scope.go:117] "RemoveContainer" containerID="239a51c3f711212802f16e7dc59e044220668a5a9a44310279503a3ba3ebe839" Apr 16 14:01:40.496161 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:40.496085 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-mqmmd_49030659-7d98-49ee-844f-41ff4d22d449/console-operator/1.log" Apr 16 14:01:40.496161 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:40.496142 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" event={"ID":"49030659-7d98-49ee-844f-41ff4d22d449","Type":"ContainerStarted","Data":"ef931b4d5715bff231ba4386c47d37fafe05fea1e135081ff1a317b322d12611"} Apr 16 14:01:40.496433 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:40.496409 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" Apr 16 14:01:40.514669 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:40.514619 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" podStartSLOduration=23.042558125 podStartE2EDuration="25.514604377s" podCreationTimestamp="2026-04-16 14:01:15 +0000 UTC" 
firstStartedPulling="2026-04-16 14:01:16.279878703 +0000 UTC m=+99.716044409" lastFinishedPulling="2026-04-16 14:01:18.751924939 +0000 UTC m=+102.188090661" observedRunningTime="2026-04-16 14:01:40.513659259 +0000 UTC m=+123.949824989" watchObservedRunningTime="2026-04-16 14:01:40.514604377 +0000 UTC m=+123.950770140" Apr 16 14:01:40.638039 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:40.638012 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-mqmmd" Apr 16 14:01:45.198083 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.198036 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-n7x2g"] Apr 16 14:01:45.199924 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.199908 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-n7x2g" Apr 16 14:01:45.207164 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.207141 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-fhg6t\"" Apr 16 14:01:45.207164 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.207159 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:01:45.207335 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.207159 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:01:45.214476 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.214449 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-n7x2g"] Apr 16 14:01:45.274192 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.274165 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69db645876-4xdsr"] Apr 16 
14:01:45.312241 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.312207 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-x7nxr"] Apr 16 14:01:45.313950 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.313935 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-54bd994b84-872wz"] Apr 16 14:01:45.314093 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.314074 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-x7nxr" Apr 16 14:01:45.316281 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.316251 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.320879 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.320858 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-wd7ml\"" Apr 16 14:01:45.321347 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.321328 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 14:01:45.331228 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.331210 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 14:01:45.340770 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.340740 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/25e0ee69-7f8f-410c-a4db-ac2b4e7e494b-data-volume\") pod \"insights-runtime-extractor-n7x2g\" (UID: \"25e0ee69-7f8f-410c-a4db-ac2b4e7e494b\") " pod="openshift-insights/insights-runtime-extractor-n7x2g" Apr 16 14:01:45.340898 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.340820 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz7rh\" (UniqueName: \"kubernetes.io/projected/25e0ee69-7f8f-410c-a4db-ac2b4e7e494b-kube-api-access-jz7rh\") pod \"insights-runtime-extractor-n7x2g\" (UID: \"25e0ee69-7f8f-410c-a4db-ac2b4e7e494b\") " pod="openshift-insights/insights-runtime-extractor-n7x2g" Apr 16 14:01:45.340954 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.340914 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/25e0ee69-7f8f-410c-a4db-ac2b4e7e494b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n7x2g\" (UID: \"25e0ee69-7f8f-410c-a4db-ac2b4e7e494b\") " pod="openshift-insights/insights-runtime-extractor-n7x2g" Apr 16 14:01:45.341005 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.340953 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/25e0ee69-7f8f-410c-a4db-ac2b4e7e494b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n7x2g\" (UID: \"25e0ee69-7f8f-410c-a4db-ac2b4e7e494b\") " pod="openshift-insights/insights-runtime-extractor-n7x2g" Apr 16 14:01:45.341005 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.340990 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/25e0ee69-7f8f-410c-a4db-ac2b4e7e494b-crio-socket\") pod \"insights-runtime-extractor-n7x2g\" (UID: \"25e0ee69-7f8f-410c-a4db-ac2b4e7e494b\") " pod="openshift-insights/insights-runtime-extractor-n7x2g" Apr 16 14:01:45.345885 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.345853 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-x7nxr"] Apr 16 14:01:45.358213 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.358182 2580 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-54bd994b84-872wz"] Apr 16 14:01:45.441941 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.441907 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-installation-pull-secrets\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.441941 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.441952 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/25e0ee69-7f8f-410c-a4db-ac2b4e7e494b-data-volume\") pod \"insights-runtime-extractor-n7x2g\" (UID: \"25e0ee69-7f8f-410c-a4db-ac2b4e7e494b\") " pod="openshift-insights/insights-runtime-extractor-n7x2g" Apr 16 14:01:45.442202 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.441982 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jz7rh\" (UniqueName: \"kubernetes.io/projected/25e0ee69-7f8f-410c-a4db-ac2b4e7e494b-kube-api-access-jz7rh\") pod \"insights-runtime-extractor-n7x2g\" (UID: \"25e0ee69-7f8f-410c-a4db-ac2b4e7e494b\") " pod="openshift-insights/insights-runtime-extractor-n7x2g" Apr 16 14:01:45.442202 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.442024 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l5cb\" (UniqueName: \"kubernetes.io/projected/6811b550-4e43-4c2d-a27b-a962e50de90d-kube-api-access-8l5cb\") pod \"downloads-586b57c7b4-x7nxr\" (UID: \"6811b550-4e43-4c2d-a27b-a962e50de90d\") " pod="openshift-console/downloads-586b57c7b4-x7nxr" Apr 16 14:01:45.442202 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.442046 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-registry-tls\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.442202 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.442073 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/25e0ee69-7f8f-410c-a4db-ac2b4e7e494b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n7x2g\" (UID: \"25e0ee69-7f8f-410c-a4db-ac2b4e7e494b\") " pod="openshift-insights/insights-runtime-extractor-n7x2g" Apr 16 14:01:45.442202 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.442091 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/25e0ee69-7f8f-410c-a4db-ac2b4e7e494b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n7x2g\" (UID: \"25e0ee69-7f8f-410c-a4db-ac2b4e7e494b\") " pod="openshift-insights/insights-runtime-extractor-n7x2g" Apr 16 14:01:45.442202 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.442108 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-bound-sa-token\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.442202 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.442188 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/25e0ee69-7f8f-410c-a4db-ac2b4e7e494b-crio-socket\") pod \"insights-runtime-extractor-n7x2g\" (UID: 
\"25e0ee69-7f8f-410c-a4db-ac2b4e7e494b\") " pod="openshift-insights/insights-runtime-extractor-n7x2g" Apr 16 14:01:45.442544 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.442235 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-ca-trust-extracted\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.442544 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.442293 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/25e0ee69-7f8f-410c-a4db-ac2b4e7e494b-crio-socket\") pod \"insights-runtime-extractor-n7x2g\" (UID: \"25e0ee69-7f8f-410c-a4db-ac2b4e7e494b\") " pod="openshift-insights/insights-runtime-extractor-n7x2g" Apr 16 14:01:45.442544 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.442309 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-registry-certificates\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.442544 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.442337 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/25e0ee69-7f8f-410c-a4db-ac2b4e7e494b-data-volume\") pod \"insights-runtime-extractor-n7x2g\" (UID: \"25e0ee69-7f8f-410c-a4db-ac2b4e7e494b\") " pod="openshift-insights/insights-runtime-extractor-n7x2g" Apr 16 14:01:45.442544 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.442357 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-zj2t6\" (UniqueName: \"kubernetes.io/projected/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-kube-api-access-zj2t6\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.442544 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.442395 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-image-registry-private-configuration\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.442544 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.442426 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-trusted-ca\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.442784 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.442614 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/25e0ee69-7f8f-410c-a4db-ac2b4e7e494b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n7x2g\" (UID: \"25e0ee69-7f8f-410c-a4db-ac2b4e7e494b\") " pod="openshift-insights/insights-runtime-extractor-n7x2g" Apr 16 14:01:45.445114 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.445084 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/25e0ee69-7f8f-410c-a4db-ac2b4e7e494b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n7x2g\" (UID: 
\"25e0ee69-7f8f-410c-a4db-ac2b4e7e494b\") " pod="openshift-insights/insights-runtime-extractor-n7x2g" Apr 16 14:01:45.479804 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.479727 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz7rh\" (UniqueName: \"kubernetes.io/projected/25e0ee69-7f8f-410c-a4db-ac2b4e7e494b-kube-api-access-jz7rh\") pod \"insights-runtime-extractor-n7x2g\" (UID: \"25e0ee69-7f8f-410c-a4db-ac2b4e7e494b\") " pod="openshift-insights/insights-runtime-extractor-n7x2g" Apr 16 14:01:45.508228 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.508196 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-n7x2g" Apr 16 14:01:45.543715 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.543680 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zj2t6\" (UniqueName: \"kubernetes.io/projected/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-kube-api-access-zj2t6\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.543715 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.543725 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-image-registry-private-configuration\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.543957 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.543748 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-trusted-ca\") pod \"image-registry-54bd994b84-872wz\" (UID: 
\"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.543957 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.543773 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-installation-pull-secrets\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.543957 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.543830 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l5cb\" (UniqueName: \"kubernetes.io/projected/6811b550-4e43-4c2d-a27b-a962e50de90d-kube-api-access-8l5cb\") pod \"downloads-586b57c7b4-x7nxr\" (UID: \"6811b550-4e43-4c2d-a27b-a962e50de90d\") " pod="openshift-console/downloads-586b57c7b4-x7nxr" Apr 16 14:01:45.543957 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.543856 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-registry-tls\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.544153 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.544002 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-bound-sa-token\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.544153 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.544060 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-ca-trust-extracted\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.544153 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.544091 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-registry-certificates\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.544691 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.544514 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-ca-trust-extracted\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.544846 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.544819 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-trusted-ca\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.544982 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.544964 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-registry-certificates\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 
14:01:45.546528 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.546490 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-installation-pull-secrets\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.546640 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.546615 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-registry-tls\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.546777 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.546762 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-image-registry-private-configuration\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.557997 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.557968 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj2t6\" (UniqueName: \"kubernetes.io/projected/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-kube-api-access-zj2t6\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.560490 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.560384 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l5cb\" (UniqueName: 
\"kubernetes.io/projected/6811b550-4e43-4c2d-a27b-a962e50de90d-kube-api-access-8l5cb\") pod \"downloads-586b57c7b4-x7nxr\" (UID: \"6811b550-4e43-4c2d-a27b-a962e50de90d\") " pod="openshift-console/downloads-586b57c7b4-x7nxr" Apr 16 14:01:45.564030 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.564006 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ba4c449-3beb-4e89-93c7-614b31fdfa9d-bound-sa-token\") pod \"image-registry-54bd994b84-872wz\" (UID: \"5ba4c449-3beb-4e89-93c7-614b31fdfa9d\") " pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.624579 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.624548 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-x7nxr" Apr 16 14:01:45.629311 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.629284 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:45.639160 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.639119 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-n7x2g"] Apr 16 14:01:45.643184 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:01:45.643152 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25e0ee69_7f8f_410c_a4db_ac2b4e7e494b.slice/crio-2a94103c77751d42d0c7c15ec702b99633e082fd985cfff050364dd76a7a675c WatchSource:0}: Error finding container 2a94103c77751d42d0c7c15ec702b99633e082fd985cfff050364dd76a7a675c: Status 404 returned error can't find the container with id 2a94103c77751d42d0c7c15ec702b99633e082fd985cfff050364dd76a7a675c Apr 16 14:01:45.763177 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.763141 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/downloads-586b57c7b4-x7nxr"] Apr 16 14:01:45.766218 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:01:45.766187 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6811b550_4e43_4c2d_a27b_a962e50de90d.slice/crio-b4f8f6882e1d6ff8297e645dd103b414a1758ca0a2c2f042980003641ec83012 WatchSource:0}: Error finding container b4f8f6882e1d6ff8297e645dd103b414a1758ca0a2c2f042980003641ec83012: Status 404 returned error can't find the container with id b4f8f6882e1d6ff8297e645dd103b414a1758ca0a2c2f042980003641ec83012 Apr 16 14:01:45.795707 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:45.795676 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-54bd994b84-872wz"] Apr 16 14:01:45.798951 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:01:45.798916 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ba4c449_3beb_4e89_93c7_614b31fdfa9d.slice/crio-9e0815a4d8d0281f2ee53002acd232d1d8f5bfe9464be146fb200c77a8504431 WatchSource:0}: Error finding container 9e0815a4d8d0281f2ee53002acd232d1d8f5bfe9464be146fb200c77a8504431: Status 404 returned error can't find the container with id 9e0815a4d8d0281f2ee53002acd232d1d8f5bfe9464be146fb200c77a8504431 Apr 16 14:01:46.513333 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:46.513262 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-x7nxr" event={"ID":"6811b550-4e43-4c2d-a27b-a962e50de90d","Type":"ContainerStarted","Data":"b4f8f6882e1d6ff8297e645dd103b414a1758ca0a2c2f042980003641ec83012"} Apr 16 14:01:46.514787 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:46.514755 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n7x2g" 
event={"ID":"25e0ee69-7f8f-410c-a4db-ac2b4e7e494b","Type":"ContainerStarted","Data":"6a5d7648374ebd8154a5e00d9a868b98c3e5dc69bc36c4f64b8573bfdf8ff99a"} Apr 16 14:01:46.514787 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:46.514784 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n7x2g" event={"ID":"25e0ee69-7f8f-410c-a4db-ac2b4e7e494b","Type":"ContainerStarted","Data":"338067b9b6e3bdea7a63f4fdb965509a9ecc26acece408279e8d0c54ae276c6d"} Apr 16 14:01:46.514975 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:46.514794 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n7x2g" event={"ID":"25e0ee69-7f8f-410c-a4db-ac2b4e7e494b","Type":"ContainerStarted","Data":"2a94103c77751d42d0c7c15ec702b99633e082fd985cfff050364dd76a7a675c"} Apr 16 14:01:46.515958 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:46.515937 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54bd994b84-872wz" event={"ID":"5ba4c449-3beb-4e89-93c7-614b31fdfa9d","Type":"ContainerStarted","Data":"c123d82bf46e2b8d5647fdfebee7b1f5975c5deeab3fe94608f5669612e1956d"} Apr 16 14:01:46.516011 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:46.515964 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54bd994b84-872wz" event={"ID":"5ba4c449-3beb-4e89-93c7-614b31fdfa9d","Type":"ContainerStarted","Data":"9e0815a4d8d0281f2ee53002acd232d1d8f5bfe9464be146fb200c77a8504431"} Apr 16 14:01:46.516090 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:46.516067 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:01:46.537229 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:46.537179 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-54bd994b84-872wz" 
podStartSLOduration=1.537162513 podStartE2EDuration="1.537162513s" podCreationTimestamp="2026-04-16 14:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:01:46.535543902 +0000 UTC m=+129.971709632" watchObservedRunningTime="2026-04-16 14:01:46.537162513 +0000 UTC m=+129.973328231" Apr 16 14:01:46.954048 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:46.954009 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs\") pod \"network-metrics-daemon-cptr8\" (UID: \"aef30458-23ff-40ab-ad5a-ae58af58ca82\") " pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 14:01:46.956716 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:46.956684 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aef30458-23ff-40ab-ad5a-ae58af58ca82-metrics-certs\") pod \"network-metrics-daemon-cptr8\" (UID: \"aef30458-23ff-40ab-ad5a-ae58af58ca82\") " pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 14:01:47.172164 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:47.172125 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2qtwk\"" Apr 16 14:01:47.180297 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:47.179848 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cptr8" Apr 16 14:01:47.334862 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:47.334687 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cptr8"] Apr 16 14:01:47.338177 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:01:47.338142 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaef30458_23ff_40ab_ad5a_ae58af58ca82.slice/crio-f1ef342fbc9478dfa167076d6f8b940dc7b0ed68e6d446e3360d382071d2860f WatchSource:0}: Error finding container f1ef342fbc9478dfa167076d6f8b940dc7b0ed68e6d446e3360d382071d2860f: Status 404 returned error can't find the container with id f1ef342fbc9478dfa167076d6f8b940dc7b0ed68e6d446e3360d382071d2860f Apr 16 14:01:47.519649 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:47.519551 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cptr8" event={"ID":"aef30458-23ff-40ab-ad5a-ae58af58ca82","Type":"ContainerStarted","Data":"f1ef342fbc9478dfa167076d6f8b940dc7b0ed68e6d446e3360d382071d2860f"} Apr 16 14:01:48.528786 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:48.528739 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n7x2g" event={"ID":"25e0ee69-7f8f-410c-a4db-ac2b4e7e494b","Type":"ContainerStarted","Data":"7887a418a2dff5ceab806ed418c6c3f3ac8cbfc6933c0c204a446d253785ef59"} Apr 16 14:01:48.548436 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:48.548353 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-n7x2g" podStartSLOduration=1.198542803 podStartE2EDuration="3.548335491s" podCreationTimestamp="2026-04-16 14:01:45 +0000 UTC" firstStartedPulling="2026-04-16 14:01:45.716195665 +0000 UTC m=+129.152361385" lastFinishedPulling="2026-04-16 14:01:48.065988354 +0000 UTC m=+131.502154073" 
observedRunningTime="2026-04-16 14:01:48.54683462 +0000 UTC m=+131.983000378" watchObservedRunningTime="2026-04-16 14:01:48.548335491 +0000 UTC m=+131.984501220" Apr 16 14:01:49.534386 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:49.534346 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cptr8" event={"ID":"aef30458-23ff-40ab-ad5a-ae58af58ca82","Type":"ContainerStarted","Data":"ce0c123af68d29a7d37fe7c876358290c13143a5b924b282381a1512583124ff"} Apr 16 14:01:49.534386 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:49.534390 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cptr8" event={"ID":"aef30458-23ff-40ab-ad5a-ae58af58ca82","Type":"ContainerStarted","Data":"ba678a2a03e03c81d0a806f3ce728bc293f6ea2819fd8d4010ed34dcf6a87964"} Apr 16 14:01:49.552131 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:49.552068 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-cptr8" podStartSLOduration=131.16834328 podStartE2EDuration="2m12.552047347s" podCreationTimestamp="2026-04-16 13:59:37 +0000 UTC" firstStartedPulling="2026-04-16 14:01:47.340833627 +0000 UTC m=+130.776999347" lastFinishedPulling="2026-04-16 14:01:48.724537706 +0000 UTC m=+132.160703414" observedRunningTime="2026-04-16 14:01:49.550087008 +0000 UTC m=+132.986252739" watchObservedRunningTime="2026-04-16 14:01:49.552047347 +0000 UTC m=+132.988213074" Apr 16 14:01:50.544196 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.544162 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-66d9c4554f-v6wpk"] Apr 16 14:01:50.546336 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.546312 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:01:50.548958 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.548935 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 14:01:50.549801 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.549748 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 14:01:50.549801 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.549771 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 14:01:50.549801 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.549778 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 14:01:50.550031 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.549833 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 14:01:50.550031 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.549779 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-vt2nm\"" Apr 16 14:01:50.555936 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.555899 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66d9c4554f-v6wpk"] Apr 16 14:01:50.689609 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.689575 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frhfk\" (UniqueName: \"kubernetes.io/projected/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-kube-api-access-frhfk\") pod \"console-66d9c4554f-v6wpk\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:01:50.689797 ip-10-0-129-3 kubenswrapper[2580]: I0416 
14:01:50.689626 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-console-serving-cert\") pod \"console-66d9c4554f-v6wpk\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:01:50.689797 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.689682 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-service-ca\") pod \"console-66d9c4554f-v6wpk\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:01:50.689797 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.689713 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-console-oauth-config\") pod \"console-66d9c4554f-v6wpk\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:01:50.689797 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.689760 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-oauth-serving-cert\") pod \"console-66d9c4554f-v6wpk\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:01:50.690005 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.689823 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-console-config\") pod \"console-66d9c4554f-v6wpk\" (UID: 
\"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:01:50.791113 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.791077 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-service-ca\") pod \"console-66d9c4554f-v6wpk\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:01:50.791318 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.791124 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-console-oauth-config\") pod \"console-66d9c4554f-v6wpk\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:01:50.791318 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.791168 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-oauth-serving-cert\") pod \"console-66d9c4554f-v6wpk\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:01:50.791318 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.791196 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-console-config\") pod \"console-66d9c4554f-v6wpk\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:01:50.791318 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.791305 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frhfk\" (UniqueName: 
\"kubernetes.io/projected/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-kube-api-access-frhfk\") pod \"console-66d9c4554f-v6wpk\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:01:50.791539 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.791345 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-console-serving-cert\") pod \"console-66d9c4554f-v6wpk\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:01:50.791975 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.791946 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-service-ca\") pod \"console-66d9c4554f-v6wpk\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:01:50.792094 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.791980 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-console-config\") pod \"console-66d9c4554f-v6wpk\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:01:50.792094 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.792004 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-oauth-serving-cert\") pod \"console-66d9c4554f-v6wpk\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:01:50.794135 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.794104 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-console-serving-cert\") pod \"console-66d9c4554f-v6wpk\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:01:50.794254 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.794211 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-console-oauth-config\") pod \"console-66d9c4554f-v6wpk\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:01:50.800895 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.800849 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frhfk\" (UniqueName: \"kubernetes.io/projected/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-kube-api-access-frhfk\") pod \"console-66d9c4554f-v6wpk\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:01:50.856650 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:50.856605 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:01:51.002347 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:51.002307 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66d9c4554f-v6wpk"] Apr 16 14:01:51.006001 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:01:51.005961 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfaddedb_dd86_4549_b03e_a26c6e8ffe8d.slice/crio-bb687a2482ee983918437d355f404945139ef96b7d1d5524cafb93eafc8967ac WatchSource:0}: Error finding container bb687a2482ee983918437d355f404945139ef96b7d1d5524cafb93eafc8967ac: Status 404 returned error can't find the container with id bb687a2482ee983918437d355f404945139ef96b7d1d5524cafb93eafc8967ac Apr 16 14:01:51.541630 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:51.541586 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66d9c4554f-v6wpk" event={"ID":"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d","Type":"ContainerStarted","Data":"bb687a2482ee983918437d355f404945139ef96b7d1d5524cafb93eafc8967ac"} Apr 16 14:01:52.869849 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:52.869801 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff"] Apr 16 14:01:52.888633 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:52.888604 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff" Apr 16 14:01:52.891615 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:52.891575 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff"] Apr 16 14:01:52.891615 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:52.891601 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 14:01:52.892429 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:52.892405 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-vn745\"" Apr 16 14:01:52.892553 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:52.892527 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 14:01:52.892798 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:52.892778 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 14:01:52.896221 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:52.896194 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qkptn"] Apr 16 14:01:52.923311 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:52.923249 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:52.926119 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:52.925828 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 14:01:52.926499 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:52.926477 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 14:01:52.926736 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:52.926717 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 14:01:52.926922 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:52.926907 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-qj4mz\"" Apr 16 14:01:53.016684 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.016637 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwj4c\" (UniqueName: \"kubernetes.io/projected/48c8c7d4-4455-4581-866d-80f4d5c04319-kube-api-access-kwj4c\") pod \"openshift-state-metrics-5669946b84-gc9ff\" (UID: \"48c8c7d4-4455-4581-866d-80f4d5c04319\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff" Apr 16 14:01:53.016874 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.016700 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7127a2d6-0603-465d-a090-ca81178ba98d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.016874 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.016733 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/48c8c7d4-4455-4581-866d-80f4d5c04319-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-gc9ff\" (UID: \"48c8c7d4-4455-4581-866d-80f4d5c04319\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff" Apr 16 14:01:53.016874 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.016761 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7127a2d6-0603-465d-a090-ca81178ba98d-root\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.016874 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.016791 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7127a2d6-0603-465d-a090-ca81178ba98d-node-exporter-wtmp\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.017088 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.016887 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48c8c7d4-4455-4581-866d-80f4d5c04319-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-gc9ff\" (UID: \"48c8c7d4-4455-4581-866d-80f4d5c04319\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff" Apr 16 14:01:53.017088 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.016916 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/7127a2d6-0603-465d-a090-ca81178ba98d-node-exporter-textfile\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.017088 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.017053 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7127a2d6-0603-465d-a090-ca81178ba98d-sys\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.017222 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.017091 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rmwn\" (UniqueName: \"kubernetes.io/projected/7127a2d6-0603-465d-a090-ca81178ba98d-kube-api-access-5rmwn\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.017222 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.017135 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/48c8c7d4-4455-4581-866d-80f4d5c04319-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-gc9ff\" (UID: \"48c8c7d4-4455-4581-866d-80f4d5c04319\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff" Apr 16 14:01:53.017222 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.017194 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7127a2d6-0603-465d-a090-ca81178ba98d-node-exporter-accelerators-collector-config\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " 
pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.017400 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.017289 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7127a2d6-0603-465d-a090-ca81178ba98d-metrics-client-ca\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.017400 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.017344 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7127a2d6-0603-465d-a090-ca81178ba98d-node-exporter-tls\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.118492 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.118455 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48c8c7d4-4455-4581-866d-80f4d5c04319-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-gc9ff\" (UID: \"48c8c7d4-4455-4581-866d-80f4d5c04319\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff" Apr 16 14:01:53.119085 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.118508 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7127a2d6-0603-465d-a090-ca81178ba98d-node-exporter-textfile\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.119085 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.118608 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/7127a2d6-0603-465d-a090-ca81178ba98d-sys\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.119085 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.118637 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rmwn\" (UniqueName: \"kubernetes.io/projected/7127a2d6-0603-465d-a090-ca81178ba98d-kube-api-access-5rmwn\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.119085 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.118678 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/48c8c7d4-4455-4581-866d-80f4d5c04319-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-gc9ff\" (UID: \"48c8c7d4-4455-4581-866d-80f4d5c04319\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff" Apr 16 14:01:53.119085 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.118717 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7127a2d6-0603-465d-a090-ca81178ba98d-node-exporter-accelerators-collector-config\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.119085 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.118756 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7127a2d6-0603-465d-a090-ca81178ba98d-metrics-client-ca\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.119085 ip-10-0-129-3 kubenswrapper[2580]: I0416 
14:01:53.118798 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7127a2d6-0603-465d-a090-ca81178ba98d-node-exporter-tls\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.119085 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.118831 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwj4c\" (UniqueName: \"kubernetes.io/projected/48c8c7d4-4455-4581-866d-80f4d5c04319-kube-api-access-kwj4c\") pod \"openshift-state-metrics-5669946b84-gc9ff\" (UID: \"48c8c7d4-4455-4581-866d-80f4d5c04319\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff" Apr 16 14:01:53.119085 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.118862 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7127a2d6-0603-465d-a090-ca81178ba98d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.119085 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.118891 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/48c8c7d4-4455-4581-866d-80f4d5c04319-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-gc9ff\" (UID: \"48c8c7d4-4455-4581-866d-80f4d5c04319\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff" Apr 16 14:01:53.119085 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.118909 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/7127a2d6-0603-465d-a090-ca81178ba98d-node-exporter-textfile\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.119085 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.118919 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7127a2d6-0603-465d-a090-ca81178ba98d-root\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.119085 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.118970 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7127a2d6-0603-465d-a090-ca81178ba98d-root\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.119085 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.118982 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7127a2d6-0603-465d-a090-ca81178ba98d-node-exporter-wtmp\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.119844 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.119136 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7127a2d6-0603-465d-a090-ca81178ba98d-node-exporter-wtmp\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.119844 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.119685 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7127a2d6-0603-465d-a090-ca81178ba98d-metrics-client-ca\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.119844 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.119692 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48c8c7d4-4455-4581-866d-80f4d5c04319-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-gc9ff\" (UID: \"48c8c7d4-4455-4581-866d-80f4d5c04319\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff" Apr 16 14:01:53.119844 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.119745 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7127a2d6-0603-465d-a090-ca81178ba98d-sys\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.120611 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.120530 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7127a2d6-0603-465d-a090-ca81178ba98d-node-exporter-accelerators-collector-config\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.123330 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.122798 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7127a2d6-0603-465d-a090-ca81178ba98d-node-exporter-tls\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.123330 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.122831 2580 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7127a2d6-0603-465d-a090-ca81178ba98d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.123330 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.123144 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/48c8c7d4-4455-4581-866d-80f4d5c04319-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-gc9ff\" (UID: \"48c8c7d4-4455-4581-866d-80f4d5c04319\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff" Apr 16 14:01:53.124373 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.124350 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/48c8c7d4-4455-4581-866d-80f4d5c04319-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-gc9ff\" (UID: \"48c8c7d4-4455-4581-866d-80f4d5c04319\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff" Apr 16 14:01:53.132935 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.132900 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwj4c\" (UniqueName: \"kubernetes.io/projected/48c8c7d4-4455-4581-866d-80f4d5c04319-kube-api-access-kwj4c\") pod \"openshift-state-metrics-5669946b84-gc9ff\" (UID: \"48c8c7d4-4455-4581-866d-80f4d5c04319\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff" Apr 16 14:01:53.133459 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.133382 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rmwn\" (UniqueName: \"kubernetes.io/projected/7127a2d6-0603-465d-a090-ca81178ba98d-kube-api-access-5rmwn\") pod 
\"node-exporter-qkptn\" (UID: \"7127a2d6-0603-465d-a090-ca81178ba98d\") " pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.202355 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.202318 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff" Apr 16 14:01:53.236909 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.236637 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qkptn" Apr 16 14:01:53.258000 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:01:53.257957 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7127a2d6_0603_465d_a090_ca81178ba98d.slice/crio-919b354a72ac7bb8c79269543c95e4bb8cdcfd178cdfd26e8a10209f68731dbb WatchSource:0}: Error finding container 919b354a72ac7bb8c79269543c95e4bb8cdcfd178cdfd26e8a10209f68731dbb: Status 404 returned error can't find the container with id 919b354a72ac7bb8c79269543c95e4bb8cdcfd178cdfd26e8a10209f68731dbb Apr 16 14:01:53.367118 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.367074 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff"] Apr 16 14:01:53.375985 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:01:53.375912 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48c8c7d4_4455_4581_866d_80f4d5c04319.slice/crio-c69311bcfbb73a096cf5a74a5b4267bb88044fce4ee0b0f73531c3ee14f08c5b WatchSource:0}: Error finding container c69311bcfbb73a096cf5a74a5b4267bb88044fce4ee0b0f73531c3ee14f08c5b: Status 404 returned error can't find the container with id c69311bcfbb73a096cf5a74a5b4267bb88044fce4ee0b0f73531c3ee14f08c5b Apr 16 14:01:53.553055 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.553019 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-qkptn" event={"ID":"7127a2d6-0603-465d-a090-ca81178ba98d","Type":"ContainerStarted","Data":"919b354a72ac7bb8c79269543c95e4bb8cdcfd178cdfd26e8a10209f68731dbb"} Apr 16 14:01:53.555316 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.555249 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff" event={"ID":"48c8c7d4-4455-4581-866d-80f4d5c04319","Type":"ContainerStarted","Data":"01675344d07b63b50d4700b146c79ff0de189a1a5fcf836d66f2a696bfa8e86d"} Apr 16 14:01:53.555449 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:53.555318 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff" event={"ID":"48c8c7d4-4455-4581-866d-80f4d5c04319","Type":"ContainerStarted","Data":"c69311bcfbb73a096cf5a74a5b4267bb88044fce4ee0b0f73531c3ee14f08c5b"} Apr 16 14:01:54.562240 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.562198 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff" event={"ID":"48c8c7d4-4455-4581-866d-80f4d5c04319","Type":"ContainerStarted","Data":"bb5bc42199a86aa2b8e2d38aebd1f1e11b31f51fb71a71a26ede0538748468d1"} Apr 16 14:01:54.747443 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.746307 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5dcd489fbf-rsjq7"] Apr 16 14:01:54.751103 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.751074 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:54.762779 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.762512 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dcd489fbf-rsjq7"] Apr 16 14:01:54.764388 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.764364 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 14:01:54.835213 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.835129 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-console-config\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:54.835213 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.835173 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-oauth-serving-cert\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:54.835475 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.835312 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-console-serving-cert\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:54.835475 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.835365 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-console-oauth-config\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:54.835475 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.835439 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tshrt\" (UniqueName: \"kubernetes.io/projected/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-kube-api-access-tshrt\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:54.835657 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.835522 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-trusted-ca-bundle\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:54.835657 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.835550 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-service-ca\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:54.936797 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.936755 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-console-oauth-config\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:54.936970 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.936829 
2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tshrt\" (UniqueName: \"kubernetes.io/projected/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-kube-api-access-tshrt\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:54.936970 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.936894 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-trusted-ca-bundle\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:54.936970 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.936921 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-service-ca\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:54.936970 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.936960 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-console-config\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:54.937138 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.936983 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-oauth-serving-cert\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:54.937138 
ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.937045 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-console-serving-cert\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:54.940486 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.938379 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-service-ca\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:54.940486 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.939006 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-console-config\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:54.940486 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.940339 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-console-oauth-config\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:54.940486 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.940425 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-console-serving-cert\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 
16 14:01:54.940964 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.940939 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-trusted-ca-bundle\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:54.941573 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.941531 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-oauth-serving-cert\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:54.946717 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:54.946658 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tshrt\" (UniqueName: \"kubernetes.io/projected/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-kube-api-access-tshrt\") pod \"console-5dcd489fbf-rsjq7\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:01:55.070997 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:55.070959 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5dcd489fbf-rsjq7"
Apr 16 14:01:55.280950 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:55.280840 2580 patch_prober.go:28] interesting pod/image-registry-69db645876-4xdsr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 14:01:55.281127 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:55.280925 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-69db645876-4xdsr" podUID="94f41185-ca08-41f9-bc9f-f22802de6d09" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:01:55.480002 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:55.479951 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dcd489fbf-rsjq7"]
Apr 16 14:01:55.484296 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:01:55.484223 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae1941d9_4bee_4735_a3df_c001d4b0a8e5.slice/crio-a159615bf57f8e77898408fc014b74673b3b3d36fc024c7990a7f26152db9dd9 WatchSource:0}: Error finding container a159615bf57f8e77898408fc014b74673b3b3d36fc024c7990a7f26152db9dd9: Status 404 returned error can't find the container with id a159615bf57f8e77898408fc014b74673b3b3d36fc024c7990a7f26152db9dd9
Apr 16 14:01:55.568881 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:55.568843 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66d9c4554f-v6wpk" event={"ID":"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d","Type":"ContainerStarted","Data":"86bdf9170ab8e6006477d3d43670ba73ac222bae7f17fe6fab3f337220ef597d"}
Apr 16 14:01:55.573445 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:55.573411 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qkptn" event={"ID":"7127a2d6-0603-465d-a090-ca81178ba98d","Type":"ContainerStarted","Data":"4b68f9e9667ffb4f733a997aa98a9ada2792714298e4fa50e77f1a7baa5ac11d"}
Apr 16 14:01:55.577749 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:55.577001 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dcd489fbf-rsjq7" event={"ID":"ae1941d9-4bee-4735-a3df-c001d4b0a8e5","Type":"ContainerStarted","Data":"a159615bf57f8e77898408fc014b74673b3b3d36fc024c7990a7f26152db9dd9"}
Apr 16 14:01:55.588907 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:55.588856 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66d9c4554f-v6wpk" podStartSLOduration=1.2709440889999999 podStartE2EDuration="5.588839333s" podCreationTimestamp="2026-04-16 14:01:50 +0000 UTC" firstStartedPulling="2026-04-16 14:01:51.00829904 +0000 UTC m=+134.444464745" lastFinishedPulling="2026-04-16 14:01:55.326194279 +0000 UTC m=+138.762359989" observedRunningTime="2026-04-16 14:01:55.587009998 +0000 UTC m=+139.023175727" watchObservedRunningTime="2026-04-16 14:01:55.588839333 +0000 UTC m=+139.025005061"
Apr 16 14:01:55.891142 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:55.891103 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"]
Apr 16 14:01:55.896064 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:55.896041 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:55.899073 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:55.898752 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 16 14:01:55.899073 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:55.898772 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-sc2kw\""
Apr 16 14:01:55.899073 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:55.898900 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 16 14:01:55.899073 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:55.898922 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 16 14:01:55.899452 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:55.899196 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 16 14:01:55.899623 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:55.899257 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-97lil5hau9sop\""
Apr 16 14:01:55.899735 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:55.899641 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 16 14:01:55.905286 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:55.905245 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"]
Apr 16 14:01:56.048722 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.048627 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/86145381-870a-416e-a665-1ef1232225b4-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.048722 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.048684 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/86145381-870a-416e-a665-1ef1232225b4-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.048722 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.048719 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/86145381-870a-416e-a665-1ef1232225b4-secret-grpc-tls\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.048941 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.048755 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/86145381-870a-416e-a665-1ef1232225b4-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.048941 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.048792 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86145381-870a-416e-a665-1ef1232225b4-metrics-client-ca\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.048941 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.048931 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqrn5\" (UniqueName: \"kubernetes.io/projected/86145381-870a-416e-a665-1ef1232225b4-kube-api-access-fqrn5\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.049034 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.048961 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/86145381-870a-416e-a665-1ef1232225b4-secret-thanos-querier-tls\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.049034 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.048993 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/86145381-870a-416e-a665-1ef1232225b4-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.149901 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.149810 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/86145381-870a-416e-a665-1ef1232225b4-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.149901 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.149860 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/86145381-870a-416e-a665-1ef1232225b4-secret-grpc-tls\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.150123 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.149899 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/86145381-870a-416e-a665-1ef1232225b4-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.150123 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.149939 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86145381-870a-416e-a665-1ef1232225b4-metrics-client-ca\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.150123 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.149966 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqrn5\" (UniqueName: \"kubernetes.io/projected/86145381-870a-416e-a665-1ef1232225b4-kube-api-access-fqrn5\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.150123 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.149992 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/86145381-870a-416e-a665-1ef1232225b4-secret-thanos-querier-tls\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.150123 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.150022 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/86145381-870a-416e-a665-1ef1232225b4-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.150424 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.150140 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/86145381-870a-416e-a665-1ef1232225b4-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.151198 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.151163 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86145381-870a-416e-a665-1ef1232225b4-metrics-client-ca\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.153055 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.153005 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/86145381-870a-416e-a665-1ef1232225b4-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.153477 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.153427 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/86145381-870a-416e-a665-1ef1232225b4-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.153683 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.153624 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/86145381-870a-416e-a665-1ef1232225b4-secret-thanos-querier-tls\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.153969 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.153940 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/86145381-870a-416e-a665-1ef1232225b4-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.154080 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.154061 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/86145381-870a-416e-a665-1ef1232225b4-secret-grpc-tls\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.154401 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.154385 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/86145381-870a-416e-a665-1ef1232225b4-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.158411 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.158391 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqrn5\" (UniqueName: \"kubernetes.io/projected/86145381-870a-416e-a665-1ef1232225b4-kube-api-access-fqrn5\") pod \"thanos-querier-8554b6c6b6-hkhrj\" (UID: \"86145381-870a-416e-a665-1ef1232225b4\") " pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.210031 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.209991 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"
Apr 16 14:01:56.385724 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.385693 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj"]
Apr 16 14:01:56.389126 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:01:56.389086 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86145381_870a_416e_a665_1ef1232225b4.slice/crio-317e07010c02d3371cf672838b3b229692681fd773241d0b07e2b05ad24da7aa WatchSource:0}: Error finding container 317e07010c02d3371cf672838b3b229692681fd773241d0b07e2b05ad24da7aa: Status 404 returned error can't find the container with id 317e07010c02d3371cf672838b3b229692681fd773241d0b07e2b05ad24da7aa
Apr 16 14:01:56.582298 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.582186 2580 generic.go:358] "Generic (PLEG): container finished" podID="7127a2d6-0603-465d-a090-ca81178ba98d" containerID="4b68f9e9667ffb4f733a997aa98a9ada2792714298e4fa50e77f1a7baa5ac11d" exitCode=0
Apr 16 14:01:56.582722 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.582307 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qkptn" event={"ID":"7127a2d6-0603-465d-a090-ca81178ba98d","Type":"ContainerDied","Data":"4b68f9e9667ffb4f733a997aa98a9ada2792714298e4fa50e77f1a7baa5ac11d"}
Apr 16 14:01:56.583886 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.583860 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dcd489fbf-rsjq7" event={"ID":"ae1941d9-4bee-4735-a3df-c001d4b0a8e5","Type":"ContainerStarted","Data":"9de052df27c461b800b41aa8d9e81095c23151f649c57fa674e383a423b6720f"}
Apr 16 14:01:56.587256 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.587226 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff" event={"ID":"48c8c7d4-4455-4581-866d-80f4d5c04319","Type":"ContainerStarted","Data":"02d9a86d2e56d4cb39b58cdd401844115a8c95526b59b82b202205602e774621"}
Apr 16 14:01:56.588174 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.588151 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj" event={"ID":"86145381-870a-416e-a665-1ef1232225b4","Type":"ContainerStarted","Data":"317e07010c02d3371cf672838b3b229692681fd773241d0b07e2b05ad24da7aa"}
Apr 16 14:01:56.621507 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.621441 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5dcd489fbf-rsjq7" podStartSLOduration=2.621421696 podStartE2EDuration="2.621421696s" podCreationTimestamp="2026-04-16 14:01:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:01:56.620298945 +0000 UTC m=+140.056464675" watchObservedRunningTime="2026-04-16 14:01:56.621421696 +0000 UTC m=+140.057587428"
Apr 16 14:01:56.641018 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:01:56.640951 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-gc9ff" podStartSLOduration=2.271827025 podStartE2EDuration="4.640930869s" podCreationTimestamp="2026-04-16 14:01:52 +0000 UTC" firstStartedPulling="2026-04-16 14:01:53.582786771 +0000 UTC m=+137.018952484" lastFinishedPulling="2026-04-16 14:01:55.951890619 +0000 UTC m=+139.388056328" observedRunningTime="2026-04-16 14:01:56.639488476 +0000 UTC m=+140.075654206" watchObservedRunningTime="2026-04-16 14:01:56.640930869 +0000 UTC m=+140.077096645"
Apr 16 14:02:00.857082 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:00.857043 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-66d9c4554f-v6wpk"
Apr 16 14:02:00.857574 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:00.857098 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66d9c4554f-v6wpk"
Apr 16 14:02:00.858689 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:00.858666 2580 patch_prober.go:28] interesting pod/console-66d9c4554f-v6wpk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.17:8443/health\": dial tcp 10.134.0.17:8443: connect: connection refused" start-of-body=
Apr 16 14:02:00.858820 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:00.858743 2580 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-66d9c4554f-v6wpk" podUID="cfaddedb-dd86-4549-b03e-a26c6e8ffe8d" containerName="console" probeResult="failure" output="Get \"https://10.134.0.17:8443/health\": dial tcp 10.134.0.17:8443: connect: connection refused"
Apr 16 14:02:03.144749 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.144714 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66d9c4554f-v6wpk"]
Apr 16 14:02:03.174791 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.174757 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b7848b7dd-drn4q"]
Apr 16 14:02:03.179432 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.179406 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.187573 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.187547 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b7848b7dd-drn4q"]
Apr 16 14:02:03.222509 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.222459 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ttwh\" (UniqueName: \"kubernetes.io/projected/0b631acf-ab3d-47cf-827e-3bf2e6100a18-kube-api-access-2ttwh\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.222693 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.222529 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-console-config\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.222693 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.222558 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b631acf-ab3d-47cf-827e-3bf2e6100a18-console-oauth-config\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.222693 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.222596 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-service-ca\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.222849 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.222704 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b631acf-ab3d-47cf-827e-3bf2e6100a18-console-serving-cert\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.222849 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.222744 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-trusted-ca-bundle\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.222849 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.222774 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-oauth-serving-cert\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.324013 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.323975 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-trusted-ca-bundle\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.324218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.324032 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-oauth-serving-cert\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.324218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.324135 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ttwh\" (UniqueName: \"kubernetes.io/projected/0b631acf-ab3d-47cf-827e-3bf2e6100a18-kube-api-access-2ttwh\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.324218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.324189 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-console-config\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.324218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.324215 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b631acf-ab3d-47cf-827e-3bf2e6100a18-console-oauth-config\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.324465 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.324255 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-service-ca\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.324465 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.324329 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b631acf-ab3d-47cf-827e-3bf2e6100a18-console-serving-cert\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.325122 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.325092 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-trusted-ca-bundle\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.326125 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.325502 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-console-config\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.326125 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.326072 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-service-ca\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.326474 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.326444 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-oauth-serving-cert\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.327542 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.327515 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b631acf-ab3d-47cf-827e-3bf2e6100a18-console-serving-cert\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.328017 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.327993 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b631acf-ab3d-47cf-827e-3bf2e6100a18-console-oauth-config\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.334172 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.334103 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ttwh\" (UniqueName: \"kubernetes.io/projected/0b631acf-ab3d-47cf-827e-3bf2e6100a18-kube-api-access-2ttwh\") pod \"console-6b7848b7dd-drn4q\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.492706 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.492674 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b7848b7dd-drn4q"
Apr 16 14:02:03.616558 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.616517 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qkptn" event={"ID":"7127a2d6-0603-465d-a090-ca81178ba98d","Type":"ContainerStarted","Data":"a0351a6b2e09f2e94311b160185cc7044e8fa0b417d3235bb49cd6866e75ee86"}
Apr 16 14:02:03.616558 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.616566 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qkptn" event={"ID":"7127a2d6-0603-465d-a090-ca81178ba98d","Type":"ContainerStarted","Data":"396ff647555eaf0dfc5643e7257e39bd78d156c1b74b40abcc794b4167f42dba"}
Apr 16 14:02:03.618257 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.618233 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-x7nxr" event={"ID":"6811b550-4e43-4c2d-a27b-a962e50de90d","Type":"ContainerStarted","Data":"dfc045d9793ab78db99ebccc3c2311bd5579eb3f3ffd22437e796ccca96faada"}
Apr 16 14:02:03.618878 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.618854 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-x7nxr"
Apr 16 14:02:03.620613 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.620585 2580 patch_prober.go:28] interesting pod/downloads-586b57c7b4-x7nxr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.134.0.15:8080/\": dial tcp 10.134.0.15:8080: connect: connection refused" start-of-body=
Apr 16 14:02:03.620701 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.620638 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-586b57c7b4-x7nxr" podUID="6811b550-4e43-4c2d-a27b-a962e50de90d" containerName="download-server" probeResult="failure" output="Get \"http://10.134.0.15:8080/\": dial tcp 10.134.0.15:8080: connect: connection refused"
Apr 16 14:02:03.630471 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.630311 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b7848b7dd-drn4q"]
Apr 16 14:02:03.633475 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:02:03.633447 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b631acf_ab3d_47cf_827e_3bf2e6100a18.slice/crio-77d2b8987acfca392a86a0f0b80e50e0e76e2b0b61d35ce76f431c573f1a6f6c WatchSource:0}: Error finding container 77d2b8987acfca392a86a0f0b80e50e0e76e2b0b61d35ce76f431c573f1a6f6c: Status 404 returned error can't find the container with id 77d2b8987acfca392a86a0f0b80e50e0e76e2b0b61d35ce76f431c573f1a6f6c
Apr 16 14:02:03.660834 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.660714 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-x7nxr" podStartSLOduration=0.979801802 podStartE2EDuration="18.660697863s" podCreationTimestamp="2026-04-16 14:01:45 +0000 UTC" firstStartedPulling="2026-04-16 14:01:45.768292853 +0000 UTC m=+129.204458576" lastFinishedPulling="2026-04-16 14:02:03.449188929 +0000 UTC m=+146.885354637" observedRunningTime="2026-04-16 14:02:03.658381966 +0000 UTC m=+147.094547711" watchObservedRunningTime="2026-04-16 14:02:03.660697863 +0000 UTC m=+147.096863591"
Apr 16 14:02:03.660979 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:03.660834 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qkptn" podStartSLOduration=9.592669651 podStartE2EDuration="11.660825468s" podCreationTimestamp="2026-04-16 14:01:52 +0000 UTC" firstStartedPulling="2026-04-16 14:01:53.260209776 +0000 UTC m=+136.696375487" lastFinishedPulling="2026-04-16 14:01:55.328365584 +0000 UTC m=+138.764531304" observedRunningTime="2026-04-16 14:02:03.636118552 +0000 UTC m=+147.072284282" watchObservedRunningTime="2026-04-16 14:02:03.660825468 +0000 UTC m=+147.096991197"
Apr 16 14:02:04.625428 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:04.624966 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b7848b7dd-drn4q" event={"ID":"0b631acf-ab3d-47cf-827e-3bf2e6100a18","Type":"ContainerStarted","Data":"920634e0a5f902d77d5e239fcc64f5323585f4cbdfea92d216ddc5e71edfbeef"}
Apr 16 14:02:04.625428 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:04.625063 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b7848b7dd-drn4q" event={"ID":"0b631acf-ab3d-47cf-827e-3bf2e6100a18","Type":"ContainerStarted","Data":"77d2b8987acfca392a86a0f0b80e50e0e76e2b0b61d35ce76f431c573f1a6f6c"}
Apr 16 14:02:04.643868 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:04.643832 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-x7nxr"
Apr 16 14:02:04.647342 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:04.646225 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b7848b7dd-drn4q" podStartSLOduration=1.6462068699999999 podStartE2EDuration="1.64620687s" podCreationTimestamp="2026-04-16 14:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:02:04.64501776 +0000 UTC m=+148.081183486" watchObservedRunningTime="2026-04-16 14:02:04.64620687 +0000 UTC m=+148.082372600"
Apr 16 14:02:05.071941 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:05.071855 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5dcd489fbf-rsjq7"
Apr 16 14:02:05.071941 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:05.071916 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5dcd489fbf-rsjq7"
Apr 16 14:02:05.073294 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:05.073236 2580 patch_prober.go:28] interesting pod/console-5dcd489fbf-rsjq7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.19:8443/health\": dial tcp 10.134.0.19:8443: connect: connection refused" start-of-body=
Apr 16 14:02:05.073439 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:05.073323 2580 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-5dcd489fbf-rsjq7" podUID="ae1941d9-4bee-4735-a3df-c001d4b0a8e5" containerName="console" probeResult="failure" output="Get \"https://10.134.0.19:8443/health\": dial tcp 10.134.0.19:8443: connect: connection refused"
Apr 16 14:02:05.280840 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:05.279800 2580 patch_prober.go:28] interesting pod/image-registry-69db645876-4xdsr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 14:02:05.280840 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:05.279867 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-69db645876-4xdsr" podUID="94f41185-ca08-41f9-bc9f-f22802de6d09" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:02:05.630215 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:05.630176 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj" event={"ID":"86145381-870a-416e-a665-1ef1232225b4","Type":"ContainerStarted","Data":"9a471f831263881ee7ddcd94859166579235e15e7cbe45e1138b23901194ee95"}
Apr 16 14:02:05.630215 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:05.630222 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj" event={"ID":"86145381-870a-416e-a665-1ef1232225b4","Type":"ContainerStarted","Data":"7514d30cac82cde343193b0501f66f5d4bf974d1ab665575699e59efa39ce93d"}
Apr 16 14:02:05.630712 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:05.630235 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj" event={"ID":"86145381-870a-416e-a665-1ef1232225b4","Type":"ContainerStarted","Data":"f0e1da2f0a9dd2a427ece135a541885760b1935bc7d56451986975a6493d02ec"}
Apr 16 14:02:05.633772 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:05.633698 2580 patch_prober.go:28] interesting pod/image-registry-54bd994b84-872wz container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 14:02:05.633919 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:05.633796 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-54bd994b84-872wz" podUID="5ba4c449-3beb-4e89-93c7-614b31fdfa9d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:02:07.525004 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:07.524965 2580 patch_prober.go:28] interesting pod/image-registry-54bd994b84-872wz container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 14:02:07.525531 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:07.525050 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-54bd994b84-872wz" podUID="5ba4c449-3beb-4e89-93c7-614b31fdfa9d"
containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:02:08.642746 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:08.642707 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj" event={"ID":"86145381-870a-416e-a665-1ef1232225b4","Type":"ContainerStarted","Data":"3baf198e784c18660f9dae9424be514ce42443fd6d899b3ba31fa48673be2a0e"} Apr 16 14:02:08.642746 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:08.642747 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj" event={"ID":"86145381-870a-416e-a665-1ef1232225b4","Type":"ContainerStarted","Data":"bca94f2c22e40c6c57244488dcca8b7abaa661cc72c8fc2b8a72e8fa85e6ea2b"} Apr 16 14:02:08.643312 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:08.642761 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj" event={"ID":"86145381-870a-416e-a665-1ef1232225b4","Type":"ContainerStarted","Data":"4ae388e4396aad62901b20895707fc0cd6abc9bc5f170a10fcba84c29873cc3f"} Apr 16 14:02:08.643312 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:08.642872 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj" Apr 16 14:02:08.667797 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:08.667737 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj" podStartSLOduration=2.482412679 podStartE2EDuration="13.667721007s" podCreationTimestamp="2026-04-16 14:01:55 +0000 UTC" firstStartedPulling="2026-04-16 14:01:56.391371466 +0000 UTC m=+139.827537179" lastFinishedPulling="2026-04-16 14:02:07.576679787 +0000 UTC m=+151.012845507" observedRunningTime="2026-04-16 14:02:08.666637776 +0000 UTC m=+152.102803504" watchObservedRunningTime="2026-04-16 14:02:08.667721007 +0000 UTC 
m=+152.103886742" Apr 16 14:02:10.293033 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.292980 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-69db645876-4xdsr" podUID="94f41185-ca08-41f9-bc9f-f22802de6d09" containerName="registry" containerID="cri-o://be78499cb63ab9406203668e614e801d38f8a80bc4bd43ba5ba5747d3589b90e" gracePeriod=30 Apr 16 14:02:10.569779 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.569756 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:02:10.650353 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.650319 2580 generic.go:358] "Generic (PLEG): container finished" podID="94f41185-ca08-41f9-bc9f-f22802de6d09" containerID="be78499cb63ab9406203668e614e801d38f8a80bc4bd43ba5ba5747d3589b90e" exitCode=0 Apr 16 14:02:10.650536 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.650397 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-69db645876-4xdsr" Apr 16 14:02:10.650536 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.650404 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69db645876-4xdsr" event={"ID":"94f41185-ca08-41f9-bc9f-f22802de6d09","Type":"ContainerDied","Data":"be78499cb63ab9406203668e614e801d38f8a80bc4bd43ba5ba5747d3589b90e"} Apr 16 14:02:10.650536 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.650440 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69db645876-4xdsr" event={"ID":"94f41185-ca08-41f9-bc9f-f22802de6d09","Type":"ContainerDied","Data":"4c9c4f0b0c785341946ba453c7112c4173d02a0a8392416a54ed5a9fc6bfbe0d"} Apr 16 14:02:10.650536 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.650460 2580 scope.go:117] "RemoveContainer" containerID="be78499cb63ab9406203668e614e801d38f8a80bc4bd43ba5ba5747d3589b90e" Apr 16 14:02:10.664282 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.664240 2580 scope.go:117] "RemoveContainer" containerID="be78499cb63ab9406203668e614e801d38f8a80bc4bd43ba5ba5747d3589b90e" Apr 16 14:02:10.664683 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:02:10.664657 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be78499cb63ab9406203668e614e801d38f8a80bc4bd43ba5ba5747d3589b90e\": container with ID starting with be78499cb63ab9406203668e614e801d38f8a80bc4bd43ba5ba5747d3589b90e not found: ID does not exist" containerID="be78499cb63ab9406203668e614e801d38f8a80bc4bd43ba5ba5747d3589b90e" Apr 16 14:02:10.664792 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.664694 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be78499cb63ab9406203668e614e801d38f8a80bc4bd43ba5ba5747d3589b90e"} err="failed to get container status 
\"be78499cb63ab9406203668e614e801d38f8a80bc4bd43ba5ba5747d3589b90e\": rpc error: code = NotFound desc = could not find container \"be78499cb63ab9406203668e614e801d38f8a80bc4bd43ba5ba5747d3589b90e\": container with ID starting with be78499cb63ab9406203668e614e801d38f8a80bc4bd43ba5ba5747d3589b90e not found: ID does not exist" Apr 16 14:02:10.700209 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.700175 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrsv7\" (UniqueName: \"kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-kube-api-access-hrsv7\") pod \"94f41185-ca08-41f9-bc9f-f22802de6d09\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " Apr 16 14:02:10.700389 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.700228 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls\") pod \"94f41185-ca08-41f9-bc9f-f22802de6d09\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " Apr 16 14:02:10.700389 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.700308 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-certificates\") pod \"94f41185-ca08-41f9-bc9f-f22802de6d09\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " Apr 16 14:02:10.700389 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.700343 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/94f41185-ca08-41f9-bc9f-f22802de6d09-installation-pull-secrets\") pod \"94f41185-ca08-41f9-bc9f-f22802de6d09\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " Apr 16 14:02:10.700389 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.700364 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/94f41185-ca08-41f9-bc9f-f22802de6d09-ca-trust-extracted\") pod \"94f41185-ca08-41f9-bc9f-f22802de6d09\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " Apr 16 14:02:10.700605 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.700390 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94f41185-ca08-41f9-bc9f-f22802de6d09-trusted-ca\") pod \"94f41185-ca08-41f9-bc9f-f22802de6d09\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " Apr 16 14:02:10.700605 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.700418 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/94f41185-ca08-41f9-bc9f-f22802de6d09-image-registry-private-configuration\") pod \"94f41185-ca08-41f9-bc9f-f22802de6d09\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " Apr 16 14:02:10.700605 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.700458 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-bound-sa-token\") pod \"94f41185-ca08-41f9-bc9f-f22802de6d09\" (UID: \"94f41185-ca08-41f9-bc9f-f22802de6d09\") " Apr 16 14:02:10.700736 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.700708 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "94f41185-ca08-41f9-bc9f-f22802de6d09" (UID: "94f41185-ca08-41f9-bc9f-f22802de6d09"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:02:10.700833 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.700812 2580 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-certificates\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:10.701172 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.701136 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94f41185-ca08-41f9-bc9f-f22802de6d09-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "94f41185-ca08-41f9-bc9f-f22802de6d09" (UID: "94f41185-ca08-41f9-bc9f-f22802de6d09"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:02:10.702959 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.702931 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "94f41185-ca08-41f9-bc9f-f22802de6d09" (UID: "94f41185-ca08-41f9-bc9f-f22802de6d09"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:02:10.703091 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.703072 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f41185-ca08-41f9-bc9f-f22802de6d09-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "94f41185-ca08-41f9-bc9f-f22802de6d09" (UID: "94f41185-ca08-41f9-bc9f-f22802de6d09"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:02:10.703179 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.703116 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-kube-api-access-hrsv7" (OuterVolumeSpecName: "kube-api-access-hrsv7") pod "94f41185-ca08-41f9-bc9f-f22802de6d09" (UID: "94f41185-ca08-41f9-bc9f-f22802de6d09"). InnerVolumeSpecName "kube-api-access-hrsv7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:02:10.703345 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.703323 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f41185-ca08-41f9-bc9f-f22802de6d09-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "94f41185-ca08-41f9-bc9f-f22802de6d09" (UID: "94f41185-ca08-41f9-bc9f-f22802de6d09"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:02:10.703476 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.703463 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "94f41185-ca08-41f9-bc9f-f22802de6d09" (UID: "94f41185-ca08-41f9-bc9f-f22802de6d09"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:02:10.710439 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.710405 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94f41185-ca08-41f9-bc9f-f22802de6d09-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "94f41185-ca08-41f9-bc9f-f22802de6d09" (UID: "94f41185-ca08-41f9-bc9f-f22802de6d09"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:02:10.802307 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.802191 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hrsv7\" (UniqueName: \"kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-kube-api-access-hrsv7\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:10.802307 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.802238 2580 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-registry-tls\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:10.802307 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.802249 2580 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/94f41185-ca08-41f9-bc9f-f22802de6d09-installation-pull-secrets\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:10.802307 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.802257 2580 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/94f41185-ca08-41f9-bc9f-f22802de6d09-ca-trust-extracted\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:10.802307 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.802292 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94f41185-ca08-41f9-bc9f-f22802de6d09-trusted-ca\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:10.802307 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.802302 2580 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/94f41185-ca08-41f9-bc9f-f22802de6d09-image-registry-private-configuration\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:10.802307 
ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.802310 2580 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94f41185-ca08-41f9-bc9f-f22802de6d09-bound-sa-token\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:10.978853 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.978811 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69db645876-4xdsr"] Apr 16 14:02:10.982610 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:10.982583 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-69db645876-4xdsr"] Apr 16 14:02:11.153136 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:11.153102 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94f41185-ca08-41f9-bc9f-f22802de6d09" path="/var/lib/kubelet/pods/94f41185-ca08-41f9-bc9f-f22802de6d09/volumes" Apr 16 14:02:13.493181 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:13.493145 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6b7848b7dd-drn4q" Apr 16 14:02:13.493794 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:13.493203 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6b7848b7dd-drn4q" Apr 16 14:02:13.494836 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:13.494806 2580 patch_prober.go:28] interesting pod/console-6b7848b7dd-drn4q container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.21:8443/health\": dial tcp 10.134.0.21:8443: connect: connection refused" start-of-body= Apr 16 14:02:13.494938 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:13.494865 2580 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-6b7848b7dd-drn4q" podUID="0b631acf-ab3d-47cf-827e-3bf2e6100a18" containerName="console" probeResult="failure" output="Get 
\"https://10.134.0.21:8443/health\": dial tcp 10.134.0.21:8443: connect: connection refused" Apr 16 14:02:13.511513 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:02:13.511470 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-s7ltv" podUID="6da34735-0aa6-4efc-88b2-81738c442f3f" Apr 16 14:02:13.521621 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:02:13.521579 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-bnz8h" podUID="4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8" Apr 16 14:02:13.661970 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:13.661941 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s7ltv" Apr 16 14:02:14.653385 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:14.653355 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-8554b6c6b6-hkhrj" Apr 16 14:02:15.071940 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:15.071904 2580 patch_prober.go:28] interesting pod/console-5dcd489fbf-rsjq7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.19:8443/health\": dial tcp 10.134.0.19:8443: connect: connection refused" start-of-body= Apr 16 14:02:15.072104 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:15.071959 2580 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-5dcd489fbf-rsjq7" podUID="ae1941d9-4bee-4735-a3df-c001d4b0a8e5" containerName="console" probeResult="failure" output="Get \"https://10.134.0.19:8443/health\": dial tcp 10.134.0.19:8443: connect: connection refused" Apr 16 14:02:15.633116 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:15.633082 2580 
patch_prober.go:28] interesting pod/image-registry-54bd994b84-872wz container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:02:15.633336 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:15.633146 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-54bd994b84-872wz" podUID="5ba4c449-3beb-4e89-93c7-614b31fdfa9d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:02:17.523367 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:17.523332 2580 patch_prober.go:28] interesting pod/image-registry-54bd994b84-872wz container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:02:17.523745 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:17.523384 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-54bd994b84-872wz" podUID="5ba4c449-3beb-4e89-93c7-614b31fdfa9d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:02:18.472348 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:18.472313 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert\") pod \"ingress-canary-bnz8h\" (UID: \"4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8\") " pod="openshift-ingress-canary/ingress-canary-bnz8h" Apr 16 14:02:18.472348 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:18.472356 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls\") pod \"dns-default-s7ltv\" (UID: \"6da34735-0aa6-4efc-88b2-81738c442f3f\") " pod="openshift-dns/dns-default-s7ltv" Apr 16 14:02:18.474870 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:18.474847 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6da34735-0aa6-4efc-88b2-81738c442f3f-metrics-tls\") pod \"dns-default-s7ltv\" (UID: \"6da34735-0aa6-4efc-88b2-81738c442f3f\") " pod="openshift-dns/dns-default-s7ltv" Apr 16 14:02:18.487206 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:18.487181 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8-cert\") pod \"ingress-canary-bnz8h\" (UID: \"4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8\") " pod="openshift-ingress-canary/ingress-canary-bnz8h" Apr 16 14:02:18.765913 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:18.765831 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-r5ngv\"" Apr 16 14:02:18.773832 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:18.773803 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-s7ltv" Apr 16 14:02:18.901493 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:18.901411 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-s7ltv"] Apr 16 14:02:18.903717 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:02:18.903682 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6da34735_0aa6_4efc_88b2_81738c442f3f.slice/crio-55d4e934e4bcae950e5bb3d5e5e90b05039886125615eeda2904e28e4fe32d63 WatchSource:0}: Error finding container 55d4e934e4bcae950e5bb3d5e5e90b05039886125615eeda2904e28e4fe32d63: Status 404 returned error can't find the container with id 55d4e934e4bcae950e5bb3d5e5e90b05039886125615eeda2904e28e4fe32d63 Apr 16 14:02:19.679559 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:19.679514 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s7ltv" event={"ID":"6da34735-0aa6-4efc-88b2-81738c442f3f","Type":"ContainerStarted","Data":"55d4e934e4bcae950e5bb3d5e5e90b05039886125615eeda2904e28e4fe32d63"} Apr 16 14:02:21.687569 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:21.687520 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s7ltv" event={"ID":"6da34735-0aa6-4efc-88b2-81738c442f3f","Type":"ContainerStarted","Data":"e85b9ed1e9f9baea412362b1fdd28d66b2e121066c55253341e2fa613022337e"} Apr 16 14:02:21.687569 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:21.687573 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s7ltv" event={"ID":"6da34735-0aa6-4efc-88b2-81738c442f3f","Type":"ContainerStarted","Data":"598491bd38f1ced8e9eb0ecbd8169abec4db2c3bd3c7ee75f2b28eaaa5fe2052"} Apr 16 14:02:21.688077 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:21.687714 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-s7ltv" Apr 16 14:02:21.707572 
ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:21.707515 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-s7ltv" podStartSLOduration=129.606669474 podStartE2EDuration="2m11.707502224s" podCreationTimestamp="2026-04-16 14:00:10 +0000 UTC" firstStartedPulling="2026-04-16 14:02:18.905684697 +0000 UTC m=+162.341850405" lastFinishedPulling="2026-04-16 14:02:21.006517447 +0000 UTC m=+164.442683155" observedRunningTime="2026-04-16 14:02:21.705305292 +0000 UTC m=+165.141471017" watchObservedRunningTime="2026-04-16 14:02:21.707502224 +0000 UTC m=+165.143667976" Apr 16 14:02:23.497541 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:23.497512 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6b7848b7dd-drn4q" Apr 16 14:02:23.501499 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:23.501477 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6b7848b7dd-drn4q" Apr 16 14:02:23.571164 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:23.571129 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5dcd489fbf-rsjq7"] Apr 16 14:02:25.633030 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:25.632995 2580 patch_prober.go:28] interesting pod/image-registry-54bd994b84-872wz container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:02:25.633481 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:25.633047 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-54bd994b84-872wz" podUID="5ba4c449-3beb-4e89-93c7-614b31fdfa9d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:02:25.633481 
ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:25.633085 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:02:25.633608 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:25.633543 2580 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"c123d82bf46e2b8d5647fdfebee7b1f5975c5deeab3fe94608f5669612e1956d"} pod="openshift-image-registry/image-registry-54bd994b84-872wz" containerMessage="Container registry failed liveness probe, will be restarted" Apr 16 14:02:25.636859 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:25.636833 2580 patch_prober.go:28] interesting pod/image-registry-54bd994b84-872wz container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:02:25.636984 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:25.636872 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-54bd994b84-872wz" podUID="5ba4c449-3beb-4e89-93c7-614b31fdfa9d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:02:27.149447 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:27.149409 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bnz8h" Apr 16 14:02:27.152537 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:27.152517 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-r2bfh\"" Apr 16 14:02:27.160653 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:27.160636 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bnz8h" Apr 16 14:02:27.284069 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:27.284033 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bnz8h"] Apr 16 14:02:27.287043 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:02:27.287007 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f0fe291_cfdb_4a2a_afac_62a29d0fbfa8.slice/crio-61881dd6633147f5ca4b7e27dea802a1c194573b97de7e680f07c643413934f1 WatchSource:0}: Error finding container 61881dd6633147f5ca4b7e27dea802a1c194573b97de7e680f07c643413934f1: Status 404 returned error can't find the container with id 61881dd6633147f5ca4b7e27dea802a1c194573b97de7e680f07c643413934f1 Apr 16 14:02:27.710563 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:27.710522 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bnz8h" event={"ID":"4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8","Type":"ContainerStarted","Data":"61881dd6633147f5ca4b7e27dea802a1c194573b97de7e680f07c643413934f1"} Apr 16 14:02:28.166807 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:28.166761 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-66d9c4554f-v6wpk" podUID="cfaddedb-dd86-4549-b03e-a26c6e8ffe8d" containerName="console" containerID="cri-o://86bdf9170ab8e6006477d3d43670ba73ac222bae7f17fe6fab3f337220ef597d" gracePeriod=15 Apr 16 14:02:28.716294 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:28.716250 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66d9c4554f-v6wpk_cfaddedb-dd86-4549-b03e-a26c6e8ffe8d/console/0.log" Apr 16 14:02:28.716492 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:28.716327 2580 generic.go:358] "Generic (PLEG): container finished" podID="cfaddedb-dd86-4549-b03e-a26c6e8ffe8d" 
containerID="86bdf9170ab8e6006477d3d43670ba73ac222bae7f17fe6fab3f337220ef597d" exitCode=2 Apr 16 14:02:28.716492 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:28.716407 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66d9c4554f-v6wpk" event={"ID":"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d","Type":"ContainerDied","Data":"86bdf9170ab8e6006477d3d43670ba73ac222bae7f17fe6fab3f337220ef597d"} Apr 16 14:02:28.820025 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:28.820004 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66d9c4554f-v6wpk_cfaddedb-dd86-4549-b03e-a26c6e8ffe8d/console/0.log" Apr 16 14:02:28.820145 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:28.820065 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:02:28.962991 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:28.962958 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-console-config\") pod \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " Apr 16 14:02:28.963163 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:28.963021 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-service-ca\") pod \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " Apr 16 14:02:28.963163 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:28.963044 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-console-serving-cert\") pod \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " Apr 16 
14:02:28.963314 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:28.963164 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-oauth-serving-cert\") pod \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " Apr 16 14:02:28.963314 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:28.963202 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frhfk\" (UniqueName: \"kubernetes.io/projected/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-kube-api-access-frhfk\") pod \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " Apr 16 14:02:28.963314 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:28.963237 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-console-oauth-config\") pod \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\" (UID: \"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d\") " Apr 16 14:02:28.963524 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:28.963498 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-service-ca" (OuterVolumeSpecName: "service-ca") pod "cfaddedb-dd86-4549-b03e-a26c6e8ffe8d" (UID: "cfaddedb-dd86-4549-b03e-a26c6e8ffe8d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:02:28.963586 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:28.963505 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-console-config" (OuterVolumeSpecName: "console-config") pod "cfaddedb-dd86-4549-b03e-a26c6e8ffe8d" (UID: "cfaddedb-dd86-4549-b03e-a26c6e8ffe8d"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:02:28.963926 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:28.963794 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cfaddedb-dd86-4549-b03e-a26c6e8ffe8d" (UID: "cfaddedb-dd86-4549-b03e-a26c6e8ffe8d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:02:28.965655 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:28.965630 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cfaddedb-dd86-4549-b03e-a26c6e8ffe8d" (UID: "cfaddedb-dd86-4549-b03e-a26c6e8ffe8d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:02:28.965773 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:28.965681 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cfaddedb-dd86-4549-b03e-a26c6e8ffe8d" (UID: "cfaddedb-dd86-4549-b03e-a26c6e8ffe8d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:02:28.965773 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:28.965729 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-kube-api-access-frhfk" (OuterVolumeSpecName: "kube-api-access-frhfk") pod "cfaddedb-dd86-4549-b03e-a26c6e8ffe8d" (UID: "cfaddedb-dd86-4549-b03e-a26c6e8ffe8d"). InnerVolumeSpecName "kube-api-access-frhfk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:02:29.064218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:29.064166 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-service-ca\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:29.064218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:29.064212 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-console-serving-cert\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:29.064218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:29.064229 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-oauth-serving-cert\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:29.064553 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:29.064243 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-frhfk\" (UniqueName: \"kubernetes.io/projected/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-kube-api-access-frhfk\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:29.064553 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:29.064258 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-console-oauth-config\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:29.064553 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:29.064296 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d-console-config\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:29.721024 ip-10-0-129-3 kubenswrapper[2580]: I0416 
14:02:29.720985 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bnz8h" event={"ID":"4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8","Type":"ContainerStarted","Data":"29eb03045af4a42c03e774c881cb4e1fe660e66c9fb96e99acc341e677802ce0"} Apr 16 14:02:29.722108 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:29.722091 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66d9c4554f-v6wpk_cfaddedb-dd86-4549-b03e-a26c6e8ffe8d/console/0.log" Apr 16 14:02:29.722211 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:29.722177 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66d9c4554f-v6wpk" event={"ID":"cfaddedb-dd86-4549-b03e-a26c6e8ffe8d","Type":"ContainerDied","Data":"bb687a2482ee983918437d355f404945139ef96b7d1d5524cafb93eafc8967ac"} Apr 16 14:02:29.722211 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:29.722197 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66d9c4554f-v6wpk" Apr 16 14:02:29.722322 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:29.722202 2580 scope.go:117] "RemoveContainer" containerID="86bdf9170ab8e6006477d3d43670ba73ac222bae7f17fe6fab3f337220ef597d" Apr 16 14:02:29.737909 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:29.737861 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bnz8h" podStartSLOduration=138.173748487 podStartE2EDuration="2m19.737843446s" podCreationTimestamp="2026-04-16 14:00:10 +0000 UTC" firstStartedPulling="2026-04-16 14:02:27.28884035 +0000 UTC m=+170.725006060" lastFinishedPulling="2026-04-16 14:02:28.852935312 +0000 UTC m=+172.289101019" observedRunningTime="2026-04-16 14:02:29.737344074 +0000 UTC m=+173.173509792" watchObservedRunningTime="2026-04-16 14:02:29.737843446 +0000 UTC m=+173.174009175" Apr 16 14:02:29.754718 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:29.754685 2580 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66d9c4554f-v6wpk"] Apr 16 14:02:29.758140 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:29.758110 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66d9c4554f-v6wpk"] Apr 16 14:02:31.152858 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:31.152820 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfaddedb-dd86-4549-b03e-a26c6e8ffe8d" path="/var/lib/kubelet/pods/cfaddedb-dd86-4549-b03e-a26c6e8ffe8d/volumes" Apr 16 14:02:31.694325 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:31.694294 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-s7ltv" Apr 16 14:02:35.637128 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:35.637100 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:02:36.743979 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:36.743943 2580 generic.go:358] "Generic (PLEG): container finished" podID="bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38" containerID="64ee35c5da680c46692375cb2aedd3b4961ebab998a71187935e73c2aecc0e21" exitCode=0 Apr 16 14:02:36.744368 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:36.744021 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" event={"ID":"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38","Type":"ContainerDied","Data":"64ee35c5da680c46692375cb2aedd3b4961ebab998a71187935e73c2aecc0e21"} Apr 16 14:02:36.744410 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:36.744379 2580 scope.go:117] "RemoveContainer" containerID="64ee35c5da680c46692375cb2aedd3b4961ebab998a71187935e73c2aecc0e21" Apr 16 14:02:37.748817 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:37.748777 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-24gbs" 
event={"ID":"bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38","Type":"ContainerStarted","Data":"e6374988c303e77062ae632917ecc64a4cdba869018880145edb021708b87123"} Apr 16 14:02:48.591337 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:48.591257 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5dcd489fbf-rsjq7" podUID="ae1941d9-4bee-4735-a3df-c001d4b0a8e5" containerName="console" containerID="cri-o://9de052df27c461b800b41aa8d9e81095c23151f649c57fa674e383a423b6720f" gracePeriod=15 Apr 16 14:02:48.783977 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:48.783950 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5dcd489fbf-rsjq7_ae1941d9-4bee-4735-a3df-c001d4b0a8e5/console/0.log" Apr 16 14:02:48.784144 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:48.783998 2580 generic.go:358] "Generic (PLEG): container finished" podID="ae1941d9-4bee-4735-a3df-c001d4b0a8e5" containerID="9de052df27c461b800b41aa8d9e81095c23151f649c57fa674e383a423b6720f" exitCode=2 Apr 16 14:02:48.784144 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:48.784048 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dcd489fbf-rsjq7" event={"ID":"ae1941d9-4bee-4735-a3df-c001d4b0a8e5","Type":"ContainerDied","Data":"9de052df27c461b800b41aa8d9e81095c23151f649c57fa674e383a423b6720f"} Apr 16 14:02:48.869514 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:48.869492 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5dcd489fbf-rsjq7_ae1941d9-4bee-4735-a3df-c001d4b0a8e5/console/0.log" Apr 16 14:02:48.869626 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:48.869552 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:02:48.932398 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:48.932368 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-console-serving-cert\") pod \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " Apr 16 14:02:48.932580 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:48.932411 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-service-ca\") pod \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " Apr 16 14:02:48.932580 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:48.932446 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tshrt\" (UniqueName: \"kubernetes.io/projected/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-kube-api-access-tshrt\") pod \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " Apr 16 14:02:48.932580 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:48.932469 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-oauth-serving-cert\") pod \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " Apr 16 14:02:48.932580 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:48.932514 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-console-oauth-config\") pod \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " Apr 16 14:02:48.932580 ip-10-0-129-3 
kubenswrapper[2580]: I0416 14:02:48.932537 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-trusted-ca-bundle\") pod \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " Apr 16 14:02:48.932580 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:48.932580 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-console-config\") pod \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\" (UID: \"ae1941d9-4bee-4735-a3df-c001d4b0a8e5\") " Apr 16 14:02:48.932875 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:48.932840 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-service-ca" (OuterVolumeSpecName: "service-ca") pod "ae1941d9-4bee-4735-a3df-c001d4b0a8e5" (UID: "ae1941d9-4bee-4735-a3df-c001d4b0a8e5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:02:48.933009 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:48.932966 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ae1941d9-4bee-4735-a3df-c001d4b0a8e5" (UID: "ae1941d9-4bee-4735-a3df-c001d4b0a8e5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:02:48.933102 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:48.933038 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-console-config" (OuterVolumeSpecName: "console-config") pod "ae1941d9-4bee-4735-a3df-c001d4b0a8e5" (UID: "ae1941d9-4bee-4735-a3df-c001d4b0a8e5"). 
InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:02:48.933172 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:48.933155 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ae1941d9-4bee-4735-a3df-c001d4b0a8e5" (UID: "ae1941d9-4bee-4735-a3df-c001d4b0a8e5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:02:48.935116 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:48.935091 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ae1941d9-4bee-4735-a3df-c001d4b0a8e5" (UID: "ae1941d9-4bee-4735-a3df-c001d4b0a8e5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:02:48.935350 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:48.935330 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ae1941d9-4bee-4735-a3df-c001d4b0a8e5" (UID: "ae1941d9-4bee-4735-a3df-c001d4b0a8e5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:02:48.935420 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:48.935383 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-kube-api-access-tshrt" (OuterVolumeSpecName: "kube-api-access-tshrt") pod "ae1941d9-4bee-4735-a3df-c001d4b0a8e5" (UID: "ae1941d9-4bee-4735-a3df-c001d4b0a8e5"). InnerVolumeSpecName "kube-api-access-tshrt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:02:49.033399 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:49.033362 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-console-serving-cert\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:49.033399 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:49.033397 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-service-ca\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:49.033399 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:49.033408 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tshrt\" (UniqueName: \"kubernetes.io/projected/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-kube-api-access-tshrt\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:49.033621 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:49.033418 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-oauth-serving-cert\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:49.033621 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:49.033428 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-console-oauth-config\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:49.033621 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:49.033437 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-trusted-ca-bundle\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:49.033621 ip-10-0-129-3 kubenswrapper[2580]: 
I0416 14:02:49.033448 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae1941d9-4bee-4735-a3df-c001d4b0a8e5-console-config\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:02:49.789437 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:49.789407 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5dcd489fbf-rsjq7_ae1941d9-4bee-4735-a3df-c001d4b0a8e5/console/0.log" Apr 16 14:02:49.789888 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:49.789500 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dcd489fbf-rsjq7" event={"ID":"ae1941d9-4bee-4735-a3df-c001d4b0a8e5","Type":"ContainerDied","Data":"a159615bf57f8e77898408fc014b74673b3b3d36fc024c7990a7f26152db9dd9"} Apr 16 14:02:49.789888 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:49.789527 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dcd489fbf-rsjq7" Apr 16 14:02:49.789888 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:49.789541 2580 scope.go:117] "RemoveContainer" containerID="9de052df27c461b800b41aa8d9e81095c23151f649c57fa674e383a423b6720f" Apr 16 14:02:49.810437 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:49.810400 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5dcd489fbf-rsjq7"] Apr 16 14:02:49.815993 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:49.815967 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5dcd489fbf-rsjq7"] Apr 16 14:02:50.654781 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:50.654740 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-54bd994b84-872wz" podUID="5ba4c449-3beb-4e89-93c7-614b31fdfa9d" containerName="registry" containerID="cri-o://c123d82bf46e2b8d5647fdfebee7b1f5975c5deeab3fe94608f5669612e1956d" gracePeriod=30 Apr 16 
14:02:50.793967 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:50.793932 2580 generic.go:358] "Generic (PLEG): container finished" podID="93bf1779-6f22-4509-a332-64a1d071a5a0" containerID="5181e2dd55d88588bdd619d0622580d42cddcfcdf80e9d2a7b0148c531f2de52" exitCode=0 Apr 16 14:02:50.794418 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:50.793967 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wv92n" event={"ID":"93bf1779-6f22-4509-a332-64a1d071a5a0","Type":"ContainerDied","Data":"5181e2dd55d88588bdd619d0622580d42cddcfcdf80e9d2a7b0148c531f2de52"} Apr 16 14:02:50.794481 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:50.794417 2580 scope.go:117] "RemoveContainer" containerID="5181e2dd55d88588bdd619d0622580d42cddcfcdf80e9d2a7b0148c531f2de52" Apr 16 14:02:51.152681 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:51.152648 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae1941d9-4bee-4735-a3df-c001d4b0a8e5" path="/var/lib/kubelet/pods/ae1941d9-4bee-4735-a3df-c001d4b0a8e5/volumes" Apr 16 14:02:51.799498 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:51.799444 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wv92n" event={"ID":"93bf1779-6f22-4509-a332-64a1d071a5a0","Type":"ContainerStarted","Data":"58f7cc9c8998fbc12a9a39077a564274f4ac2b273533f7f78ffa238d0f52831a"} Apr 16 14:02:51.800819 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:51.800798 2580 generic.go:358] "Generic (PLEG): container finished" podID="5ba4c449-3beb-4e89-93c7-614b31fdfa9d" containerID="c123d82bf46e2b8d5647fdfebee7b1f5975c5deeab3fe94608f5669612e1956d" exitCode=0 Apr 16 14:02:51.800917 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:51.800827 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54bd994b84-872wz" 
event={"ID":"5ba4c449-3beb-4e89-93c7-614b31fdfa9d","Type":"ContainerDied","Data":"c123d82bf46e2b8d5647fdfebee7b1f5975c5deeab3fe94608f5669612e1956d"} Apr 16 14:02:52.809847 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:52.809811 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54bd994b84-872wz" event={"ID":"5ba4c449-3beb-4e89-93c7-614b31fdfa9d","Type":"ContainerStarted","Data":"d1f9f6dc83d4add5de54a1bdcfae05b54510b8636e31a1e21f91c5a3dc013f4c"} Apr 16 14:02:52.810239 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:02:52.809927 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:03:13.816213 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:13.816187 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-54bd994b84-872wz" Apr 16 14:03:33.447573 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.447535 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-687949c9dd-r7xzt"] Apr 16 14:03:33.448142 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.447859 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfaddedb-dd86-4549-b03e-a26c6e8ffe8d" containerName="console" Apr 16 14:03:33.448142 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.447877 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfaddedb-dd86-4549-b03e-a26c6e8ffe8d" containerName="console" Apr 16 14:03:33.448142 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.447886 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae1941d9-4bee-4735-a3df-c001d4b0a8e5" containerName="console" Apr 16 14:03:33.448142 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.447891 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae1941d9-4bee-4735-a3df-c001d4b0a8e5" containerName="console" Apr 16 14:03:33.448142 
ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.447899 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94f41185-ca08-41f9-bc9f-f22802de6d09" containerName="registry" Apr 16 14:03:33.448142 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.447905 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f41185-ca08-41f9-bc9f-f22802de6d09" containerName="registry" Apr 16 14:03:33.448142 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.447984 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="cfaddedb-dd86-4549-b03e-a26c6e8ffe8d" containerName="console" Apr 16 14:03:33.448142 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.447997 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae1941d9-4bee-4735-a3df-c001d4b0a8e5" containerName="console" Apr 16 14:03:33.448142 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.448009 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="94f41185-ca08-41f9-bc9f-f22802de6d09" containerName="registry" Apr 16 14:03:33.449985 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.449967 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.461465 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.461440 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-687949c9dd-r7xzt"] Apr 16 14:03:33.592042 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.592009 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-trusted-ca-bundle\") pod \"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.592042 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.592055 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-oauth-serving-cert\") pod \"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.592247 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.592075 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-service-ca\") pod \"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.592247 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.592093 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-console-config\") pod \"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.592247 
ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.592191 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b0c7df2-31b3-464e-ba87-8ee223a03e65-console-oauth-config\") pod \"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.592247 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.592228 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b0c7df2-31b3-464e-ba87-8ee223a03e65-console-serving-cert\") pod \"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.592247 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.592248 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v24q6\" (UniqueName: \"kubernetes.io/projected/6b0c7df2-31b3-464e-ba87-8ee223a03e65-kube-api-access-v24q6\") pod \"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.693397 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.693361 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-trusted-ca-bundle\") pod \"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.693530 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.693402 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-oauth-serving-cert\") pod 
\"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.693567 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.693524 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-service-ca\") pod \"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.693567 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.693560 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-console-config\") pod \"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.693652 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.693639 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b0c7df2-31b3-464e-ba87-8ee223a03e65-console-oauth-config\") pod \"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.693698 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.693668 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b0c7df2-31b3-464e-ba87-8ee223a03e65-console-serving-cert\") pod \"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.693750 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.693698 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v24q6\" (UniqueName: 
\"kubernetes.io/projected/6b0c7df2-31b3-464e-ba87-8ee223a03e65-kube-api-access-v24q6\") pod \"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.694191 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.694169 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-oauth-serving-cert\") pod \"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.694347 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.694322 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-service-ca\") pod \"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.694413 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.694344 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-trusted-ca-bundle\") pod \"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.694413 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.694364 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-console-config\") pod \"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.696394 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.696374 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/6b0c7df2-31b3-464e-ba87-8ee223a03e65-console-oauth-config\") pod \"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.696479 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.696430 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b0c7df2-31b3-464e-ba87-8ee223a03e65-console-serving-cert\") pod \"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.701470 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.701421 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v24q6\" (UniqueName: \"kubernetes.io/projected/6b0c7df2-31b3-464e-ba87-8ee223a03e65-kube-api-access-v24q6\") pod \"console-687949c9dd-r7xzt\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.759117 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.759080 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:33.879665 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.879617 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-687949c9dd-r7xzt"] Apr 16 14:03:33.881794 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:03:33.881769 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b0c7df2_31b3_464e_ba87_8ee223a03e65.slice/crio-c41e73d8b039ec1b60b4142afda8e3628ff3bc3baf357267d1a6a7b290c791b7 WatchSource:0}: Error finding container c41e73d8b039ec1b60b4142afda8e3628ff3bc3baf357267d1a6a7b290c791b7: Status 404 returned error can't find the container with id c41e73d8b039ec1b60b4142afda8e3628ff3bc3baf357267d1a6a7b290c791b7 Apr 16 14:03:33.933420 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:33.933396 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-687949c9dd-r7xzt" event={"ID":"6b0c7df2-31b3-464e-ba87-8ee223a03e65","Type":"ContainerStarted","Data":"c41e73d8b039ec1b60b4142afda8e3628ff3bc3baf357267d1a6a7b290c791b7"} Apr 16 14:03:34.937420 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:34.937375 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-687949c9dd-r7xzt" event={"ID":"6b0c7df2-31b3-464e-ba87-8ee223a03e65","Type":"ContainerStarted","Data":"93313274f96a132f10e8f35843a43c5a074084c057e309726e43e9c18850507e"} Apr 16 14:03:34.956328 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:34.956244 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-687949c9dd-r7xzt" podStartSLOduration=1.9562271519999999 podStartE2EDuration="1.956227152s" podCreationTimestamp="2026-04-16 14:03:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:03:34.954114829 +0000 UTC m=+238.390280558" 
watchObservedRunningTime="2026-04-16 14:03:34.956227152 +0000 UTC m=+238.392392875" Apr 16 14:03:43.759764 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:43.759723 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:43.760340 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:43.759800 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:43.764864 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:43.764840 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:43.965897 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:43.965868 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:03:44.015997 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:03:44.015910 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b7848b7dd-drn4q"] Apr 16 14:04:09.035738 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.035635 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6b7848b7dd-drn4q" podUID="0b631acf-ab3d-47cf-827e-3bf2e6100a18" containerName="console" containerID="cri-o://920634e0a5f902d77d5e239fcc64f5323585f4cbdfea92d216ddc5e71edfbeef" gracePeriod=15 Apr 16 14:04:09.279629 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.279608 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b7848b7dd-drn4q_0b631acf-ab3d-47cf-827e-3bf2e6100a18/console/0.log" Apr 16 14:04:09.279738 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.279665 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b7848b7dd-drn4q" Apr 16 14:04:09.379341 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.379306 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b631acf-ab3d-47cf-827e-3bf2e6100a18-console-oauth-config\") pod \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " Apr 16 14:04:09.379526 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.379365 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ttwh\" (UniqueName: \"kubernetes.io/projected/0b631acf-ab3d-47cf-827e-3bf2e6100a18-kube-api-access-2ttwh\") pod \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " Apr 16 14:04:09.379526 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.379411 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-trusted-ca-bundle\") pod \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " Apr 16 14:04:09.379526 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.379440 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-console-config\") pod \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " Apr 16 14:04:09.379526 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.379480 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-oauth-serving-cert\") pod \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " Apr 16 14:04:09.379791 
ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.379527 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-service-ca\") pod \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " Apr 16 14:04:09.379791 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.379568 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b631acf-ab3d-47cf-827e-3bf2e6100a18-console-serving-cert\") pod \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\" (UID: \"0b631acf-ab3d-47cf-827e-3bf2e6100a18\") " Apr 16 14:04:09.379943 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.379910 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-console-config" (OuterVolumeSpecName: "console-config") pod "0b631acf-ab3d-47cf-827e-3bf2e6100a18" (UID: "0b631acf-ab3d-47cf-827e-3bf2e6100a18"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:04:09.380009 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.379929 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0b631acf-ab3d-47cf-827e-3bf2e6100a18" (UID: "0b631acf-ab3d-47cf-827e-3bf2e6100a18"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:04:09.380009 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.379936 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0b631acf-ab3d-47cf-827e-3bf2e6100a18" (UID: "0b631acf-ab3d-47cf-827e-3bf2e6100a18"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:04:09.380009 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.379942 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b631acf-ab3d-47cf-827e-3bf2e6100a18" (UID: "0b631acf-ab3d-47cf-827e-3bf2e6100a18"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:04:09.381741 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.381710 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b631acf-ab3d-47cf-827e-3bf2e6100a18-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0b631acf-ab3d-47cf-827e-3bf2e6100a18" (UID: "0b631acf-ab3d-47cf-827e-3bf2e6100a18"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:04:09.381847 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.381756 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b631acf-ab3d-47cf-827e-3bf2e6100a18-kube-api-access-2ttwh" (OuterVolumeSpecName: "kube-api-access-2ttwh") pod "0b631acf-ab3d-47cf-827e-3bf2e6100a18" (UID: "0b631acf-ab3d-47cf-827e-3bf2e6100a18"). InnerVolumeSpecName "kube-api-access-2ttwh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:04:09.381892 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.381853 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b631acf-ab3d-47cf-827e-3bf2e6100a18-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0b631acf-ab3d-47cf-827e-3bf2e6100a18" (UID: "0b631acf-ab3d-47cf-827e-3bf2e6100a18"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:04:09.480883 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.480846 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-console-config\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:04:09.480883 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.480876 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-oauth-serving-cert\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:04:09.480883 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.480885 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-service-ca\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:04:09.481107 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.480896 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b631acf-ab3d-47cf-827e-3bf2e6100a18-console-serving-cert\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:04:09.481107 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.480906 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/0b631acf-ab3d-47cf-827e-3bf2e6100a18-console-oauth-config\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:04:09.481107 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.480915 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2ttwh\" (UniqueName: \"kubernetes.io/projected/0b631acf-ab3d-47cf-827e-3bf2e6100a18-kube-api-access-2ttwh\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:04:09.481107 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:09.480924 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b631acf-ab3d-47cf-827e-3bf2e6100a18-trusted-ca-bundle\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:04:10.036912 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.036886 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b7848b7dd-drn4q_0b631acf-ab3d-47cf-827e-3bf2e6100a18/console/0.log" Apr 16 14:04:10.037303 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.036925 2580 generic.go:358] "Generic (PLEG): container finished" podID="0b631acf-ab3d-47cf-827e-3bf2e6100a18" containerID="920634e0a5f902d77d5e239fcc64f5323585f4cbdfea92d216ddc5e71edfbeef" exitCode=2 Apr 16 14:04:10.037303 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.036986 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b7848b7dd-drn4q" Apr 16 14:04:10.037303 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.036995 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b7848b7dd-drn4q" event={"ID":"0b631acf-ab3d-47cf-827e-3bf2e6100a18","Type":"ContainerDied","Data":"920634e0a5f902d77d5e239fcc64f5323585f4cbdfea92d216ddc5e71edfbeef"} Apr 16 14:04:10.037303 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.037023 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b7848b7dd-drn4q" event={"ID":"0b631acf-ab3d-47cf-827e-3bf2e6100a18","Type":"ContainerDied","Data":"77d2b8987acfca392a86a0f0b80e50e0e76e2b0b61d35ce76f431c573f1a6f6c"} Apr 16 14:04:10.037303 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.037036 2580 scope.go:117] "RemoveContainer" containerID="920634e0a5f902d77d5e239fcc64f5323585f4cbdfea92d216ddc5e71edfbeef" Apr 16 14:04:10.045140 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.045123 2580 scope.go:117] "RemoveContainer" containerID="920634e0a5f902d77d5e239fcc64f5323585f4cbdfea92d216ddc5e71edfbeef" Apr 16 14:04:10.045432 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:04:10.045414 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"920634e0a5f902d77d5e239fcc64f5323585f4cbdfea92d216ddc5e71edfbeef\": container with ID starting with 920634e0a5f902d77d5e239fcc64f5323585f4cbdfea92d216ddc5e71edfbeef not found: ID does not exist" containerID="920634e0a5f902d77d5e239fcc64f5323585f4cbdfea92d216ddc5e71edfbeef" Apr 16 14:04:10.045505 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.045440 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"920634e0a5f902d77d5e239fcc64f5323585f4cbdfea92d216ddc5e71edfbeef"} err="failed to get container status \"920634e0a5f902d77d5e239fcc64f5323585f4cbdfea92d216ddc5e71edfbeef\": rpc error: code = NotFound desc 
= could not find container \"920634e0a5f902d77d5e239fcc64f5323585f4cbdfea92d216ddc5e71edfbeef\": container with ID starting with 920634e0a5f902d77d5e239fcc64f5323585f4cbdfea92d216ddc5e71edfbeef not found: ID does not exist" Apr 16 14:04:10.057907 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.057884 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b7848b7dd-drn4q"] Apr 16 14:04:10.060613 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.060594 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6b7848b7dd-drn4q"] Apr 16 14:04:10.221137 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.221101 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-245wx"] Apr 16 14:04:10.221439 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.221426 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b631acf-ab3d-47cf-827e-3bf2e6100a18" containerName="console" Apr 16 14:04:10.221488 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.221441 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b631acf-ab3d-47cf-827e-3bf2e6100a18" containerName="console" Apr 16 14:04:10.221522 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.221498 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b631acf-ab3d-47cf-827e-3bf2e6100a18" containerName="console" Apr 16 14:04:10.223755 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.223740 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-245wx" Apr 16 14:04:10.226159 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.226129 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 14:04:10.232306 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.232287 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-245wx"] Apr 16 14:04:10.287960 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.287880 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/043c6d8a-c412-4df7-8745-09796830b9f1-kubelet-config\") pod \"global-pull-secret-syncer-245wx\" (UID: \"043c6d8a-c412-4df7-8745-09796830b9f1\") " pod="kube-system/global-pull-secret-syncer-245wx" Apr 16 14:04:10.287960 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.287917 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/043c6d8a-c412-4df7-8745-09796830b9f1-dbus\") pod \"global-pull-secret-syncer-245wx\" (UID: \"043c6d8a-c412-4df7-8745-09796830b9f1\") " pod="kube-system/global-pull-secret-syncer-245wx" Apr 16 14:04:10.288129 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.288020 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/043c6d8a-c412-4df7-8745-09796830b9f1-original-pull-secret\") pod \"global-pull-secret-syncer-245wx\" (UID: \"043c6d8a-c412-4df7-8745-09796830b9f1\") " pod="kube-system/global-pull-secret-syncer-245wx" Apr 16 14:04:10.389077 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.389042 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/043c6d8a-c412-4df7-8745-09796830b9f1-original-pull-secret\") pod \"global-pull-secret-syncer-245wx\" (UID: \"043c6d8a-c412-4df7-8745-09796830b9f1\") " pod="kube-system/global-pull-secret-syncer-245wx" Apr 16 14:04:10.389221 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.389086 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/043c6d8a-c412-4df7-8745-09796830b9f1-kubelet-config\") pod \"global-pull-secret-syncer-245wx\" (UID: \"043c6d8a-c412-4df7-8745-09796830b9f1\") " pod="kube-system/global-pull-secret-syncer-245wx" Apr 16 14:04:10.389221 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.389109 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/043c6d8a-c412-4df7-8745-09796830b9f1-dbus\") pod \"global-pull-secret-syncer-245wx\" (UID: \"043c6d8a-c412-4df7-8745-09796830b9f1\") " pod="kube-system/global-pull-secret-syncer-245wx" Apr 16 14:04:10.389321 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.389205 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/043c6d8a-c412-4df7-8745-09796830b9f1-kubelet-config\") pod \"global-pull-secret-syncer-245wx\" (UID: \"043c6d8a-c412-4df7-8745-09796830b9f1\") " pod="kube-system/global-pull-secret-syncer-245wx" Apr 16 14:04:10.389321 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.389295 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/043c6d8a-c412-4df7-8745-09796830b9f1-dbus\") pod \"global-pull-secret-syncer-245wx\" (UID: \"043c6d8a-c412-4df7-8745-09796830b9f1\") " pod="kube-system/global-pull-secret-syncer-245wx" Apr 16 14:04:10.391492 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.391470 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/043c6d8a-c412-4df7-8745-09796830b9f1-original-pull-secret\") pod \"global-pull-secret-syncer-245wx\" (UID: \"043c6d8a-c412-4df7-8745-09796830b9f1\") " pod="kube-system/global-pull-secret-syncer-245wx" Apr 16 14:04:10.533097 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.533054 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-245wx" Apr 16 14:04:10.654203 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:10.654172 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-245wx"] Apr 16 14:04:10.657247 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:04:10.657208 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod043c6d8a_c412_4df7_8745_09796830b9f1.slice/crio-dc7ae7ad0fbce0481e09d0cb386cd01cd07cba9355861d65eb4bafc6584a8945 WatchSource:0}: Error finding container dc7ae7ad0fbce0481e09d0cb386cd01cd07cba9355861d65eb4bafc6584a8945: Status 404 returned error can't find the container with id dc7ae7ad0fbce0481e09d0cb386cd01cd07cba9355861d65eb4bafc6584a8945 Apr 16 14:04:11.041831 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:11.041796 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-245wx" event={"ID":"043c6d8a-c412-4df7-8745-09796830b9f1","Type":"ContainerStarted","Data":"dc7ae7ad0fbce0481e09d0cb386cd01cd07cba9355861d65eb4bafc6584a8945"} Apr 16 14:04:11.152150 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:11.152119 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b631acf-ab3d-47cf-827e-3bf2e6100a18" path="/var/lib/kubelet/pods/0b631acf-ab3d-47cf-827e-3bf2e6100a18/volumes" Apr 16 14:04:16.056793 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:16.056753 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-245wx" 
event={"ID":"043c6d8a-c412-4df7-8745-09796830b9f1","Type":"ContainerStarted","Data":"688d27d91c2e63d43f3fd9d0fa1a2a258932824c5a42585c0246f30d3a37c664"} Apr 16 14:04:16.069778 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:16.069728 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-245wx" podStartSLOduration=1.50990047 podStartE2EDuration="6.069709693s" podCreationTimestamp="2026-04-16 14:04:10 +0000 UTC" firstStartedPulling="2026-04-16 14:04:10.658792296 +0000 UTC m=+274.094958014" lastFinishedPulling="2026-04-16 14:04:15.218601531 +0000 UTC m=+278.654767237" observedRunningTime="2026-04-16 14:04:16.069693865 +0000 UTC m=+279.505859594" watchObservedRunningTime="2026-04-16 14:04:16.069709693 +0000 UTC m=+279.505875417" Apr 16 14:04:37.015010 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:37.014974 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-mqmmd_49030659-7d98-49ee-844f-41ff4d22d449/console-operator/1.log" Apr 16 14:04:37.015687 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:37.015369 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-mqmmd_49030659-7d98-49ee-844f-41ff4d22d449/console-operator/1.log" Apr 16 14:04:37.032994 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:04:37.032971 2580 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 14:05:19.464314 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:19.464263 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc"] Apr 16 14:05:19.466438 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:19.466422 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc"
Apr 16 14:05:19.469169 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:19.469145 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 14:05:19.469321 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:19.469198 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-tl9st\""
Apr 16 14:05:19.490454 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:19.489816 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 14:05:19.495353 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:19.493087 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc"]
Apr 16 14:05:19.533984 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:19.533947 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8krc\" (UniqueName: \"kubernetes.io/projected/f44765a5-cf2e-4167-a0e9-e12d257427aa-kube-api-access-g8krc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc\" (UID: \"f44765a5-cf2e-4167-a0e9-e12d257427aa\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc"
Apr 16 14:05:19.534152 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:19.534027 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f44765a5-cf2e-4167-a0e9-e12d257427aa-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc\" (UID: \"f44765a5-cf2e-4167-a0e9-e12d257427aa\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc"
Apr 16 14:05:19.534152 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:19.534044 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f44765a5-cf2e-4167-a0e9-e12d257427aa-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc\" (UID: \"f44765a5-cf2e-4167-a0e9-e12d257427aa\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc"
Apr 16 14:05:19.635437 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:19.635393 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f44765a5-cf2e-4167-a0e9-e12d257427aa-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc\" (UID: \"f44765a5-cf2e-4167-a0e9-e12d257427aa\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc"
Apr 16 14:05:19.635437 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:19.635439 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f44765a5-cf2e-4167-a0e9-e12d257427aa-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc\" (UID: \"f44765a5-cf2e-4167-a0e9-e12d257427aa\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc"
Apr 16 14:05:19.635635 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:19.635477 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8krc\" (UniqueName: \"kubernetes.io/projected/f44765a5-cf2e-4167-a0e9-e12d257427aa-kube-api-access-g8krc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc\" (UID: \"f44765a5-cf2e-4167-a0e9-e12d257427aa\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc"
Apr 16 14:05:19.635789 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:19.635769 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f44765a5-cf2e-4167-a0e9-e12d257427aa-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc\" (UID: \"f44765a5-cf2e-4167-a0e9-e12d257427aa\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc"
Apr 16 14:05:19.635828 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:19.635795 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f44765a5-cf2e-4167-a0e9-e12d257427aa-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc\" (UID: \"f44765a5-cf2e-4167-a0e9-e12d257427aa\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc"
Apr 16 14:05:19.644785 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:19.644755 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8krc\" (UniqueName: \"kubernetes.io/projected/f44765a5-cf2e-4167-a0e9-e12d257427aa-kube-api-access-g8krc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc\" (UID: \"f44765a5-cf2e-4167-a0e9-e12d257427aa\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc"
Apr 16 14:05:19.775612 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:19.775534 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc"
Apr 16 14:05:19.899319 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:19.899292 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc"]
Apr 16 14:05:19.902079 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:05:19.902053 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf44765a5_cf2e_4167_a0e9_e12d257427aa.slice/crio-2ef04bd195c40ae40867cc61ea0c987cfcce342d15380a200587cffb590f5b75 WatchSource:0}: Error finding container 2ef04bd195c40ae40867cc61ea0c987cfcce342d15380a200587cffb590f5b75: Status 404 returned error can't find the container with id 2ef04bd195c40ae40867cc61ea0c987cfcce342d15380a200587cffb590f5b75
Apr 16 14:05:19.904039 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:19.904018 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:05:20.228799 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:20.228762 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc" event={"ID":"f44765a5-cf2e-4167-a0e9-e12d257427aa","Type":"ContainerStarted","Data":"2ef04bd195c40ae40867cc61ea0c987cfcce342d15380a200587cffb590f5b75"}
Apr 16 14:05:26.253004 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:26.252967 2580 generic.go:358] "Generic (PLEG): container finished" podID="f44765a5-cf2e-4167-a0e9-e12d257427aa" containerID="46f2264d7d0b5e98bebdb3c67159407f0968fa4ce64fc990be51a9294b01456b" exitCode=0
Apr 16 14:05:26.253484 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:26.253059 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc" event={"ID":"f44765a5-cf2e-4167-a0e9-e12d257427aa","Type":"ContainerDied","Data":"46f2264d7d0b5e98bebdb3c67159407f0968fa4ce64fc990be51a9294b01456b"}
Apr 16 14:05:28.261743 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:28.261714 2580 generic.go:358] "Generic (PLEG): container finished" podID="f44765a5-cf2e-4167-a0e9-e12d257427aa" containerID="c5bd88491b19aeac0f03974476aa09c5f33583350c388b343e28abf9ffbbaad9" exitCode=0
Apr 16 14:05:28.262139 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:28.261795 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc" event={"ID":"f44765a5-cf2e-4167-a0e9-e12d257427aa","Type":"ContainerDied","Data":"c5bd88491b19aeac0f03974476aa09c5f33583350c388b343e28abf9ffbbaad9"}
Apr 16 14:05:35.290545 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:35.290512 2580 generic.go:358] "Generic (PLEG): container finished" podID="f44765a5-cf2e-4167-a0e9-e12d257427aa" containerID="7e86cfdb53e5f595b426ad5771bb075a8da3bcd39ee03bbcbb9d001cd50c8ebc" exitCode=0
Apr 16 14:05:35.290911 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:35.290554 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc" event={"ID":"f44765a5-cf2e-4167-a0e9-e12d257427aa","Type":"ContainerDied","Data":"7e86cfdb53e5f595b426ad5771bb075a8da3bcd39ee03bbcbb9d001cd50c8ebc"}
Apr 16 14:05:36.425153 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:36.425128 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc"
Apr 16 14:05:36.589028 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:36.588936 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8krc\" (UniqueName: \"kubernetes.io/projected/f44765a5-cf2e-4167-a0e9-e12d257427aa-kube-api-access-g8krc\") pod \"f44765a5-cf2e-4167-a0e9-e12d257427aa\" (UID: \"f44765a5-cf2e-4167-a0e9-e12d257427aa\") "
Apr 16 14:05:36.589028 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:36.588987 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f44765a5-cf2e-4167-a0e9-e12d257427aa-util\") pod \"f44765a5-cf2e-4167-a0e9-e12d257427aa\" (UID: \"f44765a5-cf2e-4167-a0e9-e12d257427aa\") "
Apr 16 14:05:36.589028 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:36.589030 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f44765a5-cf2e-4167-a0e9-e12d257427aa-bundle\") pod \"f44765a5-cf2e-4167-a0e9-e12d257427aa\" (UID: \"f44765a5-cf2e-4167-a0e9-e12d257427aa\") "
Apr 16 14:05:36.589704 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:36.589677 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f44765a5-cf2e-4167-a0e9-e12d257427aa-bundle" (OuterVolumeSpecName: "bundle") pod "f44765a5-cf2e-4167-a0e9-e12d257427aa" (UID: "f44765a5-cf2e-4167-a0e9-e12d257427aa"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:05:36.591365 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:36.591340 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f44765a5-cf2e-4167-a0e9-e12d257427aa-kube-api-access-g8krc" (OuterVolumeSpecName: "kube-api-access-g8krc") pod "f44765a5-cf2e-4167-a0e9-e12d257427aa" (UID: "f44765a5-cf2e-4167-a0e9-e12d257427aa"). InnerVolumeSpecName "kube-api-access-g8krc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:05:36.593609 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:36.593589 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f44765a5-cf2e-4167-a0e9-e12d257427aa-util" (OuterVolumeSpecName: "util") pod "f44765a5-cf2e-4167-a0e9-e12d257427aa" (UID: "f44765a5-cf2e-4167-a0e9-e12d257427aa"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:05:36.689991 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:36.689949 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f44765a5-cf2e-4167-a0e9-e12d257427aa-util\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:05:36.689991 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:36.689979 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f44765a5-cf2e-4167-a0e9-e12d257427aa-bundle\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:05:36.689991 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:36.689989 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g8krc\" (UniqueName: \"kubernetes.io/projected/f44765a5-cf2e-4167-a0e9-e12d257427aa-kube-api-access-g8krc\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:05:37.297357 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:37.297322 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc" event={"ID":"f44765a5-cf2e-4167-a0e9-e12d257427aa","Type":"ContainerDied","Data":"2ef04bd195c40ae40867cc61ea0c987cfcce342d15380a200587cffb590f5b75"}
Apr 16 14:05:37.297357 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:37.297359 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ef04bd195c40ae40867cc61ea0c987cfcce342d15380a200587cffb590f5b75"
Apr 16 14:05:37.297542 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:37.297387 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8vqfc"
Apr 16 14:05:41.448094 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.448067 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-glwtk"]
Apr 16 14:05:41.448508 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.448358 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f44765a5-cf2e-4167-a0e9-e12d257427aa" containerName="extract"
Apr 16 14:05:41.448508 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.448370 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f44765a5-cf2e-4167-a0e9-e12d257427aa" containerName="extract"
Apr 16 14:05:41.448508 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.448388 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f44765a5-cf2e-4167-a0e9-e12d257427aa" containerName="pull"
Apr 16 14:05:41.448508 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.448394 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f44765a5-cf2e-4167-a0e9-e12d257427aa" containerName="pull"
Apr 16 14:05:41.448508 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.448405 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f44765a5-cf2e-4167-a0e9-e12d257427aa" containerName="util"
Apr 16 14:05:41.448508 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.448410 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f44765a5-cf2e-4167-a0e9-e12d257427aa" containerName="util"
Apr 16 14:05:41.448508 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.448453 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="f44765a5-cf2e-4167-a0e9-e12d257427aa" containerName="extract"
Apr 16 14:05:41.454971 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.454950 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-glwtk"
Apr 16 14:05:41.458857 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.458835 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-8xgll\""
Apr 16 14:05:41.459071 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.459051 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 16 14:05:41.459193 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.459098 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 16 14:05:41.459193 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.459099 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 16 14:05:41.474823 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.474797 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-glwtk"]
Apr 16 14:05:41.630487 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.630446 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/f46244f0-2457-4cfb-89d1-c1789282e6d3-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-glwtk\" (UID: \"f46244f0-2457-4cfb-89d1-c1789282e6d3\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-glwtk"
Apr 16 14:05:41.630645 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.630500 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtc6t\" (UniqueName: \"kubernetes.io/projected/f46244f0-2457-4cfb-89d1-c1789282e6d3-kube-api-access-vtc6t\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-glwtk\" (UID: \"f46244f0-2457-4cfb-89d1-c1789282e6d3\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-glwtk"
Apr 16 14:05:41.731825 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.731748 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/f46244f0-2457-4cfb-89d1-c1789282e6d3-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-glwtk\" (UID: \"f46244f0-2457-4cfb-89d1-c1789282e6d3\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-glwtk"
Apr 16 14:05:41.731825 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.731797 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtc6t\" (UniqueName: \"kubernetes.io/projected/f46244f0-2457-4cfb-89d1-c1789282e6d3-kube-api-access-vtc6t\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-glwtk\" (UID: \"f46244f0-2457-4cfb-89d1-c1789282e6d3\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-glwtk"
Apr 16 14:05:41.734288 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.734245 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/f46244f0-2457-4cfb-89d1-c1789282e6d3-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-glwtk\" (UID: \"f46244f0-2457-4cfb-89d1-c1789282e6d3\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-glwtk"
Apr 16 14:05:41.741215 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.741193 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtc6t\" (UniqueName: \"kubernetes.io/projected/f46244f0-2457-4cfb-89d1-c1789282e6d3-kube-api-access-vtc6t\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-glwtk\" (UID: \"f46244f0-2457-4cfb-89d1-c1789282e6d3\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-glwtk"
Apr 16 14:05:41.765631 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.765607 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-glwtk"
Apr 16 14:05:41.894022 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:41.893985 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-glwtk"]
Apr 16 14:05:41.897964 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:05:41.897932 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf46244f0_2457_4cfb_89d1_c1789282e6d3.slice/crio-4b3081cdfbbd2c267df5061b9652d845d416d601a5e0467aec85952e3b7eafea WatchSource:0}: Error finding container 4b3081cdfbbd2c267df5061b9652d845d416d601a5e0467aec85952e3b7eafea: Status 404 returned error can't find the container with id 4b3081cdfbbd2c267df5061b9652d845d416d601a5e0467aec85952e3b7eafea
Apr 16 14:05:42.314293 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:42.314227 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-glwtk" event={"ID":"f46244f0-2457-4cfb-89d1-c1789282e6d3","Type":"ContainerStarted","Data":"4b3081cdfbbd2c267df5061b9652d845d416d601a5e0467aec85952e3b7eafea"}
Apr 16 14:05:46.155288 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.155240 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-qngrl"]
Apr 16 14:05:46.167449 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.167425 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-qngrl"]
Apr 16 14:05:46.167583 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.167533 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-qngrl"
Apr 16 14:05:46.169870 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.169848 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 16 14:05:46.169991 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.169878 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 16 14:05:46.169991 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.169907 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-vrr2h\""
Apr 16 14:05:46.266315 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.266277 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4d1a7707-c069-436c-97d5-2df0e02a50f0-certificates\") pod \"keda-operator-ffbb595cb-qngrl\" (UID: \"4d1a7707-c069-436c-97d5-2df0e02a50f0\") " pod="openshift-keda/keda-operator-ffbb595cb-qngrl"
Apr 16 14:05:46.266528 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.266337 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/4d1a7707-c069-436c-97d5-2df0e02a50f0-cabundle0\") pod \"keda-operator-ffbb595cb-qngrl\" (UID: \"4d1a7707-c069-436c-97d5-2df0e02a50f0\") " pod="openshift-keda/keda-operator-ffbb595cb-qngrl"
Apr 16 14:05:46.266528 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.266376 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpf5n\" (UniqueName: \"kubernetes.io/projected/4d1a7707-c069-436c-97d5-2df0e02a50f0-kube-api-access-qpf5n\") pod \"keda-operator-ffbb595cb-qngrl\" (UID: \"4d1a7707-c069-436c-97d5-2df0e02a50f0\") " pod="openshift-keda/keda-operator-ffbb595cb-qngrl"
Apr 16 14:05:46.329175 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.329139 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-glwtk" event={"ID":"f46244f0-2457-4cfb-89d1-c1789282e6d3","Type":"ContainerStarted","Data":"627e5182c723a2f78e16958a058383dd990d5d8a77bfacf72166b07ade131efe"}
Apr 16 14:05:46.329380 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.329369 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-glwtk"
Apr 16 14:05:46.348214 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.348148 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-glwtk" podStartSLOduration=1.6641904589999998 podStartE2EDuration="5.348131038s" podCreationTimestamp="2026-04-16 14:05:41 +0000 UTC" firstStartedPulling="2026-04-16 14:05:41.900028287 +0000 UTC m=+365.336193996" lastFinishedPulling="2026-04-16 14:05:45.58396886 +0000 UTC m=+369.020134575" observedRunningTime="2026-04-16 14:05:46.347586862 +0000 UTC m=+369.783752592" watchObservedRunningTime="2026-04-16 14:05:46.348131038 +0000 UTC m=+369.784296767"
Apr 16 14:05:46.367410 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.367371 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/4d1a7707-c069-436c-97d5-2df0e02a50f0-cabundle0\") pod \"keda-operator-ffbb595cb-qngrl\" (UID: \"4d1a7707-c069-436c-97d5-2df0e02a50f0\") " pod="openshift-keda/keda-operator-ffbb595cb-qngrl"
Apr 16 14:05:46.367621 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.367487 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpf5n\" (UniqueName: \"kubernetes.io/projected/4d1a7707-c069-436c-97d5-2df0e02a50f0-kube-api-access-qpf5n\") pod \"keda-operator-ffbb595cb-qngrl\" (UID: \"4d1a7707-c069-436c-97d5-2df0e02a50f0\") " pod="openshift-keda/keda-operator-ffbb595cb-qngrl"
Apr 16 14:05:46.367621 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.367539 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4d1a7707-c069-436c-97d5-2df0e02a50f0-certificates\") pod \"keda-operator-ffbb595cb-qngrl\" (UID: \"4d1a7707-c069-436c-97d5-2df0e02a50f0\") " pod="openshift-keda/keda-operator-ffbb595cb-qngrl"
Apr 16 14:05:46.367744 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:46.367649 2580 secret.go:281] references non-existent secret key: ca.crt
Apr 16 14:05:46.367744 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:46.367664 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 14:05:46.367744 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:46.367675 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-qngrl: references non-existent secret key: ca.crt
Apr 16 14:05:46.367744 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:46.367735 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d1a7707-c069-436c-97d5-2df0e02a50f0-certificates podName:4d1a7707-c069-436c-97d5-2df0e02a50f0 nodeName:}" failed. No retries permitted until 2026-04-16 14:05:46.867716697 +0000 UTC m=+370.303882408 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/4d1a7707-c069-436c-97d5-2df0e02a50f0-certificates") pod "keda-operator-ffbb595cb-qngrl" (UID: "4d1a7707-c069-436c-97d5-2df0e02a50f0") : references non-existent secret key: ca.crt
Apr 16 14:05:46.368748 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.368716 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/4d1a7707-c069-436c-97d5-2df0e02a50f0-cabundle0\") pod \"keda-operator-ffbb595cb-qngrl\" (UID: \"4d1a7707-c069-436c-97d5-2df0e02a50f0\") " pod="openshift-keda/keda-operator-ffbb595cb-qngrl"
Apr 16 14:05:46.380431 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.380401 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpf5n\" (UniqueName: \"kubernetes.io/projected/4d1a7707-c069-436c-97d5-2df0e02a50f0-kube-api-access-qpf5n\") pod \"keda-operator-ffbb595cb-qngrl\" (UID: \"4d1a7707-c069-436c-97d5-2df0e02a50f0\") " pod="openshift-keda/keda-operator-ffbb595cb-qngrl"
Apr 16 14:05:46.418311 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.418219 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz"]
Apr 16 14:05:46.433048 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.433016 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz"]
Apr 16 14:05:46.433214 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.433059 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz"
Apr 16 14:05:46.435538 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.435512 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 16 14:05:46.468164 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.468127 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptscz\" (UniqueName: \"kubernetes.io/projected/8710db07-40ed-457e-98e3-9e3c70908992-kube-api-access-ptscz\") pod \"keda-metrics-apiserver-7c9f485588-9jfzz\" (UID: \"8710db07-40ed-457e-98e3-9e3c70908992\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz"
Apr 16 14:05:46.468401 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.468215 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/8710db07-40ed-457e-98e3-9e3c70908992-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-9jfzz\" (UID: \"8710db07-40ed-457e-98e3-9e3c70908992\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz"
Apr 16 14:05:46.468401 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.468244 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8710db07-40ed-457e-98e3-9e3c70908992-certificates\") pod \"keda-metrics-apiserver-7c9f485588-9jfzz\" (UID: \"8710db07-40ed-457e-98e3-9e3c70908992\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz"
Apr 16 14:05:46.568846 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.568815 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptscz\" (UniqueName: \"kubernetes.io/projected/8710db07-40ed-457e-98e3-9e3c70908992-kube-api-access-ptscz\") pod \"keda-metrics-apiserver-7c9f485588-9jfzz\" (UID: \"8710db07-40ed-457e-98e3-9e3c70908992\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz"
Apr 16 14:05:46.569104 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.568880 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/8710db07-40ed-457e-98e3-9e3c70908992-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-9jfzz\" (UID: \"8710db07-40ed-457e-98e3-9e3c70908992\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz"
Apr 16 14:05:46.569104 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.568906 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8710db07-40ed-457e-98e3-9e3c70908992-certificates\") pod \"keda-metrics-apiserver-7c9f485588-9jfzz\" (UID: \"8710db07-40ed-457e-98e3-9e3c70908992\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz"
Apr 16 14:05:46.569104 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:46.569016 2580 secret.go:281] references non-existent secret key: tls.crt
Apr 16 14:05:46.569104 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:46.569032 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 14:05:46.569104 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:46.569049 2580 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found
Apr 16 14:05:46.569104 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:46.569070 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 16 14:05:46.569372 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:46.569128 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8710db07-40ed-457e-98e3-9e3c70908992-certificates podName:8710db07-40ed-457e-98e3-9e3c70908992 nodeName:}" failed. No retries permitted until 2026-04-16 14:05:47.069109689 +0000 UTC m=+370.505275396 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8710db07-40ed-457e-98e3-9e3c70908992-certificates") pod "keda-metrics-apiserver-7c9f485588-9jfzz" (UID: "8710db07-40ed-457e-98e3-9e3c70908992") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 16 14:05:46.569372 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.569314 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/8710db07-40ed-457e-98e3-9e3c70908992-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-9jfzz\" (UID: \"8710db07-40ed-457e-98e3-9e3c70908992\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz"
Apr 16 14:05:46.587174 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.587152 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptscz\" (UniqueName: \"kubernetes.io/projected/8710db07-40ed-457e-98e3-9e3c70908992-kube-api-access-ptscz\") pod \"keda-metrics-apiserver-7c9f485588-9jfzz\" (UID: \"8710db07-40ed-457e-98e3-9e3c70908992\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz"
Apr 16 14:05:46.787737 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.787659 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-tmcgw"]
Apr 16 14:05:46.810576 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.810544 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-tmcgw"]
Apr 16 14:05:46.810720 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.810662 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-tmcgw"
Apr 16 14:05:46.812987 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.812963 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 16 14:05:46.870765 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.870731 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/34e4bcec-7353-4a3b-829a-dd0c8b469038-certificates\") pod \"keda-admission-cf49989db-tmcgw\" (UID: \"34e4bcec-7353-4a3b-829a-dd0c8b469038\") " pod="openshift-keda/keda-admission-cf49989db-tmcgw"
Apr 16 14:05:46.870924 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.870816 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4d1a7707-c069-436c-97d5-2df0e02a50f0-certificates\") pod \"keda-operator-ffbb595cb-qngrl\" (UID: \"4d1a7707-c069-436c-97d5-2df0e02a50f0\") " pod="openshift-keda/keda-operator-ffbb595cb-qngrl"
Apr 16 14:05:46.870924 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.870847 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpg28\" (UniqueName: \"kubernetes.io/projected/34e4bcec-7353-4a3b-829a-dd0c8b469038-kube-api-access-fpg28\") pod \"keda-admission-cf49989db-tmcgw\" (UID: \"34e4bcec-7353-4a3b-829a-dd0c8b469038\") " pod="openshift-keda/keda-admission-cf49989db-tmcgw"
Apr 16 14:05:46.871038 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:46.870959 2580 secret.go:281] references non-existent secret key: ca.crt
Apr 16 14:05:46.871038 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:46.870979 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 14:05:46.871038 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:46.870988 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-qngrl: references non-existent secret key: ca.crt
Apr 16 14:05:46.871129 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:46.871044 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d1a7707-c069-436c-97d5-2df0e02a50f0-certificates podName:4d1a7707-c069-436c-97d5-2df0e02a50f0 nodeName:}" failed. No retries permitted until 2026-04-16 14:05:47.871026064 +0000 UTC m=+371.307191772 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/4d1a7707-c069-436c-97d5-2df0e02a50f0-certificates") pod "keda-operator-ffbb595cb-qngrl" (UID: "4d1a7707-c069-436c-97d5-2df0e02a50f0") : references non-existent secret key: ca.crt
Apr 16 14:05:46.971522 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.971490 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/34e4bcec-7353-4a3b-829a-dd0c8b469038-certificates\") pod \"keda-admission-cf49989db-tmcgw\" (UID: \"34e4bcec-7353-4a3b-829a-dd0c8b469038\") " pod="openshift-keda/keda-admission-cf49989db-tmcgw"
Apr 16 14:05:46.971712 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.971579 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpg28\" (UniqueName: \"kubernetes.io/projected/34e4bcec-7353-4a3b-829a-dd0c8b469038-kube-api-access-fpg28\") pod \"keda-admission-cf49989db-tmcgw\" (UID: \"34e4bcec-7353-4a3b-829a-dd0c8b469038\") " pod="openshift-keda/keda-admission-cf49989db-tmcgw"
Apr 16 14:05:46.974243 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.974212 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/34e4bcec-7353-4a3b-829a-dd0c8b469038-certificates\") pod \"keda-admission-cf49989db-tmcgw\" (UID: \"34e4bcec-7353-4a3b-829a-dd0c8b469038\") " pod="openshift-keda/keda-admission-cf49989db-tmcgw"
Apr 16 14:05:46.979557 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:46.979525 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpg28\" (UniqueName: \"kubernetes.io/projected/34e4bcec-7353-4a3b-829a-dd0c8b469038-kube-api-access-fpg28\") pod \"keda-admission-cf49989db-tmcgw\" (UID: \"34e4bcec-7353-4a3b-829a-dd0c8b469038\") " pod="openshift-keda/keda-admission-cf49989db-tmcgw"
Apr 16 14:05:47.072204 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:47.072112 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8710db07-40ed-457e-98e3-9e3c70908992-certificates\") pod \"keda-metrics-apiserver-7c9f485588-9jfzz\" (UID: \"8710db07-40ed-457e-98e3-9e3c70908992\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz"
Apr 16 14:05:47.072418 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:47.072295 2580 secret.go:281] references non-existent secret key: tls.crt
Apr 16 14:05:47.072418 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:47.072318 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 14:05:47.072418 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:47.072341 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz: references non-existent secret key: tls.crt
Apr 16 14:05:47.072418 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:47.072406 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8710db07-40ed-457e-98e3-9e3c70908992-certificates podName:8710db07-40ed-457e-98e3-9e3c70908992 nodeName:}" failed. No retries permitted until 2026-04-16 14:05:48.072388227 +0000 UTC m=+371.508553933 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8710db07-40ed-457e-98e3-9e3c70908992-certificates") pod "keda-metrics-apiserver-7c9f485588-9jfzz" (UID: "8710db07-40ed-457e-98e3-9e3c70908992") : references non-existent secret key: tls.crt
Apr 16 14:05:47.120425 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:47.120388 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-tmcgw"
Apr 16 14:05:47.247351 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:47.247318 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-tmcgw"]
Apr 16 14:05:47.251374 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:05:47.251343 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34e4bcec_7353_4a3b_829a_dd0c8b469038.slice/crio-5cca2a1311fc7a4d5dfb4df9830c9630d8274690da1da73c12f62d127ccb8bbb WatchSource:0}: Error finding container 5cca2a1311fc7a4d5dfb4df9830c9630d8274690da1da73c12f62d127ccb8bbb: Status 404 returned error can't find the container with id 5cca2a1311fc7a4d5dfb4df9830c9630d8274690da1da73c12f62d127ccb8bbb
Apr 16 14:05:47.333079 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:47.333079 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-tmcgw" event={"ID":"34e4bcec-7353-4a3b-829a-dd0c8b469038","Type":"ContainerStarted","Data":"5cca2a1311fc7a4d5dfb4df9830c9630d8274690da1da73c12f62d127ccb8bbb"}
Apr 16 14:05:47.879835 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:47.879804 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4d1a7707-c069-436c-97d5-2df0e02a50f0-certificates\") pod \"keda-operator-ffbb595cb-qngrl\" (UID: \"4d1a7707-c069-436c-97d5-2df0e02a50f0\") " pod="openshift-keda/keda-operator-ffbb595cb-qngrl"
Apr 16
14:05:47.879995 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:47.879951 2580 secret.go:281] references non-existent secret key: ca.crt Apr 16 14:05:47.879995 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:47.879965 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 14:05:47.879995 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:47.879974 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-qngrl: references non-existent secret key: ca.crt Apr 16 14:05:47.880102 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:47.880025 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d1a7707-c069-436c-97d5-2df0e02a50f0-certificates podName:4d1a7707-c069-436c-97d5-2df0e02a50f0 nodeName:}" failed. No retries permitted until 2026-04-16 14:05:49.880008369 +0000 UTC m=+373.316174076 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/4d1a7707-c069-436c-97d5-2df0e02a50f0-certificates") pod "keda-operator-ffbb595cb-qngrl" (UID: "4d1a7707-c069-436c-97d5-2df0e02a50f0") : references non-existent secret key: ca.crt Apr 16 14:05:48.082218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:48.082178 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8710db07-40ed-457e-98e3-9e3c70908992-certificates\") pod \"keda-metrics-apiserver-7c9f485588-9jfzz\" (UID: \"8710db07-40ed-457e-98e3-9e3c70908992\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz" Apr 16 14:05:48.082411 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:48.082335 2580 secret.go:281] references non-existent secret key: tls.crt Apr 16 14:05:48.082411 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:48.082352 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: 
references non-existent secret key: tls.crt Apr 16 14:05:48.082411 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:48.082372 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz: references non-existent secret key: tls.crt Apr 16 14:05:48.082541 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:05:48.082426 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8710db07-40ed-457e-98e3-9e3c70908992-certificates podName:8710db07-40ed-457e-98e3-9e3c70908992 nodeName:}" failed. No retries permitted until 2026-04-16 14:05:50.082412919 +0000 UTC m=+373.518578625 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8710db07-40ed-457e-98e3-9e3c70908992-certificates") pod "keda-metrics-apiserver-7c9f485588-9jfzz" (UID: "8710db07-40ed-457e-98e3-9e3c70908992") : references non-existent secret key: tls.crt Apr 16 14:05:49.897408 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:49.897372 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4d1a7707-c069-436c-97d5-2df0e02a50f0-certificates\") pod \"keda-operator-ffbb595cb-qngrl\" (UID: \"4d1a7707-c069-436c-97d5-2df0e02a50f0\") " pod="openshift-keda/keda-operator-ffbb595cb-qngrl" Apr 16 14:05:49.900012 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:49.899988 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4d1a7707-c069-436c-97d5-2df0e02a50f0-certificates\") pod \"keda-operator-ffbb595cb-qngrl\" (UID: \"4d1a7707-c069-436c-97d5-2df0e02a50f0\") " pod="openshift-keda/keda-operator-ffbb595cb-qngrl" Apr 16 14:05:50.078580 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:50.078477 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-qngrl" Apr 16 14:05:50.099647 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:50.099617 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8710db07-40ed-457e-98e3-9e3c70908992-certificates\") pod \"keda-metrics-apiserver-7c9f485588-9jfzz\" (UID: \"8710db07-40ed-457e-98e3-9e3c70908992\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz" Apr 16 14:05:50.102386 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:50.102342 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8710db07-40ed-457e-98e3-9e3c70908992-certificates\") pod \"keda-metrics-apiserver-7c9f485588-9jfzz\" (UID: \"8710db07-40ed-457e-98e3-9e3c70908992\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz" Apr 16 14:05:50.197204 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:50.197132 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-qngrl"] Apr 16 14:05:50.199622 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:05:50.199588 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d1a7707_c069_436c_97d5_2df0e02a50f0.slice/crio-e71698aa96179b4dec9a06bb4233d4f28ec7f1f264aa0e1b2a93733bc1f721ec WatchSource:0}: Error finding container e71698aa96179b4dec9a06bb4233d4f28ec7f1f264aa0e1b2a93733bc1f721ec: Status 404 returned error can't find the container with id e71698aa96179b4dec9a06bb4233d4f28ec7f1f264aa0e1b2a93733bc1f721ec Apr 16 14:05:50.343770 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:50.343673 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz" Apr 16 14:05:50.345229 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:50.345201 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-tmcgw" event={"ID":"34e4bcec-7353-4a3b-829a-dd0c8b469038","Type":"ContainerStarted","Data":"366ba1e97daabb6a558917f95a26954cc65660eb2c20d1256128937f3f1f7d74"} Apr 16 14:05:50.345349 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:50.345294 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-tmcgw" Apr 16 14:05:50.346314 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:50.346292 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-qngrl" event={"ID":"4d1a7707-c069-436c-97d5-2df0e02a50f0","Type":"ContainerStarted","Data":"e71698aa96179b4dec9a06bb4233d4f28ec7f1f264aa0e1b2a93733bc1f721ec"} Apr 16 14:05:50.362042 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:50.361996 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-tmcgw" podStartSLOduration=1.827912434 podStartE2EDuration="4.361980606s" podCreationTimestamp="2026-04-16 14:05:46 +0000 UTC" firstStartedPulling="2026-04-16 14:05:47.252740837 +0000 UTC m=+370.688906542" lastFinishedPulling="2026-04-16 14:05:49.786809005 +0000 UTC m=+373.222974714" observedRunningTime="2026-04-16 14:05:50.360224791 +0000 UTC m=+373.796390519" watchObservedRunningTime="2026-04-16 14:05:50.361980606 +0000 UTC m=+373.798146382" Apr 16 14:05:50.464805 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:50.464774 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz"] Apr 16 14:05:50.467666 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:05:50.467635 2580 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8710db07_40ed_457e_98e3_9e3c70908992.slice/crio-3e6cec36a9a5e386487d0527aed9f22e0aaacffb691258f966e0431b6c28ea9e WatchSource:0}: Error finding container 3e6cec36a9a5e386487d0527aed9f22e0aaacffb691258f966e0431b6c28ea9e: Status 404 returned error can't find the container with id 3e6cec36a9a5e386487d0527aed9f22e0aaacffb691258f966e0431b6c28ea9e Apr 16 14:05:51.352656 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:51.352611 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz" event={"ID":"8710db07-40ed-457e-98e3-9e3c70908992","Type":"ContainerStarted","Data":"3e6cec36a9a5e386487d0527aed9f22e0aaacffb691258f966e0431b6c28ea9e"} Apr 16 14:05:54.364406 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:54.364372 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-qngrl" event={"ID":"4d1a7707-c069-436c-97d5-2df0e02a50f0","Type":"ContainerStarted","Data":"1f134a9d8ade352b08d993636d44c052f7ad2bfe5f6baab01926a421ff251813"} Apr 16 14:05:54.364874 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:54.364440 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-qngrl" Apr 16 14:05:54.365743 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:54.365718 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz" event={"ID":"8710db07-40ed-457e-98e3-9e3c70908992","Type":"ContainerStarted","Data":"c8c846f5f1e9ed7a4bf6b94c779159cb6208878b382a261266a789f06395f4e0"} Apr 16 14:05:54.365890 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:54.365874 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz" Apr 16 14:05:54.384207 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:54.384160 2580 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-qngrl" podStartSLOduration=4.761909296 podStartE2EDuration="8.384148645s" podCreationTimestamp="2026-04-16 14:05:46 +0000 UTC" firstStartedPulling="2026-04-16 14:05:50.200980121 +0000 UTC m=+373.637145830" lastFinishedPulling="2026-04-16 14:05:53.823219473 +0000 UTC m=+377.259385179" observedRunningTime="2026-04-16 14:05:54.383381949 +0000 UTC m=+377.819547676" watchObservedRunningTime="2026-04-16 14:05:54.384148645 +0000 UTC m=+377.820314373" Apr 16 14:05:54.412037 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:05:54.411989 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz" podStartSLOduration=5.0613145 podStartE2EDuration="8.411975283s" podCreationTimestamp="2026-04-16 14:05:46 +0000 UTC" firstStartedPulling="2026-04-16 14:05:50.468979991 +0000 UTC m=+373.905145696" lastFinishedPulling="2026-04-16 14:05:53.81964077 +0000 UTC m=+377.255806479" observedRunningTime="2026-04-16 14:05:54.410236862 +0000 UTC m=+377.846402589" watchObservedRunningTime="2026-04-16 14:05:54.411975283 +0000 UTC m=+377.848141011" Apr 16 14:06:05.374000 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:05.373969 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-9jfzz" Apr 16 14:06:07.335479 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:07.335452 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-glwtk" Apr 16 14:06:11.355091 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:11.355062 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-tmcgw" Apr 16 14:06:15.372500 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:15.372470 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-keda/keda-operator-ffbb595cb-qngrl" Apr 16 14:06:39.567610 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:39.567573 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd"] Apr 16 14:06:39.575982 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:39.575961 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd" Apr 16 14:06:39.579499 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:39.579120 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-tl9st\"" Apr 16 14:06:39.579499 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:39.579153 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 14:06:39.579843 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:39.579823 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 14:06:39.581141 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:39.581117 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd"] Apr 16 14:06:39.707702 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:39.707665 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7061096b-a87b-4ee0-bc9d-e363d9961d0a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd\" (UID: \"7061096b-a87b-4ee0-bc9d-e363d9961d0a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd" Apr 16 14:06:39.707702 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:39.707704 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7061096b-a87b-4ee0-bc9d-e363d9961d0a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd\" (UID: \"7061096b-a87b-4ee0-bc9d-e363d9961d0a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd" Apr 16 14:06:39.707911 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:39.707728 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh9nv\" (UniqueName: \"kubernetes.io/projected/7061096b-a87b-4ee0-bc9d-e363d9961d0a-kube-api-access-dh9nv\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd\" (UID: \"7061096b-a87b-4ee0-bc9d-e363d9961d0a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd" Apr 16 14:06:39.809001 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:39.808952 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7061096b-a87b-4ee0-bc9d-e363d9961d0a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd\" (UID: \"7061096b-a87b-4ee0-bc9d-e363d9961d0a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd" Apr 16 14:06:39.809001 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:39.809007 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7061096b-a87b-4ee0-bc9d-e363d9961d0a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd\" (UID: \"7061096b-a87b-4ee0-bc9d-e363d9961d0a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd" Apr 16 14:06:39.809189 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:39.809048 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dh9nv\" (UniqueName: \"kubernetes.io/projected/7061096b-a87b-4ee0-bc9d-e363d9961d0a-kube-api-access-dh9nv\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd\" (UID: \"7061096b-a87b-4ee0-bc9d-e363d9961d0a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd" Apr 16 14:06:39.809474 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:39.809455 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7061096b-a87b-4ee0-bc9d-e363d9961d0a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd\" (UID: \"7061096b-a87b-4ee0-bc9d-e363d9961d0a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd" Apr 16 14:06:39.809513 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:39.809461 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7061096b-a87b-4ee0-bc9d-e363d9961d0a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd\" (UID: \"7061096b-a87b-4ee0-bc9d-e363d9961d0a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd" Apr 16 14:06:39.817939 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:39.817869 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh9nv\" (UniqueName: \"kubernetes.io/projected/7061096b-a87b-4ee0-bc9d-e363d9961d0a-kube-api-access-dh9nv\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd\" (UID: \"7061096b-a87b-4ee0-bc9d-e363d9961d0a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd" Apr 16 14:06:39.886873 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:39.886835 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd" Apr 16 14:06:40.013140 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:40.013115 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd"] Apr 16 14:06:40.015605 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:06:40.015580 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7061096b_a87b_4ee0_bc9d_e363d9961d0a.slice/crio-11020bc2beec4329afa82a437f057724ac81cc04b52b96dd2c78b6f58a4e459f WatchSource:0}: Error finding container 11020bc2beec4329afa82a437f057724ac81cc04b52b96dd2c78b6f58a4e459f: Status 404 returned error can't find the container with id 11020bc2beec4329afa82a437f057724ac81cc04b52b96dd2c78b6f58a4e459f Apr 16 14:06:40.515521 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:40.515482 2580 generic.go:358] "Generic (PLEG): container finished" podID="7061096b-a87b-4ee0-bc9d-e363d9961d0a" containerID="2471aee7b7dd515348e5b287d188a92eb4007a4a24a561cda019758dd2e9a0e6" exitCode=0 Apr 16 14:06:40.515692 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:40.515530 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd" event={"ID":"7061096b-a87b-4ee0-bc9d-e363d9961d0a","Type":"ContainerDied","Data":"2471aee7b7dd515348e5b287d188a92eb4007a4a24a561cda019758dd2e9a0e6"} Apr 16 14:06:40.515692 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:40.515552 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd" event={"ID":"7061096b-a87b-4ee0-bc9d-e363d9961d0a","Type":"ContainerStarted","Data":"11020bc2beec4329afa82a437f057724ac81cc04b52b96dd2c78b6f58a4e459f"} Apr 16 14:06:41.520641 ip-10-0-129-3 kubenswrapper[2580]: I0416 
14:06:41.520610 2580 generic.go:358] "Generic (PLEG): container finished" podID="7061096b-a87b-4ee0-bc9d-e363d9961d0a" containerID="a50c21516be57977cfa2f0b45ba91675244745a632e70d03a872422b4708661e" exitCode=0 Apr 16 14:06:41.521026 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:41.520705 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd" event={"ID":"7061096b-a87b-4ee0-bc9d-e363d9961d0a","Type":"ContainerDied","Data":"a50c21516be57977cfa2f0b45ba91675244745a632e70d03a872422b4708661e"} Apr 16 14:06:42.525646 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:42.525617 2580 generic.go:358] "Generic (PLEG): container finished" podID="7061096b-a87b-4ee0-bc9d-e363d9961d0a" containerID="fe4b5176c9bc6e42dffd3d94613e004a013df6de97152f8811f4704274e7870e" exitCode=0 Apr 16 14:06:42.526041 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:42.525679 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd" event={"ID":"7061096b-a87b-4ee0-bc9d-e363d9961d0a","Type":"ContainerDied","Data":"fe4b5176c9bc6e42dffd3d94613e004a013df6de97152f8811f4704274e7870e"} Apr 16 14:06:43.654050 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:43.654026 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd" Apr 16 14:06:43.841022 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:43.840927 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh9nv\" (UniqueName: \"kubernetes.io/projected/7061096b-a87b-4ee0-bc9d-e363d9961d0a-kube-api-access-dh9nv\") pod \"7061096b-a87b-4ee0-bc9d-e363d9961d0a\" (UID: \"7061096b-a87b-4ee0-bc9d-e363d9961d0a\") " Apr 16 14:06:43.841022 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:43.840983 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7061096b-a87b-4ee0-bc9d-e363d9961d0a-util\") pod \"7061096b-a87b-4ee0-bc9d-e363d9961d0a\" (UID: \"7061096b-a87b-4ee0-bc9d-e363d9961d0a\") " Apr 16 14:06:43.841022 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:43.841021 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7061096b-a87b-4ee0-bc9d-e363d9961d0a-bundle\") pod \"7061096b-a87b-4ee0-bc9d-e363d9961d0a\" (UID: \"7061096b-a87b-4ee0-bc9d-e363d9961d0a\") " Apr 16 14:06:43.841720 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:43.841693 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7061096b-a87b-4ee0-bc9d-e363d9961d0a-bundle" (OuterVolumeSpecName: "bundle") pod "7061096b-a87b-4ee0-bc9d-e363d9961d0a" (UID: "7061096b-a87b-4ee0-bc9d-e363d9961d0a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:06:43.843309 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:43.843262 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7061096b-a87b-4ee0-bc9d-e363d9961d0a-kube-api-access-dh9nv" (OuterVolumeSpecName: "kube-api-access-dh9nv") pod "7061096b-a87b-4ee0-bc9d-e363d9961d0a" (UID: "7061096b-a87b-4ee0-bc9d-e363d9961d0a"). InnerVolumeSpecName "kube-api-access-dh9nv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:06:43.847352 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:43.847324 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7061096b-a87b-4ee0-bc9d-e363d9961d0a-util" (OuterVolumeSpecName: "util") pod "7061096b-a87b-4ee0-bc9d-e363d9961d0a" (UID: "7061096b-a87b-4ee0-bc9d-e363d9961d0a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:06:43.941877 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:43.941821 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7061096b-a87b-4ee0-bc9d-e363d9961d0a-bundle\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:06:43.941877 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:43.941870 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dh9nv\" (UniqueName: \"kubernetes.io/projected/7061096b-a87b-4ee0-bc9d-e363d9961d0a-kube-api-access-dh9nv\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:06:43.941877 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:43.941882 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7061096b-a87b-4ee0-bc9d-e363d9961d0a-util\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:06:44.533838 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:44.533805 2580 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd" Apr 16 14:06:44.534017 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:44.533805 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n4ddd" event={"ID":"7061096b-a87b-4ee0-bc9d-e363d9961d0a","Type":"ContainerDied","Data":"11020bc2beec4329afa82a437f057724ac81cc04b52b96dd2c78b6f58a4e459f"} Apr 16 14:06:44.534017 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:44.533918 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11020bc2beec4329afa82a437f057724ac81cc04b52b96dd2c78b6f58a4e459f" Apr 16 14:06:51.769293 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:51.769241 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-fbfjw"] Apr 16 14:06:51.769772 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:51.769562 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7061096b-a87b-4ee0-bc9d-e363d9961d0a" containerName="util" Apr 16 14:06:51.769772 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:51.769576 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7061096b-a87b-4ee0-bc9d-e363d9961d0a" containerName="util" Apr 16 14:06:51.769772 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:51.769593 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7061096b-a87b-4ee0-bc9d-e363d9961d0a" containerName="extract" Apr 16 14:06:51.769772 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:51.769599 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7061096b-a87b-4ee0-bc9d-e363d9961d0a" containerName="extract" Apr 16 14:06:51.769772 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:51.769617 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="7061096b-a87b-4ee0-bc9d-e363d9961d0a" containerName="pull"
Apr 16 14:06:51.769772 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:51.769622 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7061096b-a87b-4ee0-bc9d-e363d9961d0a" containerName="pull"
Apr 16 14:06:51.769772 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:51.769671 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="7061096b-a87b-4ee0-bc9d-e363d9961d0a" containerName="extract"
Apr 16 14:06:51.773485 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:51.773467 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-fbfjw"
Apr 16 14:06:51.776129 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:51.776106 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-v5wr7\""
Apr 16 14:06:51.776262 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:51.776144 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 16 14:06:51.776262 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:51.776167 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:06:51.789204 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:51.789176 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-fbfjw"]
Apr 16 14:06:51.802367 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:51.802322 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/43e69d5f-afac-4808-9400-fb3db6654f6b-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-fbfjw\" (UID: \"43e69d5f-afac-4808-9400-fb3db6654f6b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-fbfjw"
Apr 16 14:06:51.802518 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:51.802380 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qm29\" (UniqueName: \"kubernetes.io/projected/43e69d5f-afac-4808-9400-fb3db6654f6b-kube-api-access-9qm29\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-fbfjw\" (UID: \"43e69d5f-afac-4808-9400-fb3db6654f6b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-fbfjw"
Apr 16 14:06:51.903004 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:51.902906 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qm29\" (UniqueName: \"kubernetes.io/projected/43e69d5f-afac-4808-9400-fb3db6654f6b-kube-api-access-9qm29\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-fbfjw\" (UID: \"43e69d5f-afac-4808-9400-fb3db6654f6b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-fbfjw"
Apr 16 14:06:51.903196 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:51.903134 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/43e69d5f-afac-4808-9400-fb3db6654f6b-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-fbfjw\" (UID: \"43e69d5f-afac-4808-9400-fb3db6654f6b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-fbfjw"
Apr 16 14:06:51.903527 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:51.903507 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/43e69d5f-afac-4808-9400-fb3db6654f6b-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-fbfjw\" (UID: \"43e69d5f-afac-4808-9400-fb3db6654f6b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-fbfjw"
Apr 16 14:06:51.912684 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:51.912664 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qm29\" (UniqueName: \"kubernetes.io/projected/43e69d5f-afac-4808-9400-fb3db6654f6b-kube-api-access-9qm29\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-fbfjw\" (UID: \"43e69d5f-afac-4808-9400-fb3db6654f6b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-fbfjw"
Apr 16 14:06:52.082972 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:52.082858 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-fbfjw"
Apr 16 14:06:52.215204 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:52.215181 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-fbfjw"]
Apr 16 14:06:52.217359 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:06:52.217332 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43e69d5f_afac_4808_9400_fb3db6654f6b.slice/crio-0361e92902bf7612584791dad4bf5c65043a72ccc9862411d4a7b88a7666ef86 WatchSource:0}: Error finding container 0361e92902bf7612584791dad4bf5c65043a72ccc9862411d4a7b88a7666ef86: Status 404 returned error can't find the container with id 0361e92902bf7612584791dad4bf5c65043a72ccc9862411d4a7b88a7666ef86
Apr 16 14:06:52.564848 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:52.564810 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-fbfjw" event={"ID":"43e69d5f-afac-4808-9400-fb3db6654f6b","Type":"ContainerStarted","Data":"0361e92902bf7612584791dad4bf5c65043a72ccc9862411d4a7b88a7666ef86"}
Apr 16 14:06:56.583419 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:56.583373 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-fbfjw" event={"ID":"43e69d5f-afac-4808-9400-fb3db6654f6b","Type":"ContainerStarted","Data":"54ef87f5a5b8c2363c39e549ff99d690d979f73999c51c6e5abcdd00b5f025ff"}
Apr 16 14:06:56.607303 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:06:56.607226 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-fbfjw" podStartSLOduration=2.305490799 podStartE2EDuration="5.607210346s" podCreationTimestamp="2026-04-16 14:06:51 +0000 UTC" firstStartedPulling="2026-04-16 14:06:52.220087065 +0000 UTC m=+435.656252774" lastFinishedPulling="2026-04-16 14:06:55.521806601 +0000 UTC m=+438.957972321" observedRunningTime="2026-04-16 14:06:56.605309229 +0000 UTC m=+440.041474956" watchObservedRunningTime="2026-04-16 14:06:56.607210346 +0000 UTC m=+440.043376074"
Apr 16 14:07:01.881720 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:01.881679 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp"]
Apr 16 14:07:01.885295 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:01.885256 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp"
Apr 16 14:07:01.887823 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:01.887803 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 14:07:01.888731 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:01.888715 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 14:07:01.888787 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:01.888743 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-tl9st\""
Apr 16 14:07:01.894076 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:01.894050 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp"]
Apr 16 14:07:01.975102 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:01.975062 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6hw5\" (UniqueName: \"kubernetes.io/projected/a24421dc-4e8a-4334-8aca-86d315dfde6d-kube-api-access-s6hw5\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp\" (UID: \"a24421dc-4e8a-4334-8aca-86d315dfde6d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp"
Apr 16 14:07:01.975378 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:01.975117 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a24421dc-4e8a-4334-8aca-86d315dfde6d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp\" (UID: \"a24421dc-4e8a-4334-8aca-86d315dfde6d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp"
Apr 16 14:07:01.975378 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:01.975138 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a24421dc-4e8a-4334-8aca-86d315dfde6d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp\" (UID: \"a24421dc-4e8a-4334-8aca-86d315dfde6d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp"
Apr 16 14:07:02.075730 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:02.075692 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6hw5\" (UniqueName: \"kubernetes.io/projected/a24421dc-4e8a-4334-8aca-86d315dfde6d-kube-api-access-s6hw5\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp\" (UID: \"a24421dc-4e8a-4334-8aca-86d315dfde6d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp"
Apr 16 14:07:02.075929 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:02.075749 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a24421dc-4e8a-4334-8aca-86d315dfde6d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp\" (UID: \"a24421dc-4e8a-4334-8aca-86d315dfde6d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp"
Apr 16 14:07:02.075929 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:02.075777 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a24421dc-4e8a-4334-8aca-86d315dfde6d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp\" (UID: \"a24421dc-4e8a-4334-8aca-86d315dfde6d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp"
Apr 16 14:07:02.076184 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:02.076163 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a24421dc-4e8a-4334-8aca-86d315dfde6d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp\" (UID: \"a24421dc-4e8a-4334-8aca-86d315dfde6d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp"
Apr 16 14:07:02.076256 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:02.076182 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a24421dc-4e8a-4334-8aca-86d315dfde6d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp\" (UID: \"a24421dc-4e8a-4334-8aca-86d315dfde6d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp"
Apr 16 14:07:02.084306 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:02.084246 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6hw5\" (UniqueName: \"kubernetes.io/projected/a24421dc-4e8a-4334-8aca-86d315dfde6d-kube-api-access-s6hw5\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp\" (UID: \"a24421dc-4e8a-4334-8aca-86d315dfde6d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp"
Apr 16 14:07:02.194788 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:02.194701 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp"
Apr 16 14:07:02.348992 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:02.348969 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp"]
Apr 16 14:07:02.351078 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:07:02.351049 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda24421dc_4e8a_4334_8aca_86d315dfde6d.slice/crio-ab2f40089e63c245ebc64c6e0fef4c3a15fafd2a15de1dcf24c80cd634510e33 WatchSource:0}: Error finding container ab2f40089e63c245ebc64c6e0fef4c3a15fafd2a15de1dcf24c80cd634510e33: Status 404 returned error can't find the container with id ab2f40089e63c245ebc64c6e0fef4c3a15fafd2a15de1dcf24c80cd634510e33
Apr 16 14:07:02.606045 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:02.606015 2580 generic.go:358] "Generic (PLEG): container finished" podID="a24421dc-4e8a-4334-8aca-86d315dfde6d" containerID="e25058171a3fae5559ff339d6399f968499b68cf8a2047d31e3fdeaa64c529aa" exitCode=0
Apr 16 14:07:02.606204 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:02.606088 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp" event={"ID":"a24421dc-4e8a-4334-8aca-86d315dfde6d","Type":"ContainerDied","Data":"e25058171a3fae5559ff339d6399f968499b68cf8a2047d31e3fdeaa64c529aa"}
Apr 16 14:07:02.606204 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:02.606112 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp" event={"ID":"a24421dc-4e8a-4334-8aca-86d315dfde6d","Type":"ContainerStarted","Data":"ab2f40089e63c245ebc64c6e0fef4c3a15fafd2a15de1dcf24c80cd634510e33"}
Apr 16 14:07:05.623402 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:05.623356 2580 generic.go:358] "Generic (PLEG): container finished" podID="a24421dc-4e8a-4334-8aca-86d315dfde6d" containerID="beda42c3fb51342ec7ed1b8cf7576381448bf96b85dbaf862449f02d43f39829" exitCode=0
Apr 16 14:07:05.623818 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:05.623422 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp" event={"ID":"a24421dc-4e8a-4334-8aca-86d315dfde6d","Type":"ContainerDied","Data":"beda42c3fb51342ec7ed1b8cf7576381448bf96b85dbaf862449f02d43f39829"}
Apr 16 14:07:06.628348 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:06.628308 2580 generic.go:358] "Generic (PLEG): container finished" podID="a24421dc-4e8a-4334-8aca-86d315dfde6d" containerID="c91da24632cc11e3acfa2620c0e91a7bb205502793ab859b58180a2642a21e25" exitCode=0
Apr 16 14:07:06.628726 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:06.628428 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp" event={"ID":"a24421dc-4e8a-4334-8aca-86d315dfde6d","Type":"ContainerDied","Data":"c91da24632cc11e3acfa2620c0e91a7bb205502793ab859b58180a2642a21e25"}
Apr 16 14:07:07.753094 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:07.753070 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp"
Apr 16 14:07:07.825241 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:07.825208 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6hw5\" (UniqueName: \"kubernetes.io/projected/a24421dc-4e8a-4334-8aca-86d315dfde6d-kube-api-access-s6hw5\") pod \"a24421dc-4e8a-4334-8aca-86d315dfde6d\" (UID: \"a24421dc-4e8a-4334-8aca-86d315dfde6d\") "
Apr 16 14:07:07.825398 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:07.825315 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a24421dc-4e8a-4334-8aca-86d315dfde6d-bundle\") pod \"a24421dc-4e8a-4334-8aca-86d315dfde6d\" (UID: \"a24421dc-4e8a-4334-8aca-86d315dfde6d\") "
Apr 16 14:07:07.825398 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:07.825350 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a24421dc-4e8a-4334-8aca-86d315dfde6d-util\") pod \"a24421dc-4e8a-4334-8aca-86d315dfde6d\" (UID: \"a24421dc-4e8a-4334-8aca-86d315dfde6d\") "
Apr 16 14:07:07.825753 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:07.825718 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a24421dc-4e8a-4334-8aca-86d315dfde6d-bundle" (OuterVolumeSpecName: "bundle") pod "a24421dc-4e8a-4334-8aca-86d315dfde6d" (UID: "a24421dc-4e8a-4334-8aca-86d315dfde6d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:07:07.827573 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:07.827550 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a24421dc-4e8a-4334-8aca-86d315dfde6d-kube-api-access-s6hw5" (OuterVolumeSpecName: "kube-api-access-s6hw5") pod "a24421dc-4e8a-4334-8aca-86d315dfde6d" (UID: "a24421dc-4e8a-4334-8aca-86d315dfde6d"). InnerVolumeSpecName "kube-api-access-s6hw5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:07:07.829972 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:07.829937 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a24421dc-4e8a-4334-8aca-86d315dfde6d-util" (OuterVolumeSpecName: "util") pod "a24421dc-4e8a-4334-8aca-86d315dfde6d" (UID: "a24421dc-4e8a-4334-8aca-86d315dfde6d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:07:07.926212 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:07.926109 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s6hw5\" (UniqueName: \"kubernetes.io/projected/a24421dc-4e8a-4334-8aca-86d315dfde6d-kube-api-access-s6hw5\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:07:07.926212 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:07.926147 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a24421dc-4e8a-4334-8aca-86d315dfde6d-bundle\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:07:07.926212 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:07.926157 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a24421dc-4e8a-4334-8aca-86d315dfde6d-util\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:07:08.635856 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:08.635819 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp" event={"ID":"a24421dc-4e8a-4334-8aca-86d315dfde6d","Type":"ContainerDied","Data":"ab2f40089e63c245ebc64c6e0fef4c3a15fafd2a15de1dcf24c80cd634510e33"}
Apr 16 14:07:08.635856 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:08.635850 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab2f40089e63c245ebc64c6e0fef4c3a15fafd2a15de1dcf24c80cd634510e33"
Apr 16 14:07:08.635856 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:08.635849 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fp8gjp"
Apr 16 14:07:14.569256 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.569220 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cdfcfc699-dlkvn"]
Apr 16 14:07:14.569653 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.569560 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a24421dc-4e8a-4334-8aca-86d315dfde6d" containerName="extract"
Apr 16 14:07:14.569653 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.569572 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24421dc-4e8a-4334-8aca-86d315dfde6d" containerName="extract"
Apr 16 14:07:14.569653 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.569594 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a24421dc-4e8a-4334-8aca-86d315dfde6d" containerName="pull"
Apr 16 14:07:14.569653 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.569599 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24421dc-4e8a-4334-8aca-86d315dfde6d" containerName="pull"
Apr 16 14:07:14.569653 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.569612 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a24421dc-4e8a-4334-8aca-86d315dfde6d" containerName="util"
Apr 16 14:07:14.569653 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.569619 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24421dc-4e8a-4334-8aca-86d315dfde6d" containerName="util"
Apr 16 14:07:14.569828 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.569670 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="a24421dc-4e8a-4334-8aca-86d315dfde6d" containerName="extract"
Apr 16 14:07:14.572665 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.572636 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.584144 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.584118 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cdfcfc699-dlkvn"]
Apr 16 14:07:14.677182 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.677143 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60be6dad-0fd9-4103-b4b3-7e33196de659-service-ca\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.677402 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.677188 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60be6dad-0fd9-4103-b4b3-7e33196de659-trusted-ca-bundle\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.677402 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.677213 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/60be6dad-0fd9-4103-b4b3-7e33196de659-oauth-serving-cert\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.677402 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.677346 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/60be6dad-0fd9-4103-b4b3-7e33196de659-console-serving-cert\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.677402 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.677373 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjg8z\" (UniqueName: \"kubernetes.io/projected/60be6dad-0fd9-4103-b4b3-7e33196de659-kube-api-access-xjg8z\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.677584 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.677403 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/60be6dad-0fd9-4103-b4b3-7e33196de659-console-oauth-config\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.677584 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.677459 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/60be6dad-0fd9-4103-b4b3-7e33196de659-console-config\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.778570 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.778530 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/60be6dad-0fd9-4103-b4b3-7e33196de659-console-serving-cert\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.778570 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.778570 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjg8z\" (UniqueName: \"kubernetes.io/projected/60be6dad-0fd9-4103-b4b3-7e33196de659-kube-api-access-xjg8z\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.778834 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.778594 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/60be6dad-0fd9-4103-b4b3-7e33196de659-console-oauth-config\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.778834 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.778622 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/60be6dad-0fd9-4103-b4b3-7e33196de659-console-config\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.778834 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.778652 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60be6dad-0fd9-4103-b4b3-7e33196de659-service-ca\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.778834 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.778672 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60be6dad-0fd9-4103-b4b3-7e33196de659-trusted-ca-bundle\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.778834 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.778686 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/60be6dad-0fd9-4103-b4b3-7e33196de659-oauth-serving-cert\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.779607 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.779581 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/60be6dad-0fd9-4103-b4b3-7e33196de659-console-config\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.779607 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.779600 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/60be6dad-0fd9-4103-b4b3-7e33196de659-oauth-serving-cert\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.779775 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.779579 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60be6dad-0fd9-4103-b4b3-7e33196de659-service-ca\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.779953 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.779925 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60be6dad-0fd9-4103-b4b3-7e33196de659-trusted-ca-bundle\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.781352 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.781331 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/60be6dad-0fd9-4103-b4b3-7e33196de659-console-serving-cert\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.781449 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.781354 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/60be6dad-0fd9-4103-b4b3-7e33196de659-console-oauth-config\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.786890 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.786867 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjg8z\" (UniqueName: \"kubernetes.io/projected/60be6dad-0fd9-4103-b4b3-7e33196de659-kube-api-access-xjg8z\") pod \"console-7cdfcfc699-dlkvn\" (UID: \"60be6dad-0fd9-4103-b4b3-7e33196de659\") " pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:14.882670 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:14.882633 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:15.015030 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:15.015001 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cdfcfc699-dlkvn"]
Apr 16 14:07:15.017223 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:07:15.017197 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60be6dad_0fd9_4103_b4b3_7e33196de659.slice/crio-1efa76d8677970000746c76e6f8b3bb7763071f53e28f603cbf99cc03e422017 WatchSource:0}: Error finding container 1efa76d8677970000746c76e6f8b3bb7763071f53e28f603cbf99cc03e422017: Status 404 returned error can't find the container with id 1efa76d8677970000746c76e6f8b3bb7763071f53e28f603cbf99cc03e422017
Apr 16 14:07:15.664768 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:15.664727 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cdfcfc699-dlkvn" event={"ID":"60be6dad-0fd9-4103-b4b3-7e33196de659","Type":"ContainerStarted","Data":"18060c3d9cacf3733daa540c2998a41fb52f5bf0ee0e251fe800b9a11a842796"}
Apr 16 14:07:15.664768 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:15.664762 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cdfcfc699-dlkvn" event={"ID":"60be6dad-0fd9-4103-b4b3-7e33196de659","Type":"ContainerStarted","Data":"1efa76d8677970000746c76e6f8b3bb7763071f53e28f603cbf99cc03e422017"}
Apr 16 14:07:15.683845 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:15.683726 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cdfcfc699-dlkvn" podStartSLOduration=1.683705857 podStartE2EDuration="1.683705857s" podCreationTimestamp="2026-04-16 14:07:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:07:15.682915679 +0000 UTC m=+459.119081409" watchObservedRunningTime="2026-04-16 14:07:15.683705857 +0000 UTC m=+459.119871585"
Apr 16 14:07:24.882953 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:24.882916 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:24.882953 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:24.882963 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:24.887483 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:24.887459 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:25.703045 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:25.703022 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cdfcfc699-dlkvn"
Apr 16 14:07:25.752195 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:25.752158 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-687949c9dd-r7xzt"]
Apr 16 14:07:30.074890 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:30.074852 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz"]
Apr 16 14:07:30.078310 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:30.078294 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz"
Apr 16 14:07:30.081188 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:30.081166 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 14:07:30.081322 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:30.081203 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 14:07:30.082153 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:30.082135 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-tl9st\""
Apr 16 14:07:30.091139 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:30.087447 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz"]
Apr 16 14:07:30.205286 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:30.205216 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09d6400e-51b8-44bc-942d-32d707c35f49-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz\" (UID: \"09d6400e-51b8-44bc-942d-32d707c35f49\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz"
Apr 16 14:07:30.205286 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:30.205288 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfktf\" (UniqueName: \"kubernetes.io/projected/09d6400e-51b8-44bc-942d-32d707c35f49-kube-api-access-kfktf\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz\" (UID: \"09d6400e-51b8-44bc-942d-32d707c35f49\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz"
Apr
16 14:07:30.205503 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:30.205315 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09d6400e-51b8-44bc-942d-32d707c35f49-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz\" (UID: \"09d6400e-51b8-44bc-942d-32d707c35f49\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz" Apr 16 14:07:30.306225 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:30.306185 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09d6400e-51b8-44bc-942d-32d707c35f49-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz\" (UID: \"09d6400e-51b8-44bc-942d-32d707c35f49\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz" Apr 16 14:07:30.306421 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:30.306236 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfktf\" (UniqueName: \"kubernetes.io/projected/09d6400e-51b8-44bc-942d-32d707c35f49-kube-api-access-kfktf\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz\" (UID: \"09d6400e-51b8-44bc-942d-32d707c35f49\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz" Apr 16 14:07:30.306421 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:30.306300 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09d6400e-51b8-44bc-942d-32d707c35f49-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz\" (UID: \"09d6400e-51b8-44bc-942d-32d707c35f49\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz" Apr 16 14:07:30.306610 ip-10-0-129-3 kubenswrapper[2580]: I0416 
14:07:30.306592 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09d6400e-51b8-44bc-942d-32d707c35f49-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz\" (UID: \"09d6400e-51b8-44bc-942d-32d707c35f49\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz" Apr 16 14:07:30.306669 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:30.306649 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09d6400e-51b8-44bc-942d-32d707c35f49-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz\" (UID: \"09d6400e-51b8-44bc-942d-32d707c35f49\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz" Apr 16 14:07:30.315714 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:30.315683 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfktf\" (UniqueName: \"kubernetes.io/projected/09d6400e-51b8-44bc-942d-32d707c35f49-kube-api-access-kfktf\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz\" (UID: \"09d6400e-51b8-44bc-942d-32d707c35f49\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz" Apr 16 14:07:30.393547 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:30.393516 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz" Apr 16 14:07:30.522878 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:30.522855 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz"] Apr 16 14:07:30.524716 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:07:30.524685 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09d6400e_51b8_44bc_942d_32d707c35f49.slice/crio-b2c9026c95a9a941c458a91d1c50b481f78c3674f65d0de9b1dcb71c315e4845 WatchSource:0}: Error finding container b2c9026c95a9a941c458a91d1c50b481f78c3674f65d0de9b1dcb71c315e4845: Status 404 returned error can't find the container with id b2c9026c95a9a941c458a91d1c50b481f78c3674f65d0de9b1dcb71c315e4845 Apr 16 14:07:30.718717 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:30.718624 2580 generic.go:358] "Generic (PLEG): container finished" podID="09d6400e-51b8-44bc-942d-32d707c35f49" containerID="257341ce1cf8c3b81a0795a45f165eed5222f5b09ac8d9c721a05c3ebbe665e3" exitCode=0 Apr 16 14:07:30.718908 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:30.718712 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz" event={"ID":"09d6400e-51b8-44bc-942d-32d707c35f49","Type":"ContainerDied","Data":"257341ce1cf8c3b81a0795a45f165eed5222f5b09ac8d9c721a05c3ebbe665e3"} Apr 16 14:07:30.718908 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:30.718748 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz" event={"ID":"09d6400e-51b8-44bc-942d-32d707c35f49","Type":"ContainerStarted","Data":"b2c9026c95a9a941c458a91d1c50b481f78c3674f65d0de9b1dcb71c315e4845"} Apr 16 14:07:32.727219 ip-10-0-129-3 kubenswrapper[2580]: I0416 
14:07:32.727186 2580 generic.go:358] "Generic (PLEG): container finished" podID="09d6400e-51b8-44bc-942d-32d707c35f49" containerID="8af64b0e2e14e65293ba422d705d561c361eda820d05cd145a18ef7a1d128c83" exitCode=0 Apr 16 14:07:32.727626 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:32.727296 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz" event={"ID":"09d6400e-51b8-44bc-942d-32d707c35f49","Type":"ContainerDied","Data":"8af64b0e2e14e65293ba422d705d561c361eda820d05cd145a18ef7a1d128c83"} Apr 16 14:07:33.732569 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:33.732531 2580 generic.go:358] "Generic (PLEG): container finished" podID="09d6400e-51b8-44bc-942d-32d707c35f49" containerID="3a8a50fe184071976b2062de0cac1cefffd3d68ae8de0b3145400d9cd88b06f2" exitCode=0 Apr 16 14:07:33.732999 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:33.732617 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz" event={"ID":"09d6400e-51b8-44bc-942d-32d707c35f49","Type":"ContainerDied","Data":"3a8a50fe184071976b2062de0cac1cefffd3d68ae8de0b3145400d9cd88b06f2"} Apr 16 14:07:34.860887 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:34.860866 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz" Apr 16 14:07:34.946375 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:34.946342 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09d6400e-51b8-44bc-942d-32d707c35f49-bundle\") pod \"09d6400e-51b8-44bc-942d-32d707c35f49\" (UID: \"09d6400e-51b8-44bc-942d-32d707c35f49\") " Apr 16 14:07:34.946565 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:34.946423 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfktf\" (UniqueName: \"kubernetes.io/projected/09d6400e-51b8-44bc-942d-32d707c35f49-kube-api-access-kfktf\") pod \"09d6400e-51b8-44bc-942d-32d707c35f49\" (UID: \"09d6400e-51b8-44bc-942d-32d707c35f49\") " Apr 16 14:07:34.946565 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:34.946465 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09d6400e-51b8-44bc-942d-32d707c35f49-util\") pod \"09d6400e-51b8-44bc-942d-32d707c35f49\" (UID: \"09d6400e-51b8-44bc-942d-32d707c35f49\") " Apr 16 14:07:34.947326 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:34.947299 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09d6400e-51b8-44bc-942d-32d707c35f49-bundle" (OuterVolumeSpecName: "bundle") pod "09d6400e-51b8-44bc-942d-32d707c35f49" (UID: "09d6400e-51b8-44bc-942d-32d707c35f49"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:07:34.948683 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:34.948657 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09d6400e-51b8-44bc-942d-32d707c35f49-kube-api-access-kfktf" (OuterVolumeSpecName: "kube-api-access-kfktf") pod "09d6400e-51b8-44bc-942d-32d707c35f49" (UID: "09d6400e-51b8-44bc-942d-32d707c35f49"). InnerVolumeSpecName "kube-api-access-kfktf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:07:34.952453 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:34.952423 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09d6400e-51b8-44bc-942d-32d707c35f49-util" (OuterVolumeSpecName: "util") pod "09d6400e-51b8-44bc-942d-32d707c35f49" (UID: "09d6400e-51b8-44bc-942d-32d707c35f49"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:07:35.047639 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:35.047552 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09d6400e-51b8-44bc-942d-32d707c35f49-bundle\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:07:35.047639 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:35.047581 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kfktf\" (UniqueName: \"kubernetes.io/projected/09d6400e-51b8-44bc-942d-32d707c35f49-kube-api-access-kfktf\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:07:35.047639 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:35.047594 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09d6400e-51b8-44bc-942d-32d707c35f49-util\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:07:35.741505 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:35.741474 2580 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz" Apr 16 14:07:35.741678 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:35.741507 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835g6vvz" event={"ID":"09d6400e-51b8-44bc-942d-32d707c35f49","Type":"ContainerDied","Data":"b2c9026c95a9a941c458a91d1c50b481f78c3674f65d0de9b1dcb71c315e4845"} Apr 16 14:07:35.741678 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:35.741542 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2c9026c95a9a941c458a91d1c50b481f78c3674f65d0de9b1dcb71c315e4845" Apr 16 14:07:44.846553 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:44.846514 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz"] Apr 16 14:07:44.846937 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:44.846864 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09d6400e-51b8-44bc-942d-32d707c35f49" containerName="pull" Apr 16 14:07:44.846937 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:44.846876 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d6400e-51b8-44bc-942d-32d707c35f49" containerName="pull" Apr 16 14:07:44.846937 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:44.846887 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09d6400e-51b8-44bc-942d-32d707c35f49" containerName="util" Apr 16 14:07:44.846937 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:44.846892 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d6400e-51b8-44bc-942d-32d707c35f49" containerName="util" Apr 16 14:07:44.846937 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:44.846900 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="09d6400e-51b8-44bc-942d-32d707c35f49" containerName="extract" Apr 16 14:07:44.846937 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:44.846905 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d6400e-51b8-44bc-942d-32d707c35f49" containerName="extract" Apr 16 14:07:44.847129 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:44.846984 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="09d6400e-51b8-44bc-942d-32d707c35f49" containerName="extract" Apr 16 14:07:44.851397 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:44.851364 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz" Apr 16 14:07:44.854376 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:44.854350 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 14:07:44.854523 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:44.854381 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 14:07:44.855309 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:44.855295 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-tl9st\"" Apr 16 14:07:44.861971 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:44.861945 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz"] Apr 16 14:07:44.937293 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:44.937236 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt5js\" (UniqueName: \"kubernetes.io/projected/daa635a6-ebf2-4b2a-9950-fd4bc0e232ac-kube-api-access-qt5js\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz\" (UID: 
\"daa635a6-ebf2-4b2a-9950-fd4bc0e232ac\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz" Apr 16 14:07:44.937467 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:44.937317 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/daa635a6-ebf2-4b2a-9950-fd4bc0e232ac-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz\" (UID: \"daa635a6-ebf2-4b2a-9950-fd4bc0e232ac\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz" Apr 16 14:07:44.937467 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:44.937347 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/daa635a6-ebf2-4b2a-9950-fd4bc0e232ac-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz\" (UID: \"daa635a6-ebf2-4b2a-9950-fd4bc0e232ac\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz" Apr 16 14:07:45.038466 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:45.038423 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qt5js\" (UniqueName: \"kubernetes.io/projected/daa635a6-ebf2-4b2a-9950-fd4bc0e232ac-kube-api-access-qt5js\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz\" (UID: \"daa635a6-ebf2-4b2a-9950-fd4bc0e232ac\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz" Apr 16 14:07:45.038466 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:45.038472 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/daa635a6-ebf2-4b2a-9950-fd4bc0e232ac-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz\" (UID: \"daa635a6-ebf2-4b2a-9950-fd4bc0e232ac\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz" Apr 16 14:07:45.038694 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:45.038593 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/daa635a6-ebf2-4b2a-9950-fd4bc0e232ac-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz\" (UID: \"daa635a6-ebf2-4b2a-9950-fd4bc0e232ac\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz" Apr 16 14:07:45.038868 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:45.038844 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/daa635a6-ebf2-4b2a-9950-fd4bc0e232ac-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz\" (UID: \"daa635a6-ebf2-4b2a-9950-fd4bc0e232ac\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz" Apr 16 14:07:45.038933 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:45.038919 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/daa635a6-ebf2-4b2a-9950-fd4bc0e232ac-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz\" (UID: \"daa635a6-ebf2-4b2a-9950-fd4bc0e232ac\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz" Apr 16 14:07:45.051839 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:45.051810 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt5js\" (UniqueName: \"kubernetes.io/projected/daa635a6-ebf2-4b2a-9950-fd4bc0e232ac-kube-api-access-qt5js\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz\" (UID: \"daa635a6-ebf2-4b2a-9950-fd4bc0e232ac\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz" Apr 16 
14:07:45.161915 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:45.161882 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz" Apr 16 14:07:45.298004 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:45.297975 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz"] Apr 16 14:07:45.299970 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:07:45.299944 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaa635a6_ebf2_4b2a_9950_fd4bc0e232ac.slice/crio-ed9d9327239e196d1c707b63337b04ddd17273f6f312f98964be6ae6f1c3ac3c WatchSource:0}: Error finding container ed9d9327239e196d1c707b63337b04ddd17273f6f312f98964be6ae6f1c3ac3c: Status 404 returned error can't find the container with id ed9d9327239e196d1c707b63337b04ddd17273f6f312f98964be6ae6f1c3ac3c Apr 16 14:07:45.777302 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:45.777233 2580 generic.go:358] "Generic (PLEG): container finished" podID="daa635a6-ebf2-4b2a-9950-fd4bc0e232ac" containerID="94112383c349531a5749e44d0854649983928bfce3ced72f2d832b3b47c69ae3" exitCode=0 Apr 16 14:07:45.777477 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:45.777314 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz" event={"ID":"daa635a6-ebf2-4b2a-9950-fd4bc0e232ac","Type":"ContainerDied","Data":"94112383c349531a5749e44d0854649983928bfce3ced72f2d832b3b47c69ae3"} Apr 16 14:07:45.777477 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:45.777354 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz" 
event={"ID":"daa635a6-ebf2-4b2a-9950-fd4bc0e232ac","Type":"ContainerStarted","Data":"ed9d9327239e196d1c707b63337b04ddd17273f6f312f98964be6ae6f1c3ac3c"} Apr 16 14:07:45.970027 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:45.969999 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns"] Apr 16 14:07:45.973339 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:45.973321 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns" Apr 16 14:07:45.975974 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:45.975951 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 14:07:45.976094 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:45.976026 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-9mh4m\"" Apr 16 14:07:45.976510 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:45.976488 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 14:07:45.976853 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:45.976837 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:07:45.976933 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:45.976871 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 14:07:45.977125 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:45.977112 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 14:07:45.991853 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:45.991817 2580 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns"] Apr 16 14:07:46.047646 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:46.047554 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4272c25d-1e8a-4e1c-b621-21b6d0d7222e-cert\") pod \"lws-controller-manager-5b8748f956-2rcns\" (UID: \"4272c25d-1e8a-4e1c-b621-21b6d0d7222e\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns" Apr 16 14:07:46.047646 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:46.047605 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/4272c25d-1e8a-4e1c-b621-21b6d0d7222e-metrics-cert\") pod \"lws-controller-manager-5b8748f956-2rcns\" (UID: \"4272c25d-1e8a-4e1c-b621-21b6d0d7222e\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns" Apr 16 14:07:46.047837 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:46.047701 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgb95\" (UniqueName: \"kubernetes.io/projected/4272c25d-1e8a-4e1c-b621-21b6d0d7222e-kube-api-access-wgb95\") pod \"lws-controller-manager-5b8748f956-2rcns\" (UID: \"4272c25d-1e8a-4e1c-b621-21b6d0d7222e\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns" Apr 16 14:07:46.047837 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:46.047733 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4272c25d-1e8a-4e1c-b621-21b6d0d7222e-manager-config\") pod \"lws-controller-manager-5b8748f956-2rcns\" (UID: \"4272c25d-1e8a-4e1c-b621-21b6d0d7222e\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns" Apr 16 14:07:46.148439 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:46.148402 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgb95\" (UniqueName: \"kubernetes.io/projected/4272c25d-1e8a-4e1c-b621-21b6d0d7222e-kube-api-access-wgb95\") pod \"lws-controller-manager-5b8748f956-2rcns\" (UID: \"4272c25d-1e8a-4e1c-b621-21b6d0d7222e\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns" Apr 16 14:07:46.148607 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:46.148446 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4272c25d-1e8a-4e1c-b621-21b6d0d7222e-manager-config\") pod \"lws-controller-manager-5b8748f956-2rcns\" (UID: \"4272c25d-1e8a-4e1c-b621-21b6d0d7222e\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns" Apr 16 14:07:46.148607 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:46.148503 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4272c25d-1e8a-4e1c-b621-21b6d0d7222e-cert\") pod \"lws-controller-manager-5b8748f956-2rcns\" (UID: \"4272c25d-1e8a-4e1c-b621-21b6d0d7222e\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns" Apr 16 14:07:46.148607 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:46.148543 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/4272c25d-1e8a-4e1c-b621-21b6d0d7222e-metrics-cert\") pod \"lws-controller-manager-5b8748f956-2rcns\" (UID: \"4272c25d-1e8a-4e1c-b621-21b6d0d7222e\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns" Apr 16 14:07:46.149101 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:46.149068 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4272c25d-1e8a-4e1c-b621-21b6d0d7222e-manager-config\") pod \"lws-controller-manager-5b8748f956-2rcns\" (UID: 
\"4272c25d-1e8a-4e1c-b621-21b6d0d7222e\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns" Apr 16 14:07:46.151189 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:46.151166 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/4272c25d-1e8a-4e1c-b621-21b6d0d7222e-metrics-cert\") pod \"lws-controller-manager-5b8748f956-2rcns\" (UID: \"4272c25d-1e8a-4e1c-b621-21b6d0d7222e\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns" Apr 16 14:07:46.151189 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:46.151185 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4272c25d-1e8a-4e1c-b621-21b6d0d7222e-cert\") pod \"lws-controller-manager-5b8748f956-2rcns\" (UID: \"4272c25d-1e8a-4e1c-b621-21b6d0d7222e\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns" Apr 16 14:07:46.168607 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:46.168580 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgb95\" (UniqueName: \"kubernetes.io/projected/4272c25d-1e8a-4e1c-b621-21b6d0d7222e-kube-api-access-wgb95\") pod \"lws-controller-manager-5b8748f956-2rcns\" (UID: \"4272c25d-1e8a-4e1c-b621-21b6d0d7222e\") " pod="openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns" Apr 16 14:07:46.285859 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:46.285822 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns" Apr 16 14:07:46.429965 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:46.429936 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns"] Apr 16 14:07:46.432480 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:07:46.432454 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4272c25d_1e8a_4e1c_b621_21b6d0d7222e.slice/crio-1af746c66b4bceb284961cdaf20be04200c84e6693761b7b397ba145de674ed5 WatchSource:0}: Error finding container 1af746c66b4bceb284961cdaf20be04200c84e6693761b7b397ba145de674ed5: Status 404 returned error can't find the container with id 1af746c66b4bceb284961cdaf20be04200c84e6693761b7b397ba145de674ed5 Apr 16 14:07:46.783408 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:46.783365 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns" event={"ID":"4272c25d-1e8a-4e1c-b621-21b6d0d7222e","Type":"ContainerStarted","Data":"1af746c66b4bceb284961cdaf20be04200c84e6693761b7b397ba145de674ed5"} Apr 16 14:07:47.789199 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:47.789162 2580 generic.go:358] "Generic (PLEG): container finished" podID="daa635a6-ebf2-4b2a-9950-fd4bc0e232ac" containerID="7fbdf9fa68288c694b4fd83edcd0f64ffd77f030d8598c4172401b61c308856b" exitCode=0 Apr 16 14:07:47.789689 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:47.789257 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz" event={"ID":"daa635a6-ebf2-4b2a-9950-fd4bc0e232ac","Type":"ContainerDied","Data":"7fbdf9fa68288c694b4fd83edcd0f64ffd77f030d8598c4172401b61c308856b"} Apr 16 14:07:48.794179 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:48.794140 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns" event={"ID":"4272c25d-1e8a-4e1c-b621-21b6d0d7222e","Type":"ContainerStarted","Data":"4ce7859c123077086cb8d4253ad0c7f8ac03b949736f2e8d4324b7f4f865852f"} Apr 16 14:07:48.794649 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:48.794207 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns" Apr 16 14:07:48.796038 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:48.796015 2580 generic.go:358] "Generic (PLEG): container finished" podID="daa635a6-ebf2-4b2a-9950-fd4bc0e232ac" containerID="532c83a604d792d318999408f7981eb121461000a1d860f3701faf31cee7144b" exitCode=0 Apr 16 14:07:48.796111 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:48.796048 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz" event={"ID":"daa635a6-ebf2-4b2a-9950-fd4bc0e232ac","Type":"ContainerDied","Data":"532c83a604d792d318999408f7981eb121461000a1d860f3701faf31cee7144b"} Apr 16 14:07:48.812093 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:48.812022 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns" podStartSLOduration=2.100482422 podStartE2EDuration="3.812005407s" podCreationTimestamp="2026-04-16 14:07:45 +0000 UTC" firstStartedPulling="2026-04-16 14:07:46.434375111 +0000 UTC m=+489.870540819" lastFinishedPulling="2026-04-16 14:07:48.145898095 +0000 UTC m=+491.582063804" observedRunningTime="2026-04-16 14:07:48.811458581 +0000 UTC m=+492.247624303" watchObservedRunningTime="2026-04-16 14:07:48.812005407 +0000 UTC m=+492.248171141" Apr 16 14:07:49.929125 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:49.929100 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz" Apr 16 14:07:50.082989 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:50.082899 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/daa635a6-ebf2-4b2a-9950-fd4bc0e232ac-bundle\") pod \"daa635a6-ebf2-4b2a-9950-fd4bc0e232ac\" (UID: \"daa635a6-ebf2-4b2a-9950-fd4bc0e232ac\") " Apr 16 14:07:50.082989 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:50.082940 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/daa635a6-ebf2-4b2a-9950-fd4bc0e232ac-util\") pod \"daa635a6-ebf2-4b2a-9950-fd4bc0e232ac\" (UID: \"daa635a6-ebf2-4b2a-9950-fd4bc0e232ac\") " Apr 16 14:07:50.083219 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:50.083015 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt5js\" (UniqueName: \"kubernetes.io/projected/daa635a6-ebf2-4b2a-9950-fd4bc0e232ac-kube-api-access-qt5js\") pod \"daa635a6-ebf2-4b2a-9950-fd4bc0e232ac\" (UID: \"daa635a6-ebf2-4b2a-9950-fd4bc0e232ac\") " Apr 16 14:07:50.083931 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:50.083892 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daa635a6-ebf2-4b2a-9950-fd4bc0e232ac-bundle" (OuterVolumeSpecName: "bundle") pod "daa635a6-ebf2-4b2a-9950-fd4bc0e232ac" (UID: "daa635a6-ebf2-4b2a-9950-fd4bc0e232ac"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:07:50.085373 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:50.085352 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daa635a6-ebf2-4b2a-9950-fd4bc0e232ac-kube-api-access-qt5js" (OuterVolumeSpecName: "kube-api-access-qt5js") pod "daa635a6-ebf2-4b2a-9950-fd4bc0e232ac" (UID: "daa635a6-ebf2-4b2a-9950-fd4bc0e232ac"). InnerVolumeSpecName "kube-api-access-qt5js". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:07:50.088628 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:50.088592 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daa635a6-ebf2-4b2a-9950-fd4bc0e232ac-util" (OuterVolumeSpecName: "util") pod "daa635a6-ebf2-4b2a-9950-fd4bc0e232ac" (UID: "daa635a6-ebf2-4b2a-9950-fd4bc0e232ac"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:07:50.183745 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:50.183708 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/daa635a6-ebf2-4b2a-9950-fd4bc0e232ac-bundle\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:07:50.183745 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:50.183737 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/daa635a6-ebf2-4b2a-9950-fd4bc0e232ac-util\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:07:50.183745 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:50.183747 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qt5js\" (UniqueName: \"kubernetes.io/projected/daa635a6-ebf2-4b2a-9950-fd4bc0e232ac-kube-api-access-qt5js\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:07:50.774921 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:50.774869 2580 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="openshift-console/console-687949c9dd-r7xzt" podUID="6b0c7df2-31b3-464e-ba87-8ee223a03e65" containerName="console" containerID="cri-o://93313274f96a132f10e8f35843a43c5a074084c057e309726e43e9c18850507e" gracePeriod=15 Apr 16 14:07:50.805117 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:50.805087 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz" event={"ID":"daa635a6-ebf2-4b2a-9950-fd4bc0e232ac","Type":"ContainerDied","Data":"ed9d9327239e196d1c707b63337b04ddd17273f6f312f98964be6ae6f1c3ac3c"} Apr 16 14:07:50.805248 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:50.805120 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed9d9327239e196d1c707b63337b04ddd17273f6f312f98964be6ae6f1c3ac3c" Apr 16 14:07:50.805248 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:50.805140 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25mnzz" Apr 16 14:07:51.017699 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.017680 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-687949c9dd-r7xzt_6b0c7df2-31b3-464e-ba87-8ee223a03e65/console/0.log" Apr 16 14:07:51.018021 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.017736 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:07:51.091595 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.091501 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-oauth-serving-cert\") pod \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " Apr 16 14:07:51.091595 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.091548 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b0c7df2-31b3-464e-ba87-8ee223a03e65-console-oauth-config\") pod \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " Apr 16 14:07:51.091817 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.091607 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-trusted-ca-bundle\") pod \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " Apr 16 14:07:51.091817 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.091636 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-console-config\") pod \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " Apr 16 14:07:51.091817 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.091659 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v24q6\" (UniqueName: \"kubernetes.io/projected/6b0c7df2-31b3-464e-ba87-8ee223a03e65-kube-api-access-v24q6\") pod \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " Apr 16 14:07:51.091817 
ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.091681 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b0c7df2-31b3-464e-ba87-8ee223a03e65-console-serving-cert\") pod \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " Apr 16 14:07:51.091817 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.091704 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-service-ca\") pod \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\" (UID: \"6b0c7df2-31b3-464e-ba87-8ee223a03e65\") " Apr 16 14:07:51.092130 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.092049 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-console-config" (OuterVolumeSpecName: "console-config") pod "6b0c7df2-31b3-464e-ba87-8ee223a03e65" (UID: "6b0c7df2-31b3-464e-ba87-8ee223a03e65"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:07:51.092130 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.092093 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6b0c7df2-31b3-464e-ba87-8ee223a03e65" (UID: "6b0c7df2-31b3-464e-ba87-8ee223a03e65"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:07:51.092130 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.092106 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-service-ca" (OuterVolumeSpecName: "service-ca") pod "6b0c7df2-31b3-464e-ba87-8ee223a03e65" (UID: "6b0c7df2-31b3-464e-ba87-8ee223a03e65"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:07:51.092307 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.092190 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6b0c7df2-31b3-464e-ba87-8ee223a03e65" (UID: "6b0c7df2-31b3-464e-ba87-8ee223a03e65"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:07:51.094054 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.094027 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b0c7df2-31b3-464e-ba87-8ee223a03e65-kube-api-access-v24q6" (OuterVolumeSpecName: "kube-api-access-v24q6") pod "6b0c7df2-31b3-464e-ba87-8ee223a03e65" (UID: "6b0c7df2-31b3-464e-ba87-8ee223a03e65"). InnerVolumeSpecName "kube-api-access-v24q6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:07:51.094224 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.094207 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b0c7df2-31b3-464e-ba87-8ee223a03e65-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6b0c7df2-31b3-464e-ba87-8ee223a03e65" (UID: "6b0c7df2-31b3-464e-ba87-8ee223a03e65"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:07:51.094368 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.094348 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b0c7df2-31b3-464e-ba87-8ee223a03e65-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6b0c7df2-31b3-464e-ba87-8ee223a03e65" (UID: "6b0c7df2-31b3-464e-ba87-8ee223a03e65"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:07:51.192637 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.192609 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-trusted-ca-bundle\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:07:51.192637 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.192636 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-console-config\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:07:51.192637 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.192645 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v24q6\" (UniqueName: \"kubernetes.io/projected/6b0c7df2-31b3-464e-ba87-8ee223a03e65-kube-api-access-v24q6\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:07:51.192858 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.192655 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b0c7df2-31b3-464e-ba87-8ee223a03e65-console-serving-cert\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:07:51.192858 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.192664 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-service-ca\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:07:51.192858 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.192672 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b0c7df2-31b3-464e-ba87-8ee223a03e65-oauth-serving-cert\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:07:51.192858 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.192680 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b0c7df2-31b3-464e-ba87-8ee223a03e65-console-oauth-config\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:07:51.812614 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.812587 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-687949c9dd-r7xzt_6b0c7df2-31b3-464e-ba87-8ee223a03e65/console/0.log" Apr 16 14:07:51.812770 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.812630 2580 generic.go:358] "Generic (PLEG): container finished" podID="6b0c7df2-31b3-464e-ba87-8ee223a03e65" containerID="93313274f96a132f10e8f35843a43c5a074084c057e309726e43e9c18850507e" exitCode=2 Apr 16 14:07:51.812770 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.812662 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-687949c9dd-r7xzt" event={"ID":"6b0c7df2-31b3-464e-ba87-8ee223a03e65","Type":"ContainerDied","Data":"93313274f96a132f10e8f35843a43c5a074084c057e309726e43e9c18850507e"} Apr 16 14:07:51.812770 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.812703 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-687949c9dd-r7xzt" event={"ID":"6b0c7df2-31b3-464e-ba87-8ee223a03e65","Type":"ContainerDied","Data":"c41e73d8b039ec1b60b4142afda8e3628ff3bc3baf357267d1a6a7b290c791b7"} Apr 16 14:07:51.812770 ip-10-0-129-3 
kubenswrapper[2580]: I0416 14:07:51.812705 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-687949c9dd-r7xzt" Apr 16 14:07:51.812770 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.812716 2580 scope.go:117] "RemoveContainer" containerID="93313274f96a132f10e8f35843a43c5a074084c057e309726e43e9c18850507e" Apr 16 14:07:51.821402 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.821384 2580 scope.go:117] "RemoveContainer" containerID="93313274f96a132f10e8f35843a43c5a074084c057e309726e43e9c18850507e" Apr 16 14:07:51.821692 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:07:51.821674 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93313274f96a132f10e8f35843a43c5a074084c057e309726e43e9c18850507e\": container with ID starting with 93313274f96a132f10e8f35843a43c5a074084c057e309726e43e9c18850507e not found: ID does not exist" containerID="93313274f96a132f10e8f35843a43c5a074084c057e309726e43e9c18850507e" Apr 16 14:07:51.821743 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.821702 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93313274f96a132f10e8f35843a43c5a074084c057e309726e43e9c18850507e"} err="failed to get container status \"93313274f96a132f10e8f35843a43c5a074084c057e309726e43e9c18850507e\": rpc error: code = NotFound desc = could not find container \"93313274f96a132f10e8f35843a43c5a074084c057e309726e43e9c18850507e\": container with ID starting with 93313274f96a132f10e8f35843a43c5a074084c057e309726e43e9c18850507e not found: ID does not exist" Apr 16 14:07:51.830159 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.830135 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-687949c9dd-r7xzt"] Apr 16 14:07:51.833661 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:51.833637 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-687949c9dd-r7xzt"] Apr 16 14:07:53.152999 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:53.152961 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b0c7df2-31b3-464e-ba87-8ee223a03e65" path="/var/lib/kubelet/pods/6b0c7df2-31b3-464e-ba87-8ee223a03e65/volumes" Apr 16 14:07:59.802375 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:07:59.802335 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5b8748f956-2rcns" Apr 16 14:08:12.242031 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.241995 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p"] Apr 16 14:08:12.242420 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.242344 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="daa635a6-ebf2-4b2a-9950-fd4bc0e232ac" containerName="util" Apr 16 14:08:12.242420 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.242356 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="daa635a6-ebf2-4b2a-9950-fd4bc0e232ac" containerName="util" Apr 16 14:08:12.242420 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.242371 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="daa635a6-ebf2-4b2a-9950-fd4bc0e232ac" containerName="extract" Apr 16 14:08:12.242420 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.242377 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="daa635a6-ebf2-4b2a-9950-fd4bc0e232ac" containerName="extract" Apr 16 14:08:12.242420 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.242391 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="daa635a6-ebf2-4b2a-9950-fd4bc0e232ac" containerName="pull" Apr 16 14:08:12.242420 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.242397 2580 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="daa635a6-ebf2-4b2a-9950-fd4bc0e232ac" containerName="pull" Apr 16 14:08:12.242420 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.242409 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b0c7df2-31b3-464e-ba87-8ee223a03e65" containerName="console" Apr 16 14:08:12.242420 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.242414 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b0c7df2-31b3-464e-ba87-8ee223a03e65" containerName="console" Apr 16 14:08:12.242695 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.242468 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="daa635a6-ebf2-4b2a-9950-fd4bc0e232ac" containerName="extract" Apr 16 14:08:12.242695 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.242477 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b0c7df2-31b3-464e-ba87-8ee223a03e65" containerName="console" Apr 16 14:08:12.246257 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.246239 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p" Apr 16 14:08:12.248866 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.248845 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 14:08:12.249663 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.249650 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-tl9st\"" Apr 16 14:08:12.249734 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.249672 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 14:08:12.253806 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.253783 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p"] Apr 16 14:08:12.275874 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.275846 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p\" (UID: \"14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p" Apr 16 14:08:12.276023 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.275922 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qmzx\" (UniqueName: \"kubernetes.io/projected/14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1-kube-api-access-2qmzx\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p\" (UID: \"14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p" Apr 16 
14:08:12.276023 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.275963 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p\" (UID: \"14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p" Apr 16 14:08:12.341197 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.341165 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7"] Apr 16 14:08:12.344803 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.344787 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7" Apr 16 14:08:12.356196 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.356167 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7"] Apr 16 14:08:12.377313 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.377254 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p\" (UID: \"14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p" Apr 16 14:08:12.377478 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.377336 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/898e9c4d-742c-4786-837b-8d440551720d-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7\" 
(UID: \"898e9c4d-742c-4786-837b-8d440551720d\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7" Apr 16 14:08:12.377478 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.377370 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qmzx\" (UniqueName: \"kubernetes.io/projected/14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1-kube-api-access-2qmzx\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p\" (UID: \"14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p" Apr 16 14:08:12.377478 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.377393 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/898e9c4d-742c-4786-837b-8d440551720d-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7\" (UID: \"898e9c4d-742c-4786-837b-8d440551720d\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7" Apr 16 14:08:12.377478 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.377442 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrds6\" (UniqueName: \"kubernetes.io/projected/898e9c4d-742c-4786-837b-8d440551720d-kube-api-access-xrds6\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7\" (UID: \"898e9c4d-742c-4786-837b-8d440551720d\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7" Apr 16 14:08:12.377684 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.377485 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p\" (UID: 
\"14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p" Apr 16 14:08:12.377684 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.377640 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p\" (UID: \"14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p" Apr 16 14:08:12.377770 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.377692 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p\" (UID: \"14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p" Apr 16 14:08:12.386607 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.386580 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qmzx\" (UniqueName: \"kubernetes.io/projected/14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1-kube-api-access-2qmzx\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p\" (UID: \"14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p" Apr 16 14:08:12.449176 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.449144 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr"] Apr 16 14:08:12.453344 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.453327 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr" Apr 16 14:08:12.461753 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.461731 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr"] Apr 16 14:08:12.478183 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.478146 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3138a678-a10b-491b-a93d-627060e56cfd-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr\" (UID: \"3138a678-a10b-491b-a93d-627060e56cfd\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr" Apr 16 14:08:12.478183 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.478186 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/898e9c4d-742c-4786-837b-8d440551720d-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7\" (UID: \"898e9c4d-742c-4786-837b-8d440551720d\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7" Apr 16 14:08:12.478431 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.478251 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/898e9c4d-742c-4786-837b-8d440551720d-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7\" (UID: \"898e9c4d-742c-4786-837b-8d440551720d\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7" Apr 16 14:08:12.478431 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.478312 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrds6\" (UniqueName: 
\"kubernetes.io/projected/898e9c4d-742c-4786-837b-8d440551720d-kube-api-access-xrds6\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7\" (UID: \"898e9c4d-742c-4786-837b-8d440551720d\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7" Apr 16 14:08:12.478431 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.478377 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vc57\" (UniqueName: \"kubernetes.io/projected/3138a678-a10b-491b-a93d-627060e56cfd-kube-api-access-6vc57\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr\" (UID: \"3138a678-a10b-491b-a93d-627060e56cfd\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr" Apr 16 14:08:12.478431 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.478406 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3138a678-a10b-491b-a93d-627060e56cfd-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr\" (UID: \"3138a678-a10b-491b-a93d-627060e56cfd\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr" Apr 16 14:08:12.478577 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.478510 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/898e9c4d-742c-4786-837b-8d440551720d-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7\" (UID: \"898e9c4d-742c-4786-837b-8d440551720d\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7" Apr 16 14:08:12.478615 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.478597 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/898e9c4d-742c-4786-837b-8d440551720d-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7\" (UID: \"898e9c4d-742c-4786-837b-8d440551720d\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7" Apr 16 14:08:12.487527 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.487501 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrds6\" (UniqueName: \"kubernetes.io/projected/898e9c4d-742c-4786-837b-8d440551720d-kube-api-access-xrds6\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7\" (UID: \"898e9c4d-742c-4786-837b-8d440551720d\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7" Apr 16 14:08:12.553222 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.553141 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf"] Apr 16 14:08:12.556913 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.556894 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf" Apr 16 14:08:12.557037 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.556916 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p" Apr 16 14:08:12.572788 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.572757 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf"] Apr 16 14:08:12.579412 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.579370 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f2718f7-100d-4eb9-95e7-c0876b684457-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf\" (UID: \"5f2718f7-100d-4eb9-95e7-c0876b684457\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf" Apr 16 14:08:12.579561 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.579441 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vc57\" (UniqueName: \"kubernetes.io/projected/3138a678-a10b-491b-a93d-627060e56cfd-kube-api-access-6vc57\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr\" (UID: \"3138a678-a10b-491b-a93d-627060e56cfd\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr" Apr 16 14:08:12.579561 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.579472 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3138a678-a10b-491b-a93d-627060e56cfd-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr\" (UID: \"3138a678-a10b-491b-a93d-627060e56cfd\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr" Apr 16 14:08:12.579561 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.579514 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/5f2718f7-100d-4eb9-95e7-c0876b684457-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf\" (UID: \"5f2718f7-100d-4eb9-95e7-c0876b684457\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf" Apr 16 14:08:12.579561 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.579542 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khg8w\" (UniqueName: \"kubernetes.io/projected/5f2718f7-100d-4eb9-95e7-c0876b684457-kube-api-access-khg8w\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf\" (UID: \"5f2718f7-100d-4eb9-95e7-c0876b684457\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf" Apr 16 14:08:12.579752 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.579596 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3138a678-a10b-491b-a93d-627060e56cfd-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr\" (UID: \"3138a678-a10b-491b-a93d-627060e56cfd\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr" Apr 16 14:08:12.580037 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.580015 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3138a678-a10b-491b-a93d-627060e56cfd-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr\" (UID: \"3138a678-a10b-491b-a93d-627060e56cfd\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr" Apr 16 14:08:12.580092 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.580068 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3138a678-a10b-491b-a93d-627060e56cfd-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr\" (UID: \"3138a678-a10b-491b-a93d-627060e56cfd\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr" Apr 16 14:08:12.590971 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.590940 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vc57\" (UniqueName: \"kubernetes.io/projected/3138a678-a10b-491b-a93d-627060e56cfd-kube-api-access-6vc57\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr\" (UID: \"3138a678-a10b-491b-a93d-627060e56cfd\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr" Apr 16 14:08:12.653713 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.653675 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7" Apr 16 14:08:12.681063 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.681027 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f2718f7-100d-4eb9-95e7-c0876b684457-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf\" (UID: \"5f2718f7-100d-4eb9-95e7-c0876b684457\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf" Apr 16 14:08:12.681250 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.681164 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f2718f7-100d-4eb9-95e7-c0876b684457-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf\" (UID: \"5f2718f7-100d-4eb9-95e7-c0876b684457\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf" Apr 16 14:08:12.681250 
ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.681206 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khg8w\" (UniqueName: \"kubernetes.io/projected/5f2718f7-100d-4eb9-95e7-c0876b684457-kube-api-access-khg8w\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf\" (UID: \"5f2718f7-100d-4eb9-95e7-c0876b684457\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf" Apr 16 14:08:12.681484 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.681461 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f2718f7-100d-4eb9-95e7-c0876b684457-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf\" (UID: \"5f2718f7-100d-4eb9-95e7-c0876b684457\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf" Apr 16 14:08:12.681532 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.681507 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f2718f7-100d-4eb9-95e7-c0876b684457-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf\" (UID: \"5f2718f7-100d-4eb9-95e7-c0876b684457\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf" Apr 16 14:08:12.693281 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.693230 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khg8w\" (UniqueName: \"kubernetes.io/projected/5f2718f7-100d-4eb9-95e7-c0876b684457-kube-api-access-khg8w\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf\" (UID: \"5f2718f7-100d-4eb9-95e7-c0876b684457\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf" Apr 16 14:08:12.700526 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.700492 2580 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p"] Apr 16 14:08:12.702299 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:08:12.702248 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14a7a8c2_0fb0_40cf_9149_e1f9476ea6f1.slice/crio-f654acc4275160e650c1a77e811fc2f654c719578d9ac4bc212ea44f19a8a2c6 WatchSource:0}: Error finding container f654acc4275160e650c1a77e811fc2f654c719578d9ac4bc212ea44f19a8a2c6: Status 404 returned error can't find the container with id f654acc4275160e650c1a77e811fc2f654c719578d9ac4bc212ea44f19a8a2c6 Apr 16 14:08:12.763820 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.763790 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr" Apr 16 14:08:12.801950 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.801923 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7"] Apr 16 14:08:12.803187 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:08:12.803139 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod898e9c4d_742c_4786_837b_8d440551720d.slice/crio-1352605b80a90314e4a0970483e2ac5fe7090f40520f54405341b380dce0da65 WatchSource:0}: Error finding container 1352605b80a90314e4a0970483e2ac5fe7090f40520f54405341b380dce0da65: Status 404 returned error can't find the container with id 1352605b80a90314e4a0970483e2ac5fe7090f40520f54405341b380dce0da65 Apr 16 14:08:12.888721 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.888691 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf" Apr 16 14:08:12.889727 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.889700 2580 generic.go:358] "Generic (PLEG): container finished" podID="14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1" containerID="534dcf1e33a823598bdb37e98e4325b9791065564a2eab39caedabc1294737a2" exitCode=0 Apr 16 14:08:12.889857 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.889767 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p" event={"ID":"14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1","Type":"ContainerDied","Data":"534dcf1e33a823598bdb37e98e4325b9791065564a2eab39caedabc1294737a2"} Apr 16 14:08:12.889857 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.889813 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p" event={"ID":"14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1","Type":"ContainerStarted","Data":"f654acc4275160e650c1a77e811fc2f654c719578d9ac4bc212ea44f19a8a2c6"} Apr 16 14:08:12.890959 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.890935 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7" event={"ID":"898e9c4d-742c-4786-837b-8d440551720d","Type":"ContainerStarted","Data":"1352605b80a90314e4a0970483e2ac5fe7090f40520f54405341b380dce0da65"} Apr 16 14:08:12.926644 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:12.926603 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr"] Apr 16 14:08:12.930749 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:08:12.930651 2580 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3138a678_a10b_491b_a93d_627060e56cfd.slice/crio-779dd07fb11b47c3e5c3865f115894fc7960ba00dbc101e5e4ee7677e74623b1 WatchSource:0}: Error finding container 779dd07fb11b47c3e5c3865f115894fc7960ba00dbc101e5e4ee7677e74623b1: Status 404 returned error can't find the container with id 779dd07fb11b47c3e5c3865f115894fc7960ba00dbc101e5e4ee7677e74623b1 Apr 16 14:08:13.028202 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:13.028180 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf"] Apr 16 14:08:13.030468 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:08:13.030442 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f2718f7_100d_4eb9_95e7_c0876b684457.slice/crio-2c25137102c78df3c19965667bb6743686a1354dd8916e2123cbd063eab53844 WatchSource:0}: Error finding container 2c25137102c78df3c19965667bb6743686a1354dd8916e2123cbd063eab53844: Status 404 returned error can't find the container with id 2c25137102c78df3c19965667bb6743686a1354dd8916e2123cbd063eab53844 Apr 16 14:08:13.895382 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:13.895340 2580 generic.go:358] "Generic (PLEG): container finished" podID="898e9c4d-742c-4786-837b-8d440551720d" containerID="301029465f49be80aba869b6e45ad307d19b8ae73b0027b482c8f03467fd634e" exitCode=0 Apr 16 14:08:13.895814 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:13.895421 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7" event={"ID":"898e9c4d-742c-4786-837b-8d440551720d","Type":"ContainerDied","Data":"301029465f49be80aba869b6e45ad307d19b8ae73b0027b482c8f03467fd634e"} Apr 16 14:08:13.896924 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:13.896877 2580 generic.go:358] "Generic (PLEG): container finished" 
podID="5f2718f7-100d-4eb9-95e7-c0876b684457" containerID="a3f1c1eed4679bbedcf867f14a999031071281aaa50b58e248bb20b87eed588f" exitCode=0 Apr 16 14:08:13.896983 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:13.896960 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf" event={"ID":"5f2718f7-100d-4eb9-95e7-c0876b684457","Type":"ContainerDied","Data":"a3f1c1eed4679bbedcf867f14a999031071281aaa50b58e248bb20b87eed588f"} Apr 16 14:08:13.897039 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:13.896994 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf" event={"ID":"5f2718f7-100d-4eb9-95e7-c0876b684457","Type":"ContainerStarted","Data":"2c25137102c78df3c19965667bb6743686a1354dd8916e2123cbd063eab53844"} Apr 16 14:08:13.898377 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:13.898357 2580 generic.go:358] "Generic (PLEG): container finished" podID="3138a678-a10b-491b-a93d-627060e56cfd" containerID="74426d967e1acb3dc1aab7944d9887fa836f8b285da4dba4651a3a2dec03e31d" exitCode=0 Apr 16 14:08:13.898489 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:13.898435 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr" event={"ID":"3138a678-a10b-491b-a93d-627060e56cfd","Type":"ContainerDied","Data":"74426d967e1acb3dc1aab7944d9887fa836f8b285da4dba4651a3a2dec03e31d"} Apr 16 14:08:13.898489 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:13.898457 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr" event={"ID":"3138a678-a10b-491b-a93d-627060e56cfd","Type":"ContainerStarted","Data":"779dd07fb11b47c3e5c3865f115894fc7960ba00dbc101e5e4ee7677e74623b1"} Apr 16 14:08:14.904121 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:14.904087 
2580 generic.go:358] "Generic (PLEG): container finished" podID="14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1" containerID="2795ee250a9ad58be6222a27a6a6abd654f9c6790cf491bd65e3e7df7a5a954b" exitCode=0 Apr 16 14:08:14.904488 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:14.904213 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p" event={"ID":"14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1","Type":"ContainerDied","Data":"2795ee250a9ad58be6222a27a6a6abd654f9c6790cf491bd65e3e7df7a5a954b"} Apr 16 14:08:15.909085 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:15.909053 2580 generic.go:358] "Generic (PLEG): container finished" podID="898e9c4d-742c-4786-837b-8d440551720d" containerID="2247ae4932b3c3e6f98dac740b675fc9c12d6cb61a61d9254549e7cece55aadf" exitCode=0 Apr 16 14:08:15.909581 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:15.909146 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7" event={"ID":"898e9c4d-742c-4786-837b-8d440551720d","Type":"ContainerDied","Data":"2247ae4932b3c3e6f98dac740b675fc9c12d6cb61a61d9254549e7cece55aadf"} Apr 16 14:08:15.910917 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:15.910839 2580 generic.go:358] "Generic (PLEG): container finished" podID="5f2718f7-100d-4eb9-95e7-c0876b684457" containerID="dfee8591527a0c200580a32062f0c61804a511f19a733bb3e9a9e7a230046295" exitCode=0 Apr 16 14:08:15.910989 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:15.910951 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf" event={"ID":"5f2718f7-100d-4eb9-95e7-c0876b684457","Type":"ContainerDied","Data":"dfee8591527a0c200580a32062f0c61804a511f19a733bb3e9a9e7a230046295"} Apr 16 14:08:15.912575 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:15.912553 2580 generic.go:358] "Generic (PLEG): container 
finished" podID="3138a678-a10b-491b-a93d-627060e56cfd" containerID="8af46cf38fd5201332f90dfce97c1ee3734cd9428e645b7ff3f069eb2c74986e" exitCode=0 Apr 16 14:08:15.912663 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:15.912604 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr" event={"ID":"3138a678-a10b-491b-a93d-627060e56cfd","Type":"ContainerDied","Data":"8af46cf38fd5201332f90dfce97c1ee3734cd9428e645b7ff3f069eb2c74986e"} Apr 16 14:08:15.918567 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:15.918527 2580 generic.go:358] "Generic (PLEG): container finished" podID="14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1" containerID="e5b5b7e0d6051e0daa1ab370aa63d4ef53227a853ce856aef7ec3751de88151c" exitCode=0 Apr 16 14:08:15.918674 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:15.918587 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p" event={"ID":"14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1","Type":"ContainerDied","Data":"e5b5b7e0d6051e0daa1ab370aa63d4ef53227a853ce856aef7ec3751de88151c"} Apr 16 14:08:16.923732 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:16.923696 2580 generic.go:358] "Generic (PLEG): container finished" podID="5f2718f7-100d-4eb9-95e7-c0876b684457" containerID="2987578dff71f6bf62967c5427251c5bc0be3ba70aabbd1cb26331bad05ea738" exitCode=0 Apr 16 14:08:16.924155 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:16.923774 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf" event={"ID":"5f2718f7-100d-4eb9-95e7-c0876b684457","Type":"ContainerDied","Data":"2987578dff71f6bf62967c5427251c5bc0be3ba70aabbd1cb26331bad05ea738"} Apr 16 14:08:16.925545 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:16.925522 2580 generic.go:358] "Generic (PLEG): container finished" 
podID="3138a678-a10b-491b-a93d-627060e56cfd" containerID="b3b69fa17e09d285d9c1b96eb151ae68660b990b1ecd5967ee689fbc1e5266f4" exitCode=0 Apr 16 14:08:16.925662 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:16.925613 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr" event={"ID":"3138a678-a10b-491b-a93d-627060e56cfd","Type":"ContainerDied","Data":"b3b69fa17e09d285d9c1b96eb151ae68660b990b1ecd5967ee689fbc1e5266f4"} Apr 16 14:08:16.927344 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:16.927322 2580 generic.go:358] "Generic (PLEG): container finished" podID="898e9c4d-742c-4786-837b-8d440551720d" containerID="490068eb7534b2c7ea1953c6f123dc4faf3e58737e59a68da7115d84fca72e28" exitCode=0 Apr 16 14:08:16.927440 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:16.927401 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7" event={"ID":"898e9c4d-742c-4786-837b-8d440551720d","Type":"ContainerDied","Data":"490068eb7534b2c7ea1953c6f123dc4faf3e58737e59a68da7115d84fca72e28"} Apr 16 14:08:17.051660 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:17.051638 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p" Apr 16 14:08:17.120821 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:17.120731 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1-bundle\") pod \"14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1\" (UID: \"14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1\") " Apr 16 14:08:17.120821 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:17.120774 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1-util\") pod \"14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1\" (UID: \"14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1\") " Apr 16 14:08:17.120821 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:17.120801 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qmzx\" (UniqueName: \"kubernetes.io/projected/14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1-kube-api-access-2qmzx\") pod \"14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1\" (UID: \"14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1\") " Apr 16 14:08:17.121240 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:17.121206 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1-bundle" (OuterVolumeSpecName: "bundle") pod "14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1" (UID: "14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:08:17.123089 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:17.123061 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1-kube-api-access-2qmzx" (OuterVolumeSpecName: "kube-api-access-2qmzx") pod "14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1" (UID: "14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1"). InnerVolumeSpecName "kube-api-access-2qmzx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:08:17.126401 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:17.126375 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1-util" (OuterVolumeSpecName: "util") pod "14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1" (UID: "14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:08:17.222020 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:17.221985 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1-bundle\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:08:17.222020 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:17.222016 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1-util\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:08:17.222020 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:17.222025 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2qmzx\" (UniqueName: \"kubernetes.io/projected/14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1-kube-api-access-2qmzx\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:08:17.932474 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:17.932443 2580 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p" Apr 16 14:08:17.932474 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:17.932465 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bfk64p" event={"ID":"14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1","Type":"ContainerDied","Data":"f654acc4275160e650c1a77e811fc2f654c719578d9ac4bc212ea44f19a8a2c6"} Apr 16 14:08:17.932923 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:17.932497 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f654acc4275160e650c1a77e811fc2f654c719578d9ac4bc212ea44f19a8a2c6" Apr 16 14:08:18.079945 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.079923 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr" Apr 16 14:08:18.118868 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.118847 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf" Apr 16 14:08:18.122064 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.122043 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7" Apr 16 14:08:18.129581 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.129561 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3138a678-a10b-491b-a93d-627060e56cfd-util\") pod \"3138a678-a10b-491b-a93d-627060e56cfd\" (UID: \"3138a678-a10b-491b-a93d-627060e56cfd\") " Apr 16 14:08:18.129669 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.129593 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vc57\" (UniqueName: \"kubernetes.io/projected/3138a678-a10b-491b-a93d-627060e56cfd-kube-api-access-6vc57\") pod \"3138a678-a10b-491b-a93d-627060e56cfd\" (UID: \"3138a678-a10b-491b-a93d-627060e56cfd\") " Apr 16 14:08:18.129703 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.129668 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3138a678-a10b-491b-a93d-627060e56cfd-bundle\") pod \"3138a678-a10b-491b-a93d-627060e56cfd\" (UID: \"3138a678-a10b-491b-a93d-627060e56cfd\") " Apr 16 14:08:18.130152 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.130128 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3138a678-a10b-491b-a93d-627060e56cfd-bundle" (OuterVolumeSpecName: "bundle") pod "3138a678-a10b-491b-a93d-627060e56cfd" (UID: "3138a678-a10b-491b-a93d-627060e56cfd"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:08:18.131883 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.131862 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3138a678-a10b-491b-a93d-627060e56cfd-kube-api-access-6vc57" (OuterVolumeSpecName: "kube-api-access-6vc57") pod "3138a678-a10b-491b-a93d-627060e56cfd" (UID: "3138a678-a10b-491b-a93d-627060e56cfd"). InnerVolumeSpecName "kube-api-access-6vc57". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:08:18.135050 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.135031 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3138a678-a10b-491b-a93d-627060e56cfd-util" (OuterVolumeSpecName: "util") pod "3138a678-a10b-491b-a93d-627060e56cfd" (UID: "3138a678-a10b-491b-a93d-627060e56cfd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:08:18.230492 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.230407 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrds6\" (UniqueName: \"kubernetes.io/projected/898e9c4d-742c-4786-837b-8d440551720d-kube-api-access-xrds6\") pod \"898e9c4d-742c-4786-837b-8d440551720d\" (UID: \"898e9c4d-742c-4786-837b-8d440551720d\") " Apr 16 14:08:18.230492 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.230451 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f2718f7-100d-4eb9-95e7-c0876b684457-util\") pod \"5f2718f7-100d-4eb9-95e7-c0876b684457\" (UID: \"5f2718f7-100d-4eb9-95e7-c0876b684457\") " Apr 16 14:08:18.230492 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.230474 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/898e9c4d-742c-4786-837b-8d440551720d-bundle\") pod 
\"898e9c4d-742c-4786-837b-8d440551720d\" (UID: \"898e9c4d-742c-4786-837b-8d440551720d\") " Apr 16 14:08:18.230758 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.230506 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f2718f7-100d-4eb9-95e7-c0876b684457-bundle\") pod \"5f2718f7-100d-4eb9-95e7-c0876b684457\" (UID: \"5f2718f7-100d-4eb9-95e7-c0876b684457\") " Apr 16 14:08:18.230758 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.230537 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/898e9c4d-742c-4786-837b-8d440551720d-util\") pod \"898e9c4d-742c-4786-837b-8d440551720d\" (UID: \"898e9c4d-742c-4786-837b-8d440551720d\") " Apr 16 14:08:18.230758 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.230566 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khg8w\" (UniqueName: \"kubernetes.io/projected/5f2718f7-100d-4eb9-95e7-c0876b684457-kube-api-access-khg8w\") pod \"5f2718f7-100d-4eb9-95e7-c0876b684457\" (UID: \"5f2718f7-100d-4eb9-95e7-c0876b684457\") " Apr 16 14:08:18.230895 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.230800 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3138a678-a10b-491b-a93d-627060e56cfd-util\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:08:18.230895 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.230820 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6vc57\" (UniqueName: \"kubernetes.io/projected/3138a678-a10b-491b-a93d-627060e56cfd-kube-api-access-6vc57\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:08:18.230895 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.230836 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3138a678-a10b-491b-a93d-627060e56cfd-bundle\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:08:18.231318 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.231289 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/898e9c4d-742c-4786-837b-8d440551720d-bundle" (OuterVolumeSpecName: "bundle") pod "898e9c4d-742c-4786-837b-8d440551720d" (UID: "898e9c4d-742c-4786-837b-8d440551720d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:08:18.231696 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.231667 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f2718f7-100d-4eb9-95e7-c0876b684457-bundle" (OuterVolumeSpecName: "bundle") pod "5f2718f7-100d-4eb9-95e7-c0876b684457" (UID: "5f2718f7-100d-4eb9-95e7-c0876b684457"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:08:18.233564 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.233538 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/898e9c4d-742c-4786-837b-8d440551720d-kube-api-access-xrds6" (OuterVolumeSpecName: "kube-api-access-xrds6") pod "898e9c4d-742c-4786-837b-8d440551720d" (UID: "898e9c4d-742c-4786-837b-8d440551720d"). InnerVolumeSpecName "kube-api-access-xrds6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:08:18.233564 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.233546 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f2718f7-100d-4eb9-95e7-c0876b684457-kube-api-access-khg8w" (OuterVolumeSpecName: "kube-api-access-khg8w") pod "5f2718f7-100d-4eb9-95e7-c0876b684457" (UID: "5f2718f7-100d-4eb9-95e7-c0876b684457"). InnerVolumeSpecName "kube-api-access-khg8w". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:08:18.236059 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.236026 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f2718f7-100d-4eb9-95e7-c0876b684457-util" (OuterVolumeSpecName: "util") pod "5f2718f7-100d-4eb9-95e7-c0876b684457" (UID: "5f2718f7-100d-4eb9-95e7-c0876b684457"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:08:18.236638 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.236619 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/898e9c4d-742c-4786-837b-8d440551720d-util" (OuterVolumeSpecName: "util") pod "898e9c4d-742c-4786-837b-8d440551720d" (UID: "898e9c4d-742c-4786-837b-8d440551720d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:08:18.331349 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.331312 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f2718f7-100d-4eb9-95e7-c0876b684457-bundle\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:08:18.331349 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.331339 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/898e9c4d-742c-4786-837b-8d440551720d-util\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:08:18.331349 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.331348 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-khg8w\" (UniqueName: \"kubernetes.io/projected/5f2718f7-100d-4eb9-95e7-c0876b684457-kube-api-access-khg8w\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:08:18.331349 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.331359 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xrds6\" (UniqueName: 
\"kubernetes.io/projected/898e9c4d-742c-4786-837b-8d440551720d-kube-api-access-xrds6\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:08:18.331611 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.331368 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f2718f7-100d-4eb9-95e7-c0876b684457-util\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:08:18.331611 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.331386 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/898e9c4d-742c-4786-837b-8d440551720d-bundle\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:08:18.938414 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.938375 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7" event={"ID":"898e9c4d-742c-4786-837b-8d440551720d","Type":"ContainerDied","Data":"1352605b80a90314e4a0970483e2ac5fe7090f40520f54405341b380dce0da65"} Apr 16 14:08:18.938414 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.938412 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1352605b80a90314e4a0970483e2ac5fe7090f40520f54405341b380dce0da65" Apr 16 14:08:18.938414 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.938418 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30dmfg7" Apr 16 14:08:18.940194 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.940170 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf" Apr 16 14:08:18.940356 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.940169 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lltmf" event={"ID":"5f2718f7-100d-4eb9-95e7-c0876b684457","Type":"ContainerDied","Data":"2c25137102c78df3c19965667bb6743686a1354dd8916e2123cbd063eab53844"} Apr 16 14:08:18.940356 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.940303 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c25137102c78df3c19965667bb6743686a1354dd8916e2123cbd063eab53844" Apr 16 14:08:18.941930 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.941909 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr" event={"ID":"3138a678-a10b-491b-a93d-627060e56cfd","Type":"ContainerDied","Data":"779dd07fb11b47c3e5c3865f115894fc7960ba00dbc101e5e4ee7677e74623b1"} Apr 16 14:08:18.941930 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.941931 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="779dd07fb11b47c3e5c3865f115894fc7960ba00dbc101e5e4ee7677e74623b1" Apr 16 14:08:18.942083 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:18.941979 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88xmjcr" Apr 16 14:08:31.204681 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.204596 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-st69q"] Apr 16 14:08:31.205218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205103 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3138a678-a10b-491b-a93d-627060e56cfd" containerName="extract" Apr 16 14:08:31.205218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205122 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3138a678-a10b-491b-a93d-627060e56cfd" containerName="extract" Apr 16 14:08:31.205218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205136 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3138a678-a10b-491b-a93d-627060e56cfd" containerName="util" Apr 16 14:08:31.205218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205145 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3138a678-a10b-491b-a93d-627060e56cfd" containerName="util" Apr 16 14:08:31.205218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205155 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1" containerName="pull" Apr 16 14:08:31.205218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205163 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1" containerName="pull" Apr 16 14:08:31.205218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205178 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3138a678-a10b-491b-a93d-627060e56cfd" containerName="pull" Apr 16 14:08:31.205218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205186 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3138a678-a10b-491b-a93d-627060e56cfd" 
containerName="pull" Apr 16 14:08:31.205218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205205 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1" containerName="util" Apr 16 14:08:31.205218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205212 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1" containerName="util" Apr 16 14:08:31.205218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205221 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="898e9c4d-742c-4786-837b-8d440551720d" containerName="util" Apr 16 14:08:31.205778 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205229 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="898e9c4d-742c-4786-837b-8d440551720d" containerName="util" Apr 16 14:08:31.205778 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205239 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f2718f7-100d-4eb9-95e7-c0876b684457" containerName="extract" Apr 16 14:08:31.205778 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205246 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f2718f7-100d-4eb9-95e7-c0876b684457" containerName="extract" Apr 16 14:08:31.205778 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205255 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="898e9c4d-742c-4786-837b-8d440551720d" containerName="pull" Apr 16 14:08:31.205778 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205264 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="898e9c4d-742c-4786-837b-8d440551720d" containerName="pull" Apr 16 14:08:31.205778 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205305 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f2718f7-100d-4eb9-95e7-c0876b684457" containerName="util" Apr 16 14:08:31.205778 ip-10-0-129-3 
kubenswrapper[2580]: I0416 14:08:31.205313 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f2718f7-100d-4eb9-95e7-c0876b684457" containerName="util" Apr 16 14:08:31.205778 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205329 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f2718f7-100d-4eb9-95e7-c0876b684457" containerName="pull" Apr 16 14:08:31.205778 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205336 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f2718f7-100d-4eb9-95e7-c0876b684457" containerName="pull" Apr 16 14:08:31.205778 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205343 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1" containerName="extract" Apr 16 14:08:31.205778 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205353 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1" containerName="extract" Apr 16 14:08:31.205778 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205364 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="898e9c4d-742c-4786-837b-8d440551720d" containerName="extract" Apr 16 14:08:31.205778 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205371 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="898e9c4d-742c-4786-837b-8d440551720d" containerName="extract" Apr 16 14:08:31.205778 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205456 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="3138a678-a10b-491b-a93d-627060e56cfd" containerName="extract" Apr 16 14:08:31.205778 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205468 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="14a7a8c2-0fb0-40cf-9149-e1f9476ea6f1" containerName="extract" Apr 16 14:08:31.205778 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205480 2580 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="5f2718f7-100d-4eb9-95e7-c0876b684457" containerName="extract" Apr 16 14:08:31.205778 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.205493 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="898e9c4d-742c-4786-837b-8d440551720d" containerName="extract" Apr 16 14:08:31.209213 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.209191 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-st69q" Apr 16 14:08:31.211878 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.211857 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 14:08:31.211991 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.211963 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 14:08:31.212039 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.211964 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-825jj\"" Apr 16 14:08:31.212039 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.212021 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 16 14:08:31.218692 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.218663 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-st69q"] Apr 16 14:08:31.340625 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.340589 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqkdq\" (UniqueName: \"kubernetes.io/projected/8efcabe6-6814-420b-9c29-97a711033251-kube-api-access-xqkdq\") pod \"dns-operator-controller-manager-844548ff4c-st69q\" (UID: 
\"8efcabe6-6814-420b-9c29-97a711033251\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-st69q" Apr 16 14:08:31.441428 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.441392 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqkdq\" (UniqueName: \"kubernetes.io/projected/8efcabe6-6814-420b-9c29-97a711033251-kube-api-access-xqkdq\") pod \"dns-operator-controller-manager-844548ff4c-st69q\" (UID: \"8efcabe6-6814-420b-9c29-97a711033251\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-st69q" Apr 16 14:08:31.453044 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.453009 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqkdq\" (UniqueName: \"kubernetes.io/projected/8efcabe6-6814-420b-9c29-97a711033251-kube-api-access-xqkdq\") pod \"dns-operator-controller-manager-844548ff4c-st69q\" (UID: \"8efcabe6-6814-420b-9c29-97a711033251\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-st69q" Apr 16 14:08:31.519867 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.519785 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-st69q" Apr 16 14:08:31.652363 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.652331 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-st69q"] Apr 16 14:08:31.653768 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:08:31.653743 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8efcabe6_6814_420b_9c29_97a711033251.slice/crio-b18ec91af8ea187a6976f4a6d1640642f9508bd28af0580a333fbdcc6b091883 WatchSource:0}: Error finding container b18ec91af8ea187a6976f4a6d1640642f9508bd28af0580a333fbdcc6b091883: Status 404 returned error can't find the container with id b18ec91af8ea187a6976f4a6d1640642f9508bd28af0580a333fbdcc6b091883 Apr 16 14:08:31.989554 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:31.989511 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-st69q" event={"ID":"8efcabe6-6814-420b-9c29-97a711033251","Type":"ContainerStarted","Data":"b18ec91af8ea187a6976f4a6d1640642f9508bd28af0580a333fbdcc6b091883"} Apr 16 14:08:33.483494 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:33.483455 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-l48sb"] Apr 16 14:08:33.491601 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:33.491561 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-l48sb"] Apr 16 14:08:33.491767 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:33.491697 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-l48sb" Apr 16 14:08:33.494679 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:33.494646 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 16 14:08:33.494815 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:33.494733 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 16 14:08:33.494815 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:33.494751 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-5hx64\"" Apr 16 14:08:33.561413 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:33.561378 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/68459a75-dbc4-442c-9ed9-f18277a1b21d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-l48sb\" (UID: \"68459a75-dbc4-442c-9ed9-f18277a1b21d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-l48sb" Apr 16 14:08:33.561585 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:33.561450 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/68459a75-dbc4-442c-9ed9-f18277a1b21d-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-l48sb\" (UID: \"68459a75-dbc4-442c-9ed9-f18277a1b21d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-l48sb" Apr 16 14:08:33.561585 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:33.561483 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkcnb\" (UniqueName: \"kubernetes.io/projected/68459a75-dbc4-442c-9ed9-f18277a1b21d-kube-api-access-fkcnb\") pod \"kuadrant-console-plugin-6c886788f8-l48sb\" (UID: 
\"68459a75-dbc4-442c-9ed9-f18277a1b21d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-l48sb" Apr 16 14:08:33.662458 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:33.662425 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/68459a75-dbc4-442c-9ed9-f18277a1b21d-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-l48sb\" (UID: \"68459a75-dbc4-442c-9ed9-f18277a1b21d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-l48sb" Apr 16 14:08:33.662658 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:33.662465 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkcnb\" (UniqueName: \"kubernetes.io/projected/68459a75-dbc4-442c-9ed9-f18277a1b21d-kube-api-access-fkcnb\") pod \"kuadrant-console-plugin-6c886788f8-l48sb\" (UID: \"68459a75-dbc4-442c-9ed9-f18277a1b21d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-l48sb" Apr 16 14:08:33.662658 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:33.662618 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/68459a75-dbc4-442c-9ed9-f18277a1b21d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-l48sb\" (UID: \"68459a75-dbc4-442c-9ed9-f18277a1b21d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-l48sb" Apr 16 14:08:33.663127 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:33.663104 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/68459a75-dbc4-442c-9ed9-f18277a1b21d-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-l48sb\" (UID: \"68459a75-dbc4-442c-9ed9-f18277a1b21d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-l48sb" Apr 16 14:08:33.665351 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:33.665331 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/68459a75-dbc4-442c-9ed9-f18277a1b21d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-l48sb\" (UID: \"68459a75-dbc4-442c-9ed9-f18277a1b21d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-l48sb" Apr 16 14:08:33.672449 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:33.672421 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkcnb\" (UniqueName: \"kubernetes.io/projected/68459a75-dbc4-442c-9ed9-f18277a1b21d-kube-api-access-fkcnb\") pod \"kuadrant-console-plugin-6c886788f8-l48sb\" (UID: \"68459a75-dbc4-442c-9ed9-f18277a1b21d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-l48sb" Apr 16 14:08:33.812231 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:33.812145 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-l48sb" Apr 16 14:08:34.476567 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:34.476546 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-l48sb"] Apr 16 14:08:34.478370 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:08:34.478342 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68459a75_dbc4_442c_9ed9_f18277a1b21d.slice/crio-3ce264285f5d420469b96004282acf536082ffd66a527b94d70e2d74121affa5 WatchSource:0}: Error finding container 3ce264285f5d420469b96004282acf536082ffd66a527b94d70e2d74121affa5: Status 404 returned error can't find the container with id 3ce264285f5d420469b96004282acf536082ffd66a527b94d70e2d74121affa5 Apr 16 14:08:35.003651 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:35.003594 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-st69q" 
event={"ID":"8efcabe6-6814-420b-9c29-97a711033251","Type":"ContainerStarted","Data":"ca79fcc0b4f69a467498e3d52b95e3a4e2ee24b49cb2fda0242f87fcafe72057"} Apr 16 14:08:35.004101 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:35.003725 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-st69q" Apr 16 14:08:35.004767 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:35.004747 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-l48sb" event={"ID":"68459a75-dbc4-442c-9ed9-f18277a1b21d","Type":"ContainerStarted","Data":"3ce264285f5d420469b96004282acf536082ffd66a527b94d70e2d74121affa5"} Apr 16 14:08:35.025157 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:35.025109 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-st69q" podStartSLOduration=1.261419209 podStartE2EDuration="4.025095843s" podCreationTimestamp="2026-04-16 14:08:31 +0000 UTC" firstStartedPulling="2026-04-16 14:08:31.655979619 +0000 UTC m=+535.092145329" lastFinishedPulling="2026-04-16 14:08:34.419656253 +0000 UTC m=+537.855821963" observedRunningTime="2026-04-16 14:08:35.023221434 +0000 UTC m=+538.459387162" watchObservedRunningTime="2026-04-16 14:08:35.025095843 +0000 UTC m=+538.461261570" Apr 16 14:08:40.025512 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:40.025469 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-l48sb" event={"ID":"68459a75-dbc4-442c-9ed9-f18277a1b21d","Type":"ContainerStarted","Data":"81721e220265881a9f1309f01911f99f03ca9186a11626d81afa677093ded091"} Apr 16 14:08:40.050147 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:40.050093 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-l48sb" podStartSLOduration=2.17750211 
podStartE2EDuration="7.050076103s" podCreationTimestamp="2026-04-16 14:08:33 +0000 UTC" firstStartedPulling="2026-04-16 14:08:34.47974455 +0000 UTC m=+537.915910258" lastFinishedPulling="2026-04-16 14:08:39.352318532 +0000 UTC m=+542.788484251" observedRunningTime="2026-04-16 14:08:40.048816836 +0000 UTC m=+543.484982565" watchObservedRunningTime="2026-04-16 14:08:40.050076103 +0000 UTC m=+543.486241833" Apr 16 14:08:46.011176 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:08:46.011143 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-st69q" Apr 16 14:09:14.426928 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:14.426886 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-m2pvj"] Apr 16 14:09:14.452915 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:14.452878 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-m2pvj"] Apr 16 14:09:14.453064 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:14.452997 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-m2pvj" Apr 16 14:09:14.455764 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:14.455743 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 14:09:14.461601 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:14.461573 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-m2pvj"] Apr 16 14:09:14.508693 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:14.508650 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7j7t\" (UniqueName: \"kubernetes.io/projected/69a494be-503e-4309-a889-aed428c35e00-kube-api-access-t7j7t\") pod \"limitador-limitador-67566c68b4-m2pvj\" (UID: \"69a494be-503e-4309-a889-aed428c35e00\") " pod="kuadrant-system/limitador-limitador-67566c68b4-m2pvj" Apr 16 14:09:14.508868 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:14.508732 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/69a494be-503e-4309-a889-aed428c35e00-config-file\") pod \"limitador-limitador-67566c68b4-m2pvj\" (UID: \"69a494be-503e-4309-a889-aed428c35e00\") " pod="kuadrant-system/limitador-limitador-67566c68b4-m2pvj" Apr 16 14:09:14.610040 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:14.609998 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/69a494be-503e-4309-a889-aed428c35e00-config-file\") pod \"limitador-limitador-67566c68b4-m2pvj\" (UID: \"69a494be-503e-4309-a889-aed428c35e00\") " pod="kuadrant-system/limitador-limitador-67566c68b4-m2pvj" Apr 16 14:09:14.610243 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:14.610083 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-t7j7t\" (UniqueName: \"kubernetes.io/projected/69a494be-503e-4309-a889-aed428c35e00-kube-api-access-t7j7t\") pod \"limitador-limitador-67566c68b4-m2pvj\" (UID: \"69a494be-503e-4309-a889-aed428c35e00\") " pod="kuadrant-system/limitador-limitador-67566c68b4-m2pvj" Apr 16 14:09:14.610718 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:14.610693 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/69a494be-503e-4309-a889-aed428c35e00-config-file\") pod \"limitador-limitador-67566c68b4-m2pvj\" (UID: \"69a494be-503e-4309-a889-aed428c35e00\") " pod="kuadrant-system/limitador-limitador-67566c68b4-m2pvj" Apr 16 14:09:14.619171 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:14.619143 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7j7t\" (UniqueName: \"kubernetes.io/projected/69a494be-503e-4309-a889-aed428c35e00-kube-api-access-t7j7t\") pod \"limitador-limitador-67566c68b4-m2pvj\" (UID: \"69a494be-503e-4309-a889-aed428c35e00\") " pod="kuadrant-system/limitador-limitador-67566c68b4-m2pvj" Apr 16 14:09:14.764628 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:14.764545 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-m2pvj" Apr 16 14:09:14.895485 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:14.895464 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-m2pvj"] Apr 16 14:09:14.897867 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:09:14.897829 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69a494be_503e_4309_a889_aed428c35e00.slice/crio-e4fe3001e5c85e9d02d0c59d26648aef2a6b881dc5415b2337d7feae0f0fb020 WatchSource:0}: Error finding container e4fe3001e5c85e9d02d0c59d26648aef2a6b881dc5415b2337d7feae0f0fb020: Status 404 returned error can't find the container with id e4fe3001e5c85e9d02d0c59d26648aef2a6b881dc5415b2337d7feae0f0fb020 Apr 16 14:09:14.945637 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:14.945608 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-thptb"] Apr 16 14:09:14.950413 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:14.950393 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-thptb" Apr 16 14:09:14.953016 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:14.952992 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-p96kd\"" Apr 16 14:09:14.955166 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:14.955142 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-thptb"] Apr 16 14:09:15.013039 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:15.013003 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2dkc\" (UniqueName: \"kubernetes.io/projected/7500633f-6d95-4e37-8094-3310c6a8b18a-kube-api-access-q2dkc\") pod \"authorino-674b59b84c-thptb\" (UID: \"7500633f-6d95-4e37-8094-3310c6a8b18a\") " pod="kuadrant-system/authorino-674b59b84c-thptb" Apr 16 14:09:15.113803 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:15.113768 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2dkc\" (UniqueName: \"kubernetes.io/projected/7500633f-6d95-4e37-8094-3310c6a8b18a-kube-api-access-q2dkc\") pod \"authorino-674b59b84c-thptb\" (UID: \"7500633f-6d95-4e37-8094-3310c6a8b18a\") " pod="kuadrant-system/authorino-674b59b84c-thptb" Apr 16 14:09:15.122537 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:15.122512 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2dkc\" (UniqueName: \"kubernetes.io/projected/7500633f-6d95-4e37-8094-3310c6a8b18a-kube-api-access-q2dkc\") pod \"authorino-674b59b84c-thptb\" (UID: \"7500633f-6d95-4e37-8094-3310c6a8b18a\") " pod="kuadrant-system/authorino-674b59b84c-thptb" Apr 16 14:09:15.153947 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:15.153915 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-m2pvj" 
event={"ID":"69a494be-503e-4309-a889-aed428c35e00","Type":"ContainerStarted","Data":"e4fe3001e5c85e9d02d0c59d26648aef2a6b881dc5415b2337d7feae0f0fb020"} Apr 16 14:09:15.260686 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:15.260651 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-thptb" Apr 16 14:09:15.385683 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:15.385662 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-thptb"] Apr 16 14:09:15.387362 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:09:15.387336 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7500633f_6d95_4e37_8094_3310c6a8b18a.slice/crio-21efd6d4e574a84e9e0534db93d84531f871b1ad64893befd97089022664f068 WatchSource:0}: Error finding container 21efd6d4e574a84e9e0534db93d84531f871b1ad64893befd97089022664f068: Status 404 returned error can't find the container with id 21efd6d4e574a84e9e0534db93d84531f871b1ad64893befd97089022664f068 Apr 16 14:09:16.159568 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:16.159521 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-thptb" event={"ID":"7500633f-6d95-4e37-8094-3310c6a8b18a","Type":"ContainerStarted","Data":"21efd6d4e574a84e9e0534db93d84531f871b1ad64893befd97089022664f068"} Apr 16 14:09:18.714074 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:18.714034 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-thptb"] Apr 16 14:09:19.172373 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:19.172335 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-m2pvj" event={"ID":"69a494be-503e-4309-a889-aed428c35e00","Type":"ContainerStarted","Data":"a154c1e362482860966d7941cc38a01dbb4ef68db26367cfc341beb7feb21f75"} Apr 16 14:09:19.172534 
ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:19.172440 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-m2pvj" Apr 16 14:09:19.173687 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:19.173663 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-thptb" event={"ID":"7500633f-6d95-4e37-8094-3310c6a8b18a","Type":"ContainerStarted","Data":"dcae0a6e954239017f9566ee25cd3c8cb84b5d3972c6b40911bfbab57472461e"} Apr 16 14:09:19.191690 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:19.191642 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-m2pvj" podStartSLOduration=1.055634278 podStartE2EDuration="5.191626946s" podCreationTimestamp="2026-04-16 14:09:14 +0000 UTC" firstStartedPulling="2026-04-16 14:09:14.90021148 +0000 UTC m=+578.336377186" lastFinishedPulling="2026-04-16 14:09:19.036204135 +0000 UTC m=+582.472369854" observedRunningTime="2026-04-16 14:09:19.190097614 +0000 UTC m=+582.626263343" watchObservedRunningTime="2026-04-16 14:09:19.191626946 +0000 UTC m=+582.627792674" Apr 16 14:09:19.207312 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:19.207239 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-thptb" podStartSLOduration=2.102075432 podStartE2EDuration="5.207225507s" podCreationTimestamp="2026-04-16 14:09:14 +0000 UTC" firstStartedPulling="2026-04-16 14:09:15.388866823 +0000 UTC m=+578.825032529" lastFinishedPulling="2026-04-16 14:09:18.494016889 +0000 UTC m=+581.930182604" observedRunningTime="2026-04-16 14:09:19.205408261 +0000 UTC m=+582.641573989" watchObservedRunningTime="2026-04-16 14:09:19.207225507 +0000 UTC m=+582.643391282" Apr 16 14:09:20.177305 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:20.177252 2580 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kuadrant-system/authorino-674b59b84c-thptb" podUID="7500633f-6d95-4e37-8094-3310c6a8b18a" containerName="authorino" containerID="cri-o://dcae0a6e954239017f9566ee25cd3c8cb84b5d3972c6b40911bfbab57472461e" gracePeriod=30 Apr 16 14:09:20.422816 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:20.422794 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-thptb" Apr 16 14:09:20.461699 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:20.461617 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2dkc\" (UniqueName: \"kubernetes.io/projected/7500633f-6d95-4e37-8094-3310c6a8b18a-kube-api-access-q2dkc\") pod \"7500633f-6d95-4e37-8094-3310c6a8b18a\" (UID: \"7500633f-6d95-4e37-8094-3310c6a8b18a\") " Apr 16 14:09:20.463943 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:20.463909 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7500633f-6d95-4e37-8094-3310c6a8b18a-kube-api-access-q2dkc" (OuterVolumeSpecName: "kube-api-access-q2dkc") pod "7500633f-6d95-4e37-8094-3310c6a8b18a" (UID: "7500633f-6d95-4e37-8094-3310c6a8b18a"). InnerVolumeSpecName "kube-api-access-q2dkc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:09:20.562137 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:20.562085 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q2dkc\" (UniqueName: \"kubernetes.io/projected/7500633f-6d95-4e37-8094-3310c6a8b18a-kube-api-access-q2dkc\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:09:21.182059 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:21.182024 2580 generic.go:358] "Generic (PLEG): container finished" podID="7500633f-6d95-4e37-8094-3310c6a8b18a" containerID="dcae0a6e954239017f9566ee25cd3c8cb84b5d3972c6b40911bfbab57472461e" exitCode=0 Apr 16 14:09:21.182505 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:21.182074 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-thptb" Apr 16 14:09:21.182505 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:21.182109 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-thptb" event={"ID":"7500633f-6d95-4e37-8094-3310c6a8b18a","Type":"ContainerDied","Data":"dcae0a6e954239017f9566ee25cd3c8cb84b5d3972c6b40911bfbab57472461e"} Apr 16 14:09:21.182505 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:21.182146 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-thptb" event={"ID":"7500633f-6d95-4e37-8094-3310c6a8b18a","Type":"ContainerDied","Data":"21efd6d4e574a84e9e0534db93d84531f871b1ad64893befd97089022664f068"} Apr 16 14:09:21.182505 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:21.182162 2580 scope.go:117] "RemoveContainer" containerID="dcae0a6e954239017f9566ee25cd3c8cb84b5d3972c6b40911bfbab57472461e" Apr 16 14:09:21.191390 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:21.191370 2580 scope.go:117] "RemoveContainer" containerID="dcae0a6e954239017f9566ee25cd3c8cb84b5d3972c6b40911bfbab57472461e" Apr 16 14:09:21.191660 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:09:21.191642 
2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcae0a6e954239017f9566ee25cd3c8cb84b5d3972c6b40911bfbab57472461e\": container with ID starting with dcae0a6e954239017f9566ee25cd3c8cb84b5d3972c6b40911bfbab57472461e not found: ID does not exist" containerID="dcae0a6e954239017f9566ee25cd3c8cb84b5d3972c6b40911bfbab57472461e" Apr 16 14:09:21.191711 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:21.191669 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcae0a6e954239017f9566ee25cd3c8cb84b5d3972c6b40911bfbab57472461e"} err="failed to get container status \"dcae0a6e954239017f9566ee25cd3c8cb84b5d3972c6b40911bfbab57472461e\": rpc error: code = NotFound desc = could not find container \"dcae0a6e954239017f9566ee25cd3c8cb84b5d3972c6b40911bfbab57472461e\": container with ID starting with dcae0a6e954239017f9566ee25cd3c8cb84b5d3972c6b40911bfbab57472461e not found: ID does not exist" Apr 16 14:09:21.198893 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:21.198870 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-thptb"] Apr 16 14:09:21.202386 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:21.202364 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-thptb"] Apr 16 14:09:23.152657 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:23.152624 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7500633f-6d95-4e37-8094-3310c6a8b18a" path="/var/lib/kubelet/pods/7500633f-6d95-4e37-8094-3310c6a8b18a/volumes" Apr 16 14:09:30.178213 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:30.178184 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-m2pvj" Apr 16 14:09:37.058291 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:37.058239 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-mqmmd_49030659-7d98-49ee-844f-41ff4d22d449/console-operator/1.log" Apr 16 14:09:37.059965 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:09:37.059938 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-mqmmd_49030659-7d98-49ee-844f-41ff4d22d449/console-operator/1.log" Apr 16 14:11:25.876564 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:25.876479 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-wdq45"] Apr 16 14:11:25.877030 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:25.877014 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7500633f-6d95-4e37-8094-3310c6a8b18a" containerName="authorino" Apr 16 14:11:25.877072 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:25.877034 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7500633f-6d95-4e37-8094-3310c6a8b18a" containerName="authorino" Apr 16 14:11:25.877143 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:25.877133 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="7500633f-6d95-4e37-8094-3310c6a8b18a" containerName="authorino" Apr 16 14:11:25.880433 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:25.880415 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-wdq45" Apr 16 14:11:25.882795 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:25.882771 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-vk2km\"" Apr 16 14:11:25.883034 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:25.882772 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 14:11:25.883126 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:25.882904 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 14:11:25.883222 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:25.882943 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 14:11:25.884558 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:25.884532 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-wdq45"] Apr 16 14:11:25.939020 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:25.938983 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4t5d\" (UniqueName: \"kubernetes.io/projected/48fcba8d-e02e-401a-a8c8-04df4abb087a-kube-api-access-z4t5d\") pod \"s3-init-wdq45\" (UID: \"48fcba8d-e02e-401a-a8c8-04df4abb087a\") " pod="kserve/s3-init-wdq45" Apr 16 14:11:26.040321 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:26.040262 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4t5d\" (UniqueName: \"kubernetes.io/projected/48fcba8d-e02e-401a-a8c8-04df4abb087a-kube-api-access-z4t5d\") pod \"s3-init-wdq45\" (UID: \"48fcba8d-e02e-401a-a8c8-04df4abb087a\") " pod="kserve/s3-init-wdq45" Apr 16 14:11:26.048673 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:26.048641 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4t5d\" (UniqueName: 
\"kubernetes.io/projected/48fcba8d-e02e-401a-a8c8-04df4abb087a-kube-api-access-z4t5d\") pod \"s3-init-wdq45\" (UID: \"48fcba8d-e02e-401a-a8c8-04df4abb087a\") " pod="kserve/s3-init-wdq45" Apr 16 14:11:26.191730 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:26.191354 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-wdq45" Apr 16 14:11:26.321309 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:26.321281 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-wdq45"] Apr 16 14:11:26.323599 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:11:26.323564 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48fcba8d_e02e_401a_a8c8_04df4abb087a.slice/crio-86037e26061e2bd490ca5731be28b01cdf42258032b30861864cf0cc812a2471 WatchSource:0}: Error finding container 86037e26061e2bd490ca5731be28b01cdf42258032b30861864cf0cc812a2471: Status 404 returned error can't find the container with id 86037e26061e2bd490ca5731be28b01cdf42258032b30861864cf0cc812a2471 Apr 16 14:11:26.325511 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:26.325492 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:11:26.654514 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:26.654482 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-wdq45" event={"ID":"48fcba8d-e02e-401a-a8c8-04df4abb087a","Type":"ContainerStarted","Data":"86037e26061e2bd490ca5731be28b01cdf42258032b30861864cf0cc812a2471"} Apr 16 14:11:31.679144 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:31.679097 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-wdq45" event={"ID":"48fcba8d-e02e-401a-a8c8-04df4abb087a","Type":"ContainerStarted","Data":"95bb2f17aad6777fd142cc2a88ce7e56cf9a42adb8b8f2bfb684f6307625269d"} Apr 16 14:11:31.694238 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:31.694183 2580 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-wdq45" podStartSLOduration=2.065958947 podStartE2EDuration="6.694163445s" podCreationTimestamp="2026-04-16 14:11:25 +0000 UTC" firstStartedPulling="2026-04-16 14:11:26.325632608 +0000 UTC m=+709.761798314" lastFinishedPulling="2026-04-16 14:11:30.953837102 +0000 UTC m=+714.390002812" observedRunningTime="2026-04-16 14:11:31.693628662 +0000 UTC m=+715.129794383" watchObservedRunningTime="2026-04-16 14:11:31.694163445 +0000 UTC m=+715.130329173" Apr 16 14:11:34.691897 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:34.691860 2580 generic.go:358] "Generic (PLEG): container finished" podID="48fcba8d-e02e-401a-a8c8-04df4abb087a" containerID="95bb2f17aad6777fd142cc2a88ce7e56cf9a42adb8b8f2bfb684f6307625269d" exitCode=0 Apr 16 14:11:34.692295 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:34.691936 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-wdq45" event={"ID":"48fcba8d-e02e-401a-a8c8-04df4abb087a","Type":"ContainerDied","Data":"95bb2f17aad6777fd142cc2a88ce7e56cf9a42adb8b8f2bfb684f6307625269d"} Apr 16 14:11:35.837598 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:35.837572 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-wdq45" Apr 16 14:11:35.934054 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:35.934020 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4t5d\" (UniqueName: \"kubernetes.io/projected/48fcba8d-e02e-401a-a8c8-04df4abb087a-kube-api-access-z4t5d\") pod \"48fcba8d-e02e-401a-a8c8-04df4abb087a\" (UID: \"48fcba8d-e02e-401a-a8c8-04df4abb087a\") " Apr 16 14:11:35.936401 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:35.936372 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48fcba8d-e02e-401a-a8c8-04df4abb087a-kube-api-access-z4t5d" (OuterVolumeSpecName: "kube-api-access-z4t5d") pod "48fcba8d-e02e-401a-a8c8-04df4abb087a" (UID: "48fcba8d-e02e-401a-a8c8-04df4abb087a"). InnerVolumeSpecName "kube-api-access-z4t5d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:11:36.035388 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:36.035259 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z4t5d\" (UniqueName: \"kubernetes.io/projected/48fcba8d-e02e-401a-a8c8-04df4abb087a-kube-api-access-z4t5d\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:11:36.700419 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:36.700383 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-wdq45" Apr 16 14:11:36.700419 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:36.700415 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-wdq45" event={"ID":"48fcba8d-e02e-401a-a8c8-04df4abb087a","Type":"ContainerDied","Data":"86037e26061e2bd490ca5731be28b01cdf42258032b30861864cf0cc812a2471"} Apr 16 14:11:36.700675 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:11:36.700447 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86037e26061e2bd490ca5731be28b01cdf42258032b30861864cf0cc812a2471" Apr 16 14:12:14.702802 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.702757 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4"] Apr 16 14:12:14.703254 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.703241 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48fcba8d-e02e-401a-a8c8-04df4abb087a" containerName="s3-init" Apr 16 14:12:14.703341 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.703260 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fcba8d-e02e-401a-a8c8-04df4abb087a" containerName="s3-init" Apr 16 14:12:14.703425 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.703413 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="48fcba8d-e02e-401a-a8c8-04df4abb087a" containerName="s3-init" Apr 16 14:12:14.706752 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.706730 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:14.709499 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.709473 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 14:12:14.710636 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.710508 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lsh9l\"" Apr 16 14:12:14.710636 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.710537 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:12:14.710636 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.710537 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\"" Apr 16 14:12:14.719119 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.719090 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4"] Apr 16 14:12:14.771114 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.771079 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:14.771114 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.771117 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frsrq\" (UniqueName: 
\"kubernetes.io/projected/92e4843b-7845-45af-bf1c-09854479c24b-kube-api-access-frsrq\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:14.771384 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.771154 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:14.771384 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.771188 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92e4843b-7845-45af-bf1c-09854479c24b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:14.771384 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.771217 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:14.771384 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.771236 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:14.872094 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.872036 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92e4843b-7845-45af-bf1c-09854479c24b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:14.872094 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.872100 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:14.872413 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.872124 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:14.872413 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.872163 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-kserve-provision-location\") pod 
\"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:14.872413 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.872181 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frsrq\" (UniqueName: \"kubernetes.io/projected/92e4843b-7845-45af-bf1c-09854479c24b-kube-api-access-frsrq\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:14.872413 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.872215 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:14.872625 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.872550 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:14.872625 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.872593 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-kserve-provision-location\") pod 
\"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:14.872728 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.872662 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:14.874628 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.874597 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:14.874920 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.874900 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92e4843b-7845-45af-bf1c-09854479c24b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:14.880053 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:14.880029 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frsrq\" (UniqueName: \"kubernetes.io/projected/92e4843b-7845-45af-bf1c-09854479c24b-kube-api-access-frsrq\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") 
" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:15.021205 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:15.021118 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:15.155084 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:15.155056 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4"] Apr 16 14:12:15.156663 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:12:15.156621 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92e4843b_7845_45af_bf1c_09854479c24b.slice/crio-e8b7f7553cbaaf51d794abbec11b6d191b0fd0803b36356fceaa4ddaea6a8183 WatchSource:0}: Error finding container e8b7f7553cbaaf51d794abbec11b6d191b0fd0803b36356fceaa4ddaea6a8183: Status 404 returned error can't find the container with id e8b7f7553cbaaf51d794abbec11b6d191b0fd0803b36356fceaa4ddaea6a8183 Apr 16 14:12:15.844999 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:15.844960 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" event={"ID":"92e4843b-7845-45af-bf1c-09854479c24b","Type":"ContainerStarted","Data":"e8b7f7553cbaaf51d794abbec11b6d191b0fd0803b36356fceaa4ddaea6a8183"} Apr 16 14:12:19.865194 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:19.865146 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" event={"ID":"92e4843b-7845-45af-bf1c-09854479c24b","Type":"ContainerStarted","Data":"11bb7d47843ad6eef73c8907326d034e175f1cdd5bd7604e8a79e5dae77ecbb4"} Apr 16 14:12:23.882186 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:23.882151 2580 generic.go:358] "Generic (PLEG): container 
finished" podID="92e4843b-7845-45af-bf1c-09854479c24b" containerID="11bb7d47843ad6eef73c8907326d034e175f1cdd5bd7604e8a79e5dae77ecbb4" exitCode=0 Apr 16 14:12:23.882576 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:23.882226 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" event={"ID":"92e4843b-7845-45af-bf1c-09854479c24b","Type":"ContainerDied","Data":"11bb7d47843ad6eef73c8907326d034e175f1cdd5bd7604e8a79e5dae77ecbb4"} Apr 16 14:12:25.893406 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:25.893370 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" event={"ID":"92e4843b-7845-45af-bf1c-09854479c24b","Type":"ContainerStarted","Data":"1887b10607461c6e87bee9862e0343a19f130eed1097d1323e10aab675270e5a"} Apr 16 14:12:25.914482 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:25.914432 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" podStartSLOduration=2.024033919 podStartE2EDuration="11.914417765s" podCreationTimestamp="2026-04-16 14:12:14 +0000 UTC" firstStartedPulling="2026-04-16 14:12:15.158304959 +0000 UTC m=+758.594470666" lastFinishedPulling="2026-04-16 14:12:25.048688795 +0000 UTC m=+768.484854512" observedRunningTime="2026-04-16 14:12:25.911867754 +0000 UTC m=+769.348033482" watchObservedRunningTime="2026-04-16 14:12:25.914417765 +0000 UTC m=+769.350583493" Apr 16 14:12:35.022015 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:35.021977 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:35.022015 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:35.022023 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:35.034936 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:35.034908 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:12:35.955139 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:12:35.955111 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:13:03.304992 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.304906 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4"] Apr 16 14:13:03.305478 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.305302 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" podUID="92e4843b-7845-45af-bf1c-09854479c24b" containerName="main" containerID="cri-o://1887b10607461c6e87bee9862e0343a19f130eed1097d1323e10aab675270e5a" gracePeriod=30 Apr 16 14:13:03.547334 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.547305 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:13:03.602706 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.602678 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-dshm\") pod \"92e4843b-7845-45af-bf1c-09854479c24b\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " Apr 16 14:13:03.602869 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.602721 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92e4843b-7845-45af-bf1c-09854479c24b-tls-certs\") pod \"92e4843b-7845-45af-bf1c-09854479c24b\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " Apr 16 14:13:03.602869 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.602755 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frsrq\" (UniqueName: \"kubernetes.io/projected/92e4843b-7845-45af-bf1c-09854479c24b-kube-api-access-frsrq\") pod \"92e4843b-7845-45af-bf1c-09854479c24b\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " Apr 16 14:13:03.602869 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.602773 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-model-cache\") pod \"92e4843b-7845-45af-bf1c-09854479c24b\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " Apr 16 14:13:03.602869 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.602814 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-kserve-provision-location\") pod \"92e4843b-7845-45af-bf1c-09854479c24b\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " Apr 16 14:13:03.602869 
ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.602855 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-home\") pod \"92e4843b-7845-45af-bf1c-09854479c24b\" (UID: \"92e4843b-7845-45af-bf1c-09854479c24b\") " Apr 16 14:13:03.603138 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.603070 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-model-cache" (OuterVolumeSpecName: "model-cache") pod "92e4843b-7845-45af-bf1c-09854479c24b" (UID: "92e4843b-7845-45af-bf1c-09854479c24b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:13:03.603209 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.603175 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-home" (OuterVolumeSpecName: "home") pod "92e4843b-7845-45af-bf1c-09854479c24b" (UID: "92e4843b-7845-45af-bf1c-09854479c24b"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:13:03.605115 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.605092 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-dshm" (OuterVolumeSpecName: "dshm") pod "92e4843b-7845-45af-bf1c-09854479c24b" (UID: "92e4843b-7845-45af-bf1c-09854479c24b"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:13:03.605195 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.605164 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92e4843b-7845-45af-bf1c-09854479c24b-kube-api-access-frsrq" (OuterVolumeSpecName: "kube-api-access-frsrq") pod "92e4843b-7845-45af-bf1c-09854479c24b" (UID: "92e4843b-7845-45af-bf1c-09854479c24b"). InnerVolumeSpecName "kube-api-access-frsrq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:13:03.605239 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.605227 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e4843b-7845-45af-bf1c-09854479c24b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "92e4843b-7845-45af-bf1c-09854479c24b" (UID: "92e4843b-7845-45af-bf1c-09854479c24b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:13:03.658295 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.658227 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "92e4843b-7845-45af-bf1c-09854479c24b" (UID: "92e4843b-7845-45af-bf1c-09854479c24b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:13:03.704237 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.704205 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-frsrq\" (UniqueName: \"kubernetes.io/projected/92e4843b-7845-45af-bf1c-09854479c24b-kube-api-access-frsrq\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:13:03.704237 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.704236 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-model-cache\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:13:03.704395 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.704248 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-kserve-provision-location\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:13:03.704395 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.704258 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-home\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:13:03.704395 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.704294 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/92e4843b-7845-45af-bf1c-09854479c24b-dshm\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:13:03.704395 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:03.704307 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92e4843b-7845-45af-bf1c-09854479c24b-tls-certs\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:13:04.065423 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:04.065335 2580 generic.go:358] "Generic (PLEG): 
container finished" podID="92e4843b-7845-45af-bf1c-09854479c24b" containerID="1887b10607461c6e87bee9862e0343a19f130eed1097d1323e10aab675270e5a" exitCode=0 Apr 16 14:13:04.065423 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:04.065402 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" Apr 16 14:13:04.065599 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:04.065424 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" event={"ID":"92e4843b-7845-45af-bf1c-09854479c24b","Type":"ContainerDied","Data":"1887b10607461c6e87bee9862e0343a19f130eed1097d1323e10aab675270e5a"} Apr 16 14:13:04.065599 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:04.065470 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4" event={"ID":"92e4843b-7845-45af-bf1c-09854479c24b","Type":"ContainerDied","Data":"e8b7f7553cbaaf51d794abbec11b6d191b0fd0803b36356fceaa4ddaea6a8183"} Apr 16 14:13:04.065599 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:04.065490 2580 scope.go:117] "RemoveContainer" containerID="1887b10607461c6e87bee9862e0343a19f130eed1097d1323e10aab675270e5a" Apr 16 14:13:04.075417 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:04.075395 2580 scope.go:117] "RemoveContainer" containerID="11bb7d47843ad6eef73c8907326d034e175f1cdd5bd7604e8a79e5dae77ecbb4" Apr 16 14:13:04.087970 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:04.087940 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4"] Apr 16 14:13:04.091626 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:04.091601 2580 scope.go:117] "RemoveContainer" containerID="1887b10607461c6e87bee9862e0343a19f130eed1097d1323e10aab675270e5a" Apr 16 14:13:04.091995 
ip-10-0-129-3 kubenswrapper[2580]: E0416 14:13:04.091973 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1887b10607461c6e87bee9862e0343a19f130eed1097d1323e10aab675270e5a\": container with ID starting with 1887b10607461c6e87bee9862e0343a19f130eed1097d1323e10aab675270e5a not found: ID does not exist" containerID="1887b10607461c6e87bee9862e0343a19f130eed1097d1323e10aab675270e5a" Apr 16 14:13:04.092066 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:04.092006 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1887b10607461c6e87bee9862e0343a19f130eed1097d1323e10aab675270e5a"} err="failed to get container status \"1887b10607461c6e87bee9862e0343a19f130eed1097d1323e10aab675270e5a\": rpc error: code = NotFound desc = could not find container \"1887b10607461c6e87bee9862e0343a19f130eed1097d1323e10aab675270e5a\": container with ID starting with 1887b10607461c6e87bee9862e0343a19f130eed1097d1323e10aab675270e5a not found: ID does not exist" Apr 16 14:13:04.092066 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:04.092032 2580 scope.go:117] "RemoveContainer" containerID="11bb7d47843ad6eef73c8907326d034e175f1cdd5bd7604e8a79e5dae77ecbb4" Apr 16 14:13:04.092312 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:13:04.092295 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11bb7d47843ad6eef73c8907326d034e175f1cdd5bd7604e8a79e5dae77ecbb4\": container with ID starting with 11bb7d47843ad6eef73c8907326d034e175f1cdd5bd7604e8a79e5dae77ecbb4 not found: ID does not exist" containerID="11bb7d47843ad6eef73c8907326d034e175f1cdd5bd7604e8a79e5dae77ecbb4" Apr 16 14:13:04.092368 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:04.092318 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11bb7d47843ad6eef73c8907326d034e175f1cdd5bd7604e8a79e5dae77ecbb4"} 
err="failed to get container status \"11bb7d47843ad6eef73c8907326d034e175f1cdd5bd7604e8a79e5dae77ecbb4\": rpc error: code = NotFound desc = could not find container \"11bb7d47843ad6eef73c8907326d034e175f1cdd5bd7604e8a79e5dae77ecbb4\": container with ID starting with 11bb7d47843ad6eef73c8907326d034e175f1cdd5bd7604e8a79e5dae77ecbb4 not found: ID does not exist" Apr 16 14:13:04.094292 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:04.094252 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-55c865954fcbkj4"] Apr 16 14:13:05.152825 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:05.152791 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92e4843b-7845-45af-bf1c-09854479c24b" path="/var/lib/kubelet/pods/92e4843b-7845-45af-bf1c-09854479c24b/volumes" Apr 16 14:13:11.121021 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.120984 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765"] Apr 16 14:13:11.121893 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.121336 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92e4843b-7845-45af-bf1c-09854479c24b" containerName="main" Apr 16 14:13:11.121893 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.121347 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="92e4843b-7845-45af-bf1c-09854479c24b" containerName="main" Apr 16 14:13:11.121893 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.121373 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92e4843b-7845-45af-bf1c-09854479c24b" containerName="storage-initializer" Apr 16 14:13:11.121893 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.121379 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="92e4843b-7845-45af-bf1c-09854479c24b" containerName="storage-initializer" Apr 16 14:13:11.121893 ip-10-0-129-3 
kubenswrapper[2580]: I0416 14:13:11.121430 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="92e4843b-7845-45af-bf1c-09854479c24b" containerName="main" Apr 16 14:13:11.126748 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.126726 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:13:11.130055 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.130029 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lsh9l\"" Apr 16 14:13:11.137679 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.130052 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:13:11.137900 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.130084 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 14:13:11.138046 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.130262 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 16 14:13:11.141989 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.141961 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765"] Apr 16 14:13:11.268733 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.268692 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:13:11.268733 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.268738 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:13:11.268982 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.268772 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:13:11.268982 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.268812 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbdgs\" (UniqueName: \"kubernetes.io/projected/15554d41-66c7-4359-90e3-094b45c2da49-kube-api-access-xbdgs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:13:11.268982 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.268878 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15554d41-66c7-4359-90e3-094b45c2da49-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:13:11.268982 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.268964 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:13:11.370112 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.370072 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:13:11.370112 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.370115 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:13:11.370410 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.370156 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:13:11.370410 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.370186 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbdgs\" (UniqueName: \"kubernetes.io/projected/15554d41-66c7-4359-90e3-094b45c2da49-kube-api-access-xbdgs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:13:11.370410 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.370344 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15554d41-66c7-4359-90e3-094b45c2da49-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:13:11.370542 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.370444 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:13:11.370542 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.370525 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:13:11.370615 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.370593 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:13:11.370742 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.370724 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:13:11.372624 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.372565 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:13:11.373182 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.373161 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15554d41-66c7-4359-90e3-094b45c2da49-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:13:11.390014 ip-10-0-129-3 kubenswrapper[2580]: 
I0416 14:13:11.389982 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbdgs\" (UniqueName: \"kubernetes.io/projected/15554d41-66c7-4359-90e3-094b45c2da49-kube-api-access-xbdgs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:13:11.448059 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.448008 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:13:11.586038 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:11.586012 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765"] Apr 16 14:13:11.588299 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:13:11.588238 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15554d41_66c7_4359_90e3_094b45c2da49.slice/crio-059b02d3ca36c14f0a2d6a7f2d05ac9d31d3fbcdf35b30ca64f6d01139c8d8cf WatchSource:0}: Error finding container 059b02d3ca36c14f0a2d6a7f2d05ac9d31d3fbcdf35b30ca64f6d01139c8d8cf: Status 404 returned error can't find the container with id 059b02d3ca36c14f0a2d6a7f2d05ac9d31d3fbcdf35b30ca64f6d01139c8d8cf Apr 16 14:13:12.097580 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:12.097483 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" event={"ID":"15554d41-66c7-4359-90e3-094b45c2da49","Type":"ContainerStarted","Data":"8004466a8cc7dbd45dfcc5754860d0ac7fee9798a12053383f11cf0d8c2abcf0"} Apr 16 14:13:12.097580 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:12.097536 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" event={"ID":"15554d41-66c7-4359-90e3-094b45c2da49","Type":"ContainerStarted","Data":"059b02d3ca36c14f0a2d6a7f2d05ac9d31d3fbcdf35b30ca64f6d01139c8d8cf"} Apr 16 14:13:16.118199 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:16.118163 2580 generic.go:358] "Generic (PLEG): container finished" podID="15554d41-66c7-4359-90e3-094b45c2da49" containerID="8004466a8cc7dbd45dfcc5754860d0ac7fee9798a12053383f11cf0d8c2abcf0" exitCode=0 Apr 16 14:13:16.118600 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:16.118240 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" event={"ID":"15554d41-66c7-4359-90e3-094b45c2da49","Type":"ContainerDied","Data":"8004466a8cc7dbd45dfcc5754860d0ac7fee9798a12053383f11cf0d8c2abcf0"} Apr 16 14:13:19.298072 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.298032 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58"] Apr 16 14:13:19.303074 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.303058 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:13:19.305526 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.305501 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-xnnkk\"" Apr 16 14:13:19.305655 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.305547 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 16 14:13:19.313686 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.313661 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58"] Apr 16 14:13:19.441632 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.441593 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:13:19.441826 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.441639 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:13:19.441826 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.441667 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-sb9z2\" (UniqueName: \"kubernetes.io/projected/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-kube-api-access-sb9z2\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:13:19.441826 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.441698 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:13:19.441826 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.441797 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:13:19.442051 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.441830 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:13:19.543131 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.543092 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:13:19.543402 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.543142 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:13:19.543402 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.543169 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sb9z2\" (UniqueName: \"kubernetes.io/projected/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-kube-api-access-sb9z2\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:13:19.543402 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.543211 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:13:19.543402 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.543301 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:13:19.543402 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.543342 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:13:19.543681 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.543606 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:13:19.543681 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.543652 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:13:19.543795 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.543685 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:13:19.543898 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.543866 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:13:19.546288 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.546242 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:13:19.552108 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.552043 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb9z2\" (UniqueName: \"kubernetes.io/projected/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-kube-api-access-sb9z2\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:13:19.614147 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.614111 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:13:19.777022 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:19.776992 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58"] Apr 16 14:13:19.779184 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:13:19.779150 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff5e36b0_d3de_4cbd_8df6_1a647f9e05be.slice/crio-94ee0bdf20dab6c61bbb9dc88069abd6cb28485805f15aed1c5eafacf093d145 WatchSource:0}: Error finding container 94ee0bdf20dab6c61bbb9dc88069abd6cb28485805f15aed1c5eafacf093d145: Status 404 returned error can't find the container with id 94ee0bdf20dab6c61bbb9dc88069abd6cb28485805f15aed1c5eafacf093d145 Apr 16 14:13:20.139419 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:20.139367 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" event={"ID":"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be","Type":"ContainerStarted","Data":"d644586b73e6c4ee85f8c65d059ac7e792c85b5505eeb568d6be1374c691e71d"} Apr 16 14:13:20.139615 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:20.139428 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" event={"ID":"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be","Type":"ContainerStarted","Data":"94ee0bdf20dab6c61bbb9dc88069abd6cb28485805f15aed1c5eafacf093d145"} Apr 16 14:13:21.145023 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:21.144989 2580 generic.go:358] "Generic (PLEG): container finished" podID="ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" containerID="d644586b73e6c4ee85f8c65d059ac7e792c85b5505eeb568d6be1374c691e71d" exitCode=0 Apr 16 14:13:21.145445 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:21.145079 
2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" event={"ID":"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be","Type":"ContainerDied","Data":"d644586b73e6c4ee85f8c65d059ac7e792c85b5505eeb568d6be1374c691e71d"} Apr 16 14:13:23.159592 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:13:23.159553 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" event={"ID":"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be","Type":"ContainerStarted","Data":"0ccbb5d0f021e114aabdba1a9b0690a26ca165463fb850a561a5cdcc5593d340"} Apr 16 14:14:01.646927 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:01.646892 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58"] Apr 16 14:14:08.378960 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:08.378920 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" event={"ID":"15554d41-66c7-4359-90e3-094b45c2da49","Type":"ContainerStarted","Data":"6e9dad323f1be8b63a468c04c01928998c15eb0a2f2f34bbe79307caf31bee3f"} Apr 16 14:14:08.400348 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:08.400257 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" podStartSLOduration=5.277479561 podStartE2EDuration="57.400235749s" podCreationTimestamp="2026-04-16 14:13:11 +0000 UTC" firstStartedPulling="2026-04-16 14:13:16.119409682 +0000 UTC m=+819.555575392" lastFinishedPulling="2026-04-16 14:14:08.242165871 +0000 UTC m=+871.678331580" observedRunningTime="2026-04-16 14:14:08.399357512 +0000 UTC m=+871.835523241" watchObservedRunningTime="2026-04-16 14:14:08.400235749 +0000 UTC m=+871.836401475" Apr 16 14:14:09.385770 ip-10-0-129-3 
kubenswrapper[2580]: I0416 14:14:09.385724 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" event={"ID":"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be","Type":"ContainerStarted","Data":"1171af85a4f016dc48c250f47599c8844a422d51fdb0b8887208f7e1c55d75c3"} Apr 16 14:14:09.386230 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:09.385857 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" podUID="ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" containerName="main" containerID="cri-o://0ccbb5d0f021e114aabdba1a9b0690a26ca165463fb850a561a5cdcc5593d340" gracePeriod=30 Apr 16 14:14:09.386230 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:09.385977 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:14:09.386230 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:09.385916 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" podUID="ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" containerName="tokenizer" containerID="cri-o://1171af85a4f016dc48c250f47599c8844a422d51fdb0b8887208f7e1c55d75c3" gracePeriod=30 Apr 16 14:14:09.389109 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:09.389075 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" podUID="ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 14:14:09.406160 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:09.406098 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" podStartSLOduration=3.07907831 podStartE2EDuration="50.406078561s" podCreationTimestamp="2026-04-16 14:13:19 +0000 UTC" firstStartedPulling="2026-04-16 14:13:21.146114341 +0000 UTC m=+824.582280046" lastFinishedPulling="2026-04-16 14:14:08.473114576 +0000 UTC m=+871.909280297" observedRunningTime="2026-04-16 14:14:09.40591178 +0000 UTC m=+872.842077513" watchObservedRunningTime="2026-04-16 14:14:09.406078561 +0000 UTC m=+872.842244290" Apr 16 14:14:09.615187 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:09.615155 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:14:10.391899 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:10.391858 2580 generic.go:358] "Generic (PLEG): container finished" podID="ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" containerID="0ccbb5d0f021e114aabdba1a9b0690a26ca165463fb850a561a5cdcc5593d340" exitCode=0 Apr 16 14:14:10.392295 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:10.391938 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" event={"ID":"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be","Type":"ContainerDied","Data":"0ccbb5d0f021e114aabdba1a9b0690a26ca165463fb850a561a5cdcc5593d340"} Apr 16 14:14:11.448875 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:11.448837 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:14:11.449301 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:11.448994 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:14:11.450930 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:11.450901 2580 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" podUID="15554d41-66c7-4359-90e3-094b45c2da49" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 16 14:14:19.387506 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:14:19.387469 2580 logging.go:55] [core] [Channel #20 SubChannel #21]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.47:9003", ServerName: "10.134.0.47:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.47:9003: connect: connection refused" Apr 16 14:14:20.386743 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:20.386698 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" podUID="ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.47:9003\" within 1s: context deadline exceeded" Apr 16 14:14:21.448934 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:21.448888 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" podUID="15554d41-66c7-4359-90e3-094b45c2da49" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 16 14:14:29.387393 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:14:29.387299 2580 logging.go:55] [core] [Channel #22 SubChannel #23]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.47:9003", ServerName: "10.134.0.47:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.47:9003: connect: connection refused" Apr 16 14:14:30.387393 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:30.387340 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" podUID="ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.47:9003\" within 1s: context deadline exceeded" Apr 16 14:14:31.449263 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:31.449215 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" podUID="15554d41-66c7-4359-90e3-094b45c2da49" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 16 14:14:37.094966 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:37.094938 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-mqmmd_49030659-7d98-49ee-844f-41ff4d22d449/console-operator/1.log" Apr 16 14:14:37.095411 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:37.095067 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-mqmmd_49030659-7d98-49ee-844f-41ff4d22d449/console-operator/1.log" Apr 16 14:14:39.386635 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:14:39.386603 2580 logging.go:55] [core] [Channel #24 SubChannel #25]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.47:9003", ServerName: "10.134.0.47:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.47:9003: connect: connection refused" Apr 16 14:14:39.514730 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:39.514703 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58_ff5e36b0-d3de-4cbd-8df6-1a647f9e05be/tokenizer/0.log" Apr 16 14:14:39.515471 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:39.515443 2580 generic.go:358] "Generic (PLEG): container finished" podID="ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" containerID="1171af85a4f016dc48c250f47599c8844a422d51fdb0b8887208f7e1c55d75c3" exitCode=137 Apr 16 14:14:39.515602 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:39.515501 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" event={"ID":"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be","Type":"ContainerDied","Data":"1171af85a4f016dc48c250f47599c8844a422d51fdb0b8887208f7e1c55d75c3"} Apr 16 14:14:40.046562 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.046537 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58_ff5e36b0-d3de-4cbd-8df6-1a647f9e05be/tokenizer/0.log" Apr 16 14:14:40.047231 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.047208 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:14:40.105573 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.105535 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tokenizer-tmp\") pod \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " Apr 16 14:14:40.105766 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.105589 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tokenizer-cache\") pod \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " Apr 16 14:14:40.105766 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.105646 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-kserve-provision-location\") pod \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " Apr 16 14:14:40.105766 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.105707 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tokenizer-uds\") pod \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " Apr 16 14:14:40.105936 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.105772 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tls-certs\") pod \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " Apr 16 14:14:40.105936 
ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.105809 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb9z2\" (UniqueName: \"kubernetes.io/projected/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-kube-api-access-sb9z2\") pod \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\" (UID: \"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be\") " Apr 16 14:14:40.105936 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.105906 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" (UID: "ff5e36b0-d3de-4cbd-8df6-1a647f9e05be"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:14:40.106081 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.105943 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" (UID: "ff5e36b0-d3de-4cbd-8df6-1a647f9e05be"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:14:40.106165 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.106145 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tokenizer-tmp\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:14:40.106220 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.106173 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tokenizer-cache\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:14:40.106220 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.106199 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" (UID: "ff5e36b0-d3de-4cbd-8df6-1a647f9e05be"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:14:40.106532 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.106474 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" (UID: "ff5e36b0-d3de-4cbd-8df6-1a647f9e05be"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:14:40.108156 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.108121 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-kube-api-access-sb9z2" (OuterVolumeSpecName: "kube-api-access-sb9z2") pod "ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" (UID: "ff5e36b0-d3de-4cbd-8df6-1a647f9e05be"). 
InnerVolumeSpecName "kube-api-access-sb9z2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:14:40.108245 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.108158 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" (UID: "ff5e36b0-d3de-4cbd-8df6-1a647f9e05be"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:14:40.206667 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.206579 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tls-certs\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:14:40.206667 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.206610 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sb9z2\" (UniqueName: \"kubernetes.io/projected/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-kube-api-access-sb9z2\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:14:40.206667 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.206621 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-kserve-provision-location\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:14:40.206667 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.206632 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be-tokenizer-uds\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:14:40.386498 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.386451 2580 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" podUID="ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.47:9003\" within 1s: context deadline exceeded" Apr 16 14:14:40.521311 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.521222 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58_ff5e36b0-d3de-4cbd-8df6-1a647f9e05be/tokenizer/0.log" Apr 16 14:14:40.521924 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.521902 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" event={"ID":"ff5e36b0-d3de-4cbd-8df6-1a647f9e05be","Type":"ContainerDied","Data":"94ee0bdf20dab6c61bbb9dc88069abd6cb28485805f15aed1c5eafacf093d145"} Apr 16 14:14:40.522001 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.521938 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58" Apr 16 14:14:40.522001 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.521944 2580 scope.go:117] "RemoveContainer" containerID="1171af85a4f016dc48c250f47599c8844a422d51fdb0b8887208f7e1c55d75c3" Apr 16 14:14:40.532229 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.532209 2580 scope.go:117] "RemoveContainer" containerID="0ccbb5d0f021e114aabdba1a9b0690a26ca165463fb850a561a5cdcc5593d340" Apr 16 14:14:40.540882 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.540865 2580 scope.go:117] "RemoveContainer" containerID="d644586b73e6c4ee85f8c65d059ac7e792c85b5505eeb568d6be1374c691e71d" Apr 16 14:14:40.548794 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.548754 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58"] Apr 16 14:14:40.552531 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:40.552505 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5476f9bdvb58"] Apr 16 14:14:41.153133 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:41.153097 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" path="/var/lib/kubelet/pods/ff5e36b0-d3de-4cbd-8df6-1a647f9e05be/volumes" Apr 16 14:14:41.448714 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:41.448620 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" podUID="15554d41-66c7-4359-90e3-094b45c2da49" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 16 14:14:49.210089 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.210040 2580 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt"] Apr 16 14:14:49.210690 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.210593 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" containerName="storage-initializer" Apr 16 14:14:49.210690 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.210613 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" containerName="storage-initializer" Apr 16 14:14:49.210690 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.210631 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" containerName="tokenizer" Apr 16 14:14:49.210690 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.210640 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" containerName="tokenizer" Apr 16 14:14:49.210690 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.210676 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" containerName="main" Apr 16 14:14:49.210690 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.210685 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" containerName="main" Apr 16 14:14:49.211045 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.210783 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" containerName="main" Apr 16 14:14:49.211045 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.210800 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff5e36b0-d3de-4cbd-8df6-1a647f9e05be" containerName="tokenizer" Apr 16 14:14:49.215874 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.215849 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:49.218862 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.218837 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"conv-test-round-trip-kserve-self-signed-certs\"" Apr 16 14:14:49.222405 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.222378 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt"] Apr 16 14:14:49.283736 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.283696 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73a6b2d3-598e-4f35-a590-65534b796874-tls-certs\") pod \"conv-test-round-trip-kserve-5b4c7988f-bsknt\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:49.283903 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.283758 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-dshm\") pod \"conv-test-round-trip-kserve-5b4c7988f-bsknt\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:49.283903 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.283802 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-kserve-provision-location\") pod \"conv-test-round-trip-kserve-5b4c7988f-bsknt\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:49.283903 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.283828 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-model-cache\") pod \"conv-test-round-trip-kserve-5b4c7988f-bsknt\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:49.283903 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.283888 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb75r\" (UniqueName: \"kubernetes.io/projected/73a6b2d3-598e-4f35-a590-65534b796874-kube-api-access-fb75r\") pod \"conv-test-round-trip-kserve-5b4c7988f-bsknt\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:49.284049 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.283938 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-home\") pod \"conv-test-round-trip-kserve-5b4c7988f-bsknt\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:49.384732 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.384696 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73a6b2d3-598e-4f35-a590-65534b796874-tls-certs\") pod \"conv-test-round-trip-kserve-5b4c7988f-bsknt\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:49.384732 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.384744 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-dshm\") pod 
\"conv-test-round-trip-kserve-5b4c7988f-bsknt\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:49.384995 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.384764 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-kserve-provision-location\") pod \"conv-test-round-trip-kserve-5b4c7988f-bsknt\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:49.384995 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.384876 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-model-cache\") pod \"conv-test-round-trip-kserve-5b4c7988f-bsknt\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:49.384995 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.384915 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fb75r\" (UniqueName: \"kubernetes.io/projected/73a6b2d3-598e-4f35-a590-65534b796874-kube-api-access-fb75r\") pod \"conv-test-round-trip-kserve-5b4c7988f-bsknt\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:49.385173 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.385114 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-home\") pod \"conv-test-round-trip-kserve-5b4c7988f-bsknt\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:49.385173 ip-10-0-129-3 
kubenswrapper[2580]: I0416 14:14:49.385152 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-kserve-provision-location\") pod \"conv-test-round-trip-kserve-5b4c7988f-bsknt\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:49.385310 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.385286 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-model-cache\") pod \"conv-test-round-trip-kserve-5b4c7988f-bsknt\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:49.385471 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.385449 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-home\") pod \"conv-test-round-trip-kserve-5b4c7988f-bsknt\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:49.387300 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.387258 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-dshm\") pod \"conv-test-round-trip-kserve-5b4c7988f-bsknt\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:49.387726 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.387703 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73a6b2d3-598e-4f35-a590-65534b796874-tls-certs\") pod \"conv-test-round-trip-kserve-5b4c7988f-bsknt\" (UID: 
\"73a6b2d3-598e-4f35-a590-65534b796874\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:49.395791 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.395767 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb75r\" (UniqueName: \"kubernetes.io/projected/73a6b2d3-598e-4f35-a590-65534b796874-kube-api-access-fb75r\") pod \"conv-test-round-trip-kserve-5b4c7988f-bsknt\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:49.528364 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.528244 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:49.676705 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:49.676678 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt"] Apr 16 14:14:49.680025 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:14:49.679984 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73a6b2d3_598e_4f35_a590_65534b796874.slice/crio-706abbb0179917c3a1081f5b58bf0f6cbacb600407c40ed7514335703c94347e WatchSource:0}: Error finding container 706abbb0179917c3a1081f5b58bf0f6cbacb600407c40ed7514335703c94347e: Status 404 returned error can't find the container with id 706abbb0179917c3a1081f5b58bf0f6cbacb600407c40ed7514335703c94347e Apr 16 14:14:50.568290 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:50.564770 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" event={"ID":"73a6b2d3-598e-4f35-a590-65534b796874","Type":"ContainerStarted","Data":"5cfb0f44e741ee2700f6b30225842fcbba5d58234a0008a35bf1577a9b6617fa"} Apr 16 14:14:50.568290 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:50.564819 2580 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" event={"ID":"73a6b2d3-598e-4f35-a590-65534b796874","Type":"ContainerStarted","Data":"706abbb0179917c3a1081f5b58bf0f6cbacb600407c40ed7514335703c94347e"} Apr 16 14:14:51.449093 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:51.449034 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" podUID="15554d41-66c7-4359-90e3-094b45c2da49" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 16 14:14:54.583788 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:54.583749 2580 generic.go:358] "Generic (PLEG): container finished" podID="73a6b2d3-598e-4f35-a590-65534b796874" containerID="5cfb0f44e741ee2700f6b30225842fcbba5d58234a0008a35bf1577a9b6617fa" exitCode=0 Apr 16 14:14:54.584336 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:54.583824 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" event={"ID":"73a6b2d3-598e-4f35-a590-65534b796874","Type":"ContainerDied","Data":"5cfb0f44e741ee2700f6b30225842fcbba5d58234a0008a35bf1577a9b6617fa"} Apr 16 14:14:55.590185 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:55.590150 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" event={"ID":"73a6b2d3-598e-4f35-a590-65534b796874","Type":"ContainerStarted","Data":"97eb6215a81e3a733b12c62515083391b99d43cbed58cd8b77e74ee20357dbc6"} Apr 16 14:14:55.611614 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:55.611561 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" podStartSLOduration=6.611547067 podStartE2EDuration="6.611547067s" podCreationTimestamp="2026-04-16 
14:14:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:14:55.610108692 +0000 UTC m=+919.046274419" watchObservedRunningTime="2026-04-16 14:14:55.611547067 +0000 UTC m=+919.047712795" Apr 16 14:14:56.706445 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.706409 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r"] Apr 16 14:14:56.711646 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.711614 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:14:56.714351 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.714323 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 14:14:56.724391 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.724363 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r"] Apr 16 14:14:56.859613 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.859563 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04b87acd-bc56-443f-b89a-d3c9843f3771-tls-certs\") pod \"stop-feature-test-kserve-6d89d568-drk6r\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:14:56.859613 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.859607 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-home\") pod \"stop-feature-test-kserve-6d89d568-drk6r\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:14:56.859880 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.859628 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-dshm\") pod \"stop-feature-test-kserve-6d89d568-drk6r\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:14:56.859880 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.859692 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-model-cache\") pod \"stop-feature-test-kserve-6d89d568-drk6r\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:14:56.859880 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.859738 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdhhj\" (UniqueName: \"kubernetes.io/projected/04b87acd-bc56-443f-b89a-d3c9843f3771-kube-api-access-zdhhj\") pod \"stop-feature-test-kserve-6d89d568-drk6r\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:14:56.859880 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.859769 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-kserve-provision-location\") pod \"stop-feature-test-kserve-6d89d568-drk6r\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:14:56.961399 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.961293 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04b87acd-bc56-443f-b89a-d3c9843f3771-tls-certs\") pod \"stop-feature-test-kserve-6d89d568-drk6r\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:14:56.961399 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.961356 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-home\") pod \"stop-feature-test-kserve-6d89d568-drk6r\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:14:56.961399 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.961390 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-dshm\") pod \"stop-feature-test-kserve-6d89d568-drk6r\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:14:56.961691 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.961434 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-model-cache\") pod \"stop-feature-test-kserve-6d89d568-drk6r\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:14:56.961691 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.961469 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdhhj\" (UniqueName: \"kubernetes.io/projected/04b87acd-bc56-443f-b89a-d3c9843f3771-kube-api-access-zdhhj\") pod \"stop-feature-test-kserve-6d89d568-drk6r\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:14:56.961691 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.961511 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-kserve-provision-location\") pod \"stop-feature-test-kserve-6d89d568-drk6r\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:14:56.961900 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.961868 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-model-cache\") pod \"stop-feature-test-kserve-6d89d568-drk6r\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:14:56.961983 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.961916 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-kserve-provision-location\") pod \"stop-feature-test-kserve-6d89d568-drk6r\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:14:56.962129 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.962106 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-home\") pod \"stop-feature-test-kserve-6d89d568-drk6r\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:14:56.963894 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.963872 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-dshm\") pod \"stop-feature-test-kserve-6d89d568-drk6r\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:14:56.964745 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.964716 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04b87acd-bc56-443f-b89a-d3c9843f3771-tls-certs\") pod \"stop-feature-test-kserve-6d89d568-drk6r\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:14:56.972174 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:56.972144 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdhhj\" (UniqueName: \"kubernetes.io/projected/04b87acd-bc56-443f-b89a-d3c9843f3771-kube-api-access-zdhhj\") pod \"stop-feature-test-kserve-6d89d568-drk6r\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:14:57.026282 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:57.026224 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:14:57.191042 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:57.191010 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r"] Apr 16 14:14:57.192824 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:14:57.192794 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04b87acd_bc56_443f_b89a_d3c9843f3771.slice/crio-a5bb6b6f9f6beb07f49607af6f456a68d85abe1237b5d3135ab25406420aae69 WatchSource:0}: Error finding container a5bb6b6f9f6beb07f49607af6f456a68d85abe1237b5d3135ab25406420aae69: Status 404 returned error can't find the container with id a5bb6b6f9f6beb07f49607af6f456a68d85abe1237b5d3135ab25406420aae69 Apr 16 14:14:57.601435 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:57.601392 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" event={"ID":"04b87acd-bc56-443f-b89a-d3c9843f3771","Type":"ContainerStarted","Data":"995787a1ce0d241e6db28e5511701428d312bb6141453edb14e4a6be2af676e6"} Apr 16 14:14:57.601435 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:57.601441 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" event={"ID":"04b87acd-bc56-443f-b89a-d3c9843f3771","Type":"ContainerStarted","Data":"a5bb6b6f9f6beb07f49607af6f456a68d85abe1237b5d3135ab25406420aae69"} Apr 16 14:14:59.529118 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:59.529077 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:59.529118 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:59.529128 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:14:59.531452 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:59.531404 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" podUID="73a6b2d3-598e-4f35-a590-65534b796874" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 16 14:14:59.934833 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:59.934793 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt"] Apr 16 14:14:59.935337 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:14:59.935166 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" podUID="73a6b2d3-598e-4f35-a590-65534b796874" containerName="main" containerID="cri-o://97eb6215a81e3a733b12c62515083391b99d43cbed58cd8b77e74ee20357dbc6" gracePeriod=30 Apr 16 14:15:01.449392 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:01.449341 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" podUID="15554d41-66c7-4359-90e3-094b45c2da49" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 16 14:15:01.624160 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:01.624120 2580 generic.go:358] "Generic (PLEG): container finished" podID="04b87acd-bc56-443f-b89a-d3c9843f3771" containerID="995787a1ce0d241e6db28e5511701428d312bb6141453edb14e4a6be2af676e6" exitCode=0 Apr 16 14:15:01.624370 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:01.624198 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" 
event={"ID":"04b87acd-bc56-443f-b89a-d3c9843f3771","Type":"ContainerDied","Data":"995787a1ce0d241e6db28e5511701428d312bb6141453edb14e4a6be2af676e6"} Apr 16 14:15:02.633989 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:02.633947 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" event={"ID":"04b87acd-bc56-443f-b89a-d3c9843f3771","Type":"ContainerStarted","Data":"4e1d8e0db4b92b115d1198e46c6540b9211629610c728b15379ec0d9ffe6309e"} Apr 16 14:15:02.656260 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:02.656201 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" podStartSLOduration=6.656180568 podStartE2EDuration="6.656180568s" podCreationTimestamp="2026-04-16 14:14:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:15:02.654738491 +0000 UTC m=+926.090904226" watchObservedRunningTime="2026-04-16 14:15:02.656180568 +0000 UTC m=+926.092346299" Apr 16 14:15:07.026965 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:07.026923 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:15:07.026965 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:07.026967 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:15:07.028851 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:07.028801 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" podUID="04b87acd-bc56-443f-b89a-d3c9843f3771" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 16 14:15:11.448936 ip-10-0-129-3 
kubenswrapper[2580]: I0416 14:15:11.448888 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" podUID="15554d41-66c7-4359-90e3-094b45c2da49" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 16 14:15:17.027073 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:17.027000 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" podUID="04b87acd-bc56-443f-b89a-d3c9843f3771" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 16 14:15:21.449192 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:21.449145 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" podUID="15554d41-66c7-4359-90e3-094b45c2da49" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 16 14:15:27.027321 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:27.027249 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" podUID="04b87acd-bc56-443f-b89a-d3c9843f3771" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 16 14:15:30.168936 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.168907 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-5b4c7988f-bsknt_73a6b2d3-598e-4f35-a590-65534b796874/main/0.log" Apr 16 14:15:30.169387 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.169365 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:15:30.296368 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.296280 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-home\") pod \"73a6b2d3-598e-4f35-a590-65534b796874\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " Apr 16 14:15:30.296368 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.296363 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73a6b2d3-598e-4f35-a590-65534b796874-tls-certs\") pod \"73a6b2d3-598e-4f35-a590-65534b796874\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " Apr 16 14:15:30.296603 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.296400 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-dshm\") pod \"73a6b2d3-598e-4f35-a590-65534b796874\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " Apr 16 14:15:30.296603 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.296469 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-kserve-provision-location\") pod \"73a6b2d3-598e-4f35-a590-65534b796874\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " Apr 16 14:15:30.296603 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.296508 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb75r\" (UniqueName: \"kubernetes.io/projected/73a6b2d3-598e-4f35-a590-65534b796874-kube-api-access-fb75r\") pod \"73a6b2d3-598e-4f35-a590-65534b796874\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " Apr 16 14:15:30.296603 ip-10-0-129-3 kubenswrapper[2580]: I0416 
14:15:30.296532 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-home" (OuterVolumeSpecName: "home") pod "73a6b2d3-598e-4f35-a590-65534b796874" (UID: "73a6b2d3-598e-4f35-a590-65534b796874"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:15:30.296603 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.296544 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-model-cache\") pod \"73a6b2d3-598e-4f35-a590-65534b796874\" (UID: \"73a6b2d3-598e-4f35-a590-65534b796874\") " Apr 16 14:15:30.296847 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.296719 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-model-cache" (OuterVolumeSpecName: "model-cache") pod "73a6b2d3-598e-4f35-a590-65534b796874" (UID: "73a6b2d3-598e-4f35-a590-65534b796874"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:15:30.296908 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.296882 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-model-cache\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:15:30.296908 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.296899 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-home\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:15:30.298910 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.298887 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-dshm" (OuterVolumeSpecName: "dshm") pod "73a6b2d3-598e-4f35-a590-65534b796874" (UID: "73a6b2d3-598e-4f35-a590-65534b796874"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:15:30.299026 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.298925 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a6b2d3-598e-4f35-a590-65534b796874-kube-api-access-fb75r" (OuterVolumeSpecName: "kube-api-access-fb75r") pod "73a6b2d3-598e-4f35-a590-65534b796874" (UID: "73a6b2d3-598e-4f35-a590-65534b796874"). InnerVolumeSpecName "kube-api-access-fb75r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:15:30.299119 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.299099 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a6b2d3-598e-4f35-a590-65534b796874-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "73a6b2d3-598e-4f35-a590-65534b796874" (UID: "73a6b2d3-598e-4f35-a590-65534b796874"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:15:30.354055 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.354017 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "73a6b2d3-598e-4f35-a590-65534b796874" (UID: "73a6b2d3-598e-4f35-a590-65534b796874"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:15:30.397424 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.397389 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-kserve-provision-location\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:15:30.397424 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.397421 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fb75r\" (UniqueName: \"kubernetes.io/projected/73a6b2d3-598e-4f35-a590-65534b796874-kube-api-access-fb75r\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:15:30.397424 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.397432 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73a6b2d3-598e-4f35-a590-65534b796874-tls-certs\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:15:30.397656 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.397442 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/73a6b2d3-598e-4f35-a590-65534b796874-dshm\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:15:30.772493 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.772461 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-5b4c7988f-bsknt_73a6b2d3-598e-4f35-a590-65534b796874/main/0.log" Apr 16 14:15:30.772876 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.772849 2580 generic.go:358] "Generic (PLEG): container finished" podID="73a6b2d3-598e-4f35-a590-65534b796874" containerID="97eb6215a81e3a733b12c62515083391b99d43cbed58cd8b77e74ee20357dbc6" exitCode=137 Apr 16 14:15:30.772946 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.772901 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" event={"ID":"73a6b2d3-598e-4f35-a590-65534b796874","Type":"ContainerDied","Data":"97eb6215a81e3a733b12c62515083391b99d43cbed58cd8b77e74ee20357dbc6"} Apr 16 14:15:30.772946 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.772938 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" event={"ID":"73a6b2d3-598e-4f35-a590-65534b796874","Type":"ContainerDied","Data":"706abbb0179917c3a1081f5b58bf0f6cbacb600407c40ed7514335703c94347e"} Apr 16 14:15:30.773030 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.772963 2580 scope.go:117] "RemoveContainer" containerID="97eb6215a81e3a733b12c62515083391b99d43cbed58cd8b77e74ee20357dbc6" Apr 16 14:15:30.773030 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.772997 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt" Apr 16 14:15:30.790659 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.790630 2580 scope.go:117] "RemoveContainer" containerID="5cfb0f44e741ee2700f6b30225842fcbba5d58234a0008a35bf1577a9b6617fa" Apr 16 14:15:30.804959 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.804916 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt"] Apr 16 14:15:30.805424 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.805404 2580 scope.go:117] "RemoveContainer" containerID="97eb6215a81e3a733b12c62515083391b99d43cbed58cd8b77e74ee20357dbc6" Apr 16 14:15:30.805803 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:15:30.805769 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97eb6215a81e3a733b12c62515083391b99d43cbed58cd8b77e74ee20357dbc6\": container with ID starting with 97eb6215a81e3a733b12c62515083391b99d43cbed58cd8b77e74ee20357dbc6 not found: ID does not exist" containerID="97eb6215a81e3a733b12c62515083391b99d43cbed58cd8b77e74ee20357dbc6" Apr 16 14:15:30.805882 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.805817 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97eb6215a81e3a733b12c62515083391b99d43cbed58cd8b77e74ee20357dbc6"} err="failed to get container status \"97eb6215a81e3a733b12c62515083391b99d43cbed58cd8b77e74ee20357dbc6\": rpc error: code = NotFound desc = could not find container \"97eb6215a81e3a733b12c62515083391b99d43cbed58cd8b77e74ee20357dbc6\": container with ID starting with 97eb6215a81e3a733b12c62515083391b99d43cbed58cd8b77e74ee20357dbc6 not found: ID does not exist" Apr 16 14:15:30.805882 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.805845 2580 scope.go:117] "RemoveContainer" containerID="5cfb0f44e741ee2700f6b30225842fcbba5d58234a0008a35bf1577a9b6617fa" Apr 16 
14:15:30.806153 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:15:30.806129 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cfb0f44e741ee2700f6b30225842fcbba5d58234a0008a35bf1577a9b6617fa\": container with ID starting with 5cfb0f44e741ee2700f6b30225842fcbba5d58234a0008a35bf1577a9b6617fa not found: ID does not exist" containerID="5cfb0f44e741ee2700f6b30225842fcbba5d58234a0008a35bf1577a9b6617fa" Apr 16 14:15:30.806214 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.806164 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfb0f44e741ee2700f6b30225842fcbba5d58234a0008a35bf1577a9b6617fa"} err="failed to get container status \"5cfb0f44e741ee2700f6b30225842fcbba5d58234a0008a35bf1577a9b6617fa\": rpc error: code = NotFound desc = could not find container \"5cfb0f44e741ee2700f6b30225842fcbba5d58234a0008a35bf1577a9b6617fa\": container with ID starting with 5cfb0f44e741ee2700f6b30225842fcbba5d58234a0008a35bf1577a9b6617fa not found: ID does not exist" Apr 16 14:15:30.807035 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:30.807015 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5b4c7988f-bsknt"] Apr 16 14:15:31.153777 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:31.153744 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73a6b2d3-598e-4f35-a590-65534b796874" path="/var/lib/kubelet/pods/73a6b2d3-598e-4f35-a590-65534b796874/volumes" Apr 16 14:15:31.449419 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:31.449314 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" podUID="15554d41-66c7-4359-90e3-094b45c2da49" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 16 14:15:37.026760 
ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:37.026710 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" podUID="04b87acd-bc56-443f-b89a-d3c9843f3771" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 16 14:15:41.448592 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:41.448535 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" podUID="15554d41-66c7-4359-90e3-094b45c2da49" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 16 14:15:47.027210 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:47.027155 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" podUID="04b87acd-bc56-443f-b89a-d3c9843f3771" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 16 14:15:51.458586 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:51.458548 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:15:51.466459 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:51.466424 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:15:57.027234 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:57.027132 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" podUID="04b87acd-bc56-443f-b89a-d3c9843f3771" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 16 14:15:58.604529 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:58.604489 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765"] Apr 16 14:15:58.604968 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:15:58.604917 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" podUID="15554d41-66c7-4359-90e3-094b45c2da49" containerName="main" containerID="cri-o://6e9dad323f1be8b63a468c04c01928998c15eb0a2f2f34bbe79307caf31bee3f" gracePeriod=30 Apr 16 14:16:07.027972 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:07.027924 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" podUID="04b87acd-bc56-443f-b89a-d3c9843f3771" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 16 14:16:10.211244 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.211209 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf"] Apr 16 14:16:10.211795 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.211775 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73a6b2d3-598e-4f35-a590-65534b796874" containerName="main" Apr 16 14:16:10.211876 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.211799 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a6b2d3-598e-4f35-a590-65534b796874" containerName="main" Apr 16 14:16:10.211876 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.211823 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73a6b2d3-598e-4f35-a590-65534b796874" containerName="storage-initializer" Apr 
16 14:16:10.211876 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.211831 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a6b2d3-598e-4f35-a590-65534b796874" containerName="storage-initializer" Apr 16 14:16:10.212028 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.211936 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="73a6b2d3-598e-4f35-a590-65534b796874" containerName="main" Apr 16 14:16:10.215377 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.215345 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:10.217937 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.217915 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 16 14:16:10.227533 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.227483 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf"] Apr 16 14:16:10.384285 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.384239 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-home\") pod \"custom-route-timeout-test-kserve-9bf4d99d8-mm8cf\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:10.384285 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.384287 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-model-cache\") pod \"custom-route-timeout-test-kserve-9bf4d99d8-mm8cf\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:10.384597 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.384390 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-dshm\") pod \"custom-route-timeout-test-kserve-9bf4d99d8-mm8cf\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:10.384597 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.384439 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdtxj\" (UniqueName: \"kubernetes.io/projected/05862ae4-795b-4d25-9f4d-23586bfe097d-kube-api-access-sdtxj\") pod \"custom-route-timeout-test-kserve-9bf4d99d8-mm8cf\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:10.384597 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.384496 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05862ae4-795b-4d25-9f4d-23586bfe097d-tls-certs\") pod \"custom-route-timeout-test-kserve-9bf4d99d8-mm8cf\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:10.384597 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.384535 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-9bf4d99d8-mm8cf\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:10.485953 
ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.485858 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05862ae4-795b-4d25-9f4d-23586bfe097d-tls-certs\") pod \"custom-route-timeout-test-kserve-9bf4d99d8-mm8cf\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:10.485953 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.485911 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-9bf4d99d8-mm8cf\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:10.485953 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.485952 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-home\") pod \"custom-route-timeout-test-kserve-9bf4d99d8-mm8cf\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:10.486240 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.485977 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-model-cache\") pod \"custom-route-timeout-test-kserve-9bf4d99d8-mm8cf\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:10.486240 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.486047 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-dshm\") pod \"custom-route-timeout-test-kserve-9bf4d99d8-mm8cf\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:10.486240 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.486083 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdtxj\" (UniqueName: \"kubernetes.io/projected/05862ae4-795b-4d25-9f4d-23586bfe097d-kube-api-access-sdtxj\") pod \"custom-route-timeout-test-kserve-9bf4d99d8-mm8cf\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:10.486454 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.486427 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-home\") pod \"custom-route-timeout-test-kserve-9bf4d99d8-mm8cf\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:10.486567 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.486446 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-9bf4d99d8-mm8cf\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:10.486567 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.486530 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-model-cache\") pod \"custom-route-timeout-test-kserve-9bf4d99d8-mm8cf\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:10.488442 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.488415 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-dshm\") pod \"custom-route-timeout-test-kserve-9bf4d99d8-mm8cf\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:10.488654 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.488632 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05862ae4-795b-4d25-9f4d-23586bfe097d-tls-certs\") pod \"custom-route-timeout-test-kserve-9bf4d99d8-mm8cf\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:10.493904 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.493878 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdtxj\" (UniqueName: \"kubernetes.io/projected/05862ae4-795b-4d25-9f4d-23586bfe097d-kube-api-access-sdtxj\") pod \"custom-route-timeout-test-kserve-9bf4d99d8-mm8cf\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:10.528859 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.528828 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:10.671247 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.671222 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf"] Apr 16 14:16:10.673736 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:16:10.673709 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05862ae4_795b_4d25_9f4d_23586bfe097d.slice/crio-fb723e02c04990170ebd0c4f84f9871dd06aa4c818f41cc6de2a797029986b1e WatchSource:0}: Error finding container fb723e02c04990170ebd0c4f84f9871dd06aa4c818f41cc6de2a797029986b1e: Status 404 returned error can't find the container with id fb723e02c04990170ebd0c4f84f9871dd06aa4c818f41cc6de2a797029986b1e Apr 16 14:16:10.957153 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.957110 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" event={"ID":"05862ae4-795b-4d25-9f4d-23586bfe097d","Type":"ContainerStarted","Data":"f12b4f5af6124ef26734f909e9b62b41676f6864e65912359bbf0b0977cc661b"} Apr 16 14:16:10.957153 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:10.957158 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" event={"ID":"05862ae4-795b-4d25-9f4d-23586bfe097d","Type":"ContainerStarted","Data":"fb723e02c04990170ebd0c4f84f9871dd06aa4c818f41cc6de2a797029986b1e"} Apr 16 14:16:14.976702 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:14.976660 2580 generic.go:358] "Generic (PLEG): container finished" podID="05862ae4-795b-4d25-9f4d-23586bfe097d" containerID="f12b4f5af6124ef26734f909e9b62b41676f6864e65912359bbf0b0977cc661b" exitCode=0 Apr 16 14:16:14.977075 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:14.976744 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" event={"ID":"05862ae4-795b-4d25-9f4d-23586bfe097d","Type":"ContainerDied","Data":"f12b4f5af6124ef26734f909e9b62b41676f6864e65912359bbf0b0977cc661b"} Apr 16 14:16:15.983332 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:15.983299 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" event={"ID":"05862ae4-795b-4d25-9f4d-23586bfe097d","Type":"ContainerStarted","Data":"c6d07c7db7ed76e6c0311aca30e6d883cf836e44a7ec23240e4dfa41bc8eef42"} Apr 16 14:16:16.006050 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:16.005989 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" podStartSLOduration=6.005970534 podStartE2EDuration="6.005970534s" podCreationTimestamp="2026-04-16 14:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:16:16.003145819 +0000 UTC m=+999.439311553" watchObservedRunningTime="2026-04-16 14:16:16.005970534 +0000 UTC m=+999.442136262" Apr 16 14:16:17.026963 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:17.026920 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" podUID="04b87acd-bc56-443f-b89a-d3c9843f3771" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 16 14:16:20.529352 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:20.529305 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:20.529793 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:20.529368 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" Apr 16 14:16:20.531194 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:20.531158 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" podUID="05862ae4-795b-4d25-9f4d-23586bfe097d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 14:16:27.026685 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:27.026636 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" podUID="04b87acd-bc56-443f-b89a-d3c9843f3771" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 16 14:16:29.033022 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.032976 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765_15554d41-66c7-4359-90e3-094b45c2da49/main/0.log" Apr 16 14:16:29.033466 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.033449 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:16:29.036867 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.036842 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765_15554d41-66c7-4359-90e3-094b45c2da49/main/0.log" Apr 16 14:16:29.037231 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.037204 2580 generic.go:358] "Generic (PLEG): container finished" podID="15554d41-66c7-4359-90e3-094b45c2da49" containerID="6e9dad323f1be8b63a468c04c01928998c15eb0a2f2f34bbe79307caf31bee3f" exitCode=137 Apr 16 14:16:29.037341 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.037301 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" Apr 16 14:16:29.037412 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.037298 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" event={"ID":"15554d41-66c7-4359-90e3-094b45c2da49","Type":"ContainerDied","Data":"6e9dad323f1be8b63a468c04c01928998c15eb0a2f2f34bbe79307caf31bee3f"} Apr 16 14:16:29.037468 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.037416 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765" event={"ID":"15554d41-66c7-4359-90e3-094b45c2da49","Type":"ContainerDied","Data":"059b02d3ca36c14f0a2d6a7f2d05ac9d31d3fbcdf35b30ca64f6d01139c8d8cf"} Apr 16 14:16:29.037468 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.037443 2580 scope.go:117] "RemoveContainer" containerID="6e9dad323f1be8b63a468c04c01928998c15eb0a2f2f34bbe79307caf31bee3f" Apr 16 14:16:29.059628 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.059606 2580 scope.go:117] "RemoveContainer" 
containerID="8004466a8cc7dbd45dfcc5754860d0ac7fee9798a12053383f11cf0d8c2abcf0" Apr 16 14:16:29.060204 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.060172 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-kserve-provision-location\") pod \"15554d41-66c7-4359-90e3-094b45c2da49\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " Apr 16 14:16:29.060467 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.060215 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbdgs\" (UniqueName: \"kubernetes.io/projected/15554d41-66c7-4359-90e3-094b45c2da49-kube-api-access-xbdgs\") pod \"15554d41-66c7-4359-90e3-094b45c2da49\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " Apr 16 14:16:29.060467 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.060246 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-home\") pod \"15554d41-66c7-4359-90e3-094b45c2da49\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " Apr 16 14:16:29.060467 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.060349 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-model-cache\") pod \"15554d41-66c7-4359-90e3-094b45c2da49\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " Apr 16 14:16:29.060467 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.060398 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-dshm\") pod \"15554d41-66c7-4359-90e3-094b45c2da49\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " Apr 16 14:16:29.060467 ip-10-0-129-3 kubenswrapper[2580]: I0416 
14:16:29.060423 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15554d41-66c7-4359-90e3-094b45c2da49-tls-certs\") pod \"15554d41-66c7-4359-90e3-094b45c2da49\" (UID: \"15554d41-66c7-4359-90e3-094b45c2da49\") " Apr 16 14:16:29.061611 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.060878 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-model-cache" (OuterVolumeSpecName: "model-cache") pod "15554d41-66c7-4359-90e3-094b45c2da49" (UID: "15554d41-66c7-4359-90e3-094b45c2da49"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:29.061611 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.061036 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-home" (OuterVolumeSpecName: "home") pod "15554d41-66c7-4359-90e3-094b45c2da49" (UID: "15554d41-66c7-4359-90e3-094b45c2da49"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:29.064355 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.063955 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15554d41-66c7-4359-90e3-094b45c2da49-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "15554d41-66c7-4359-90e3-094b45c2da49" (UID: "15554d41-66c7-4359-90e3-094b45c2da49"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:16:29.064485 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.064455 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15554d41-66c7-4359-90e3-094b45c2da49-kube-api-access-xbdgs" (OuterVolumeSpecName: "kube-api-access-xbdgs") pod "15554d41-66c7-4359-90e3-094b45c2da49" (UID: "15554d41-66c7-4359-90e3-094b45c2da49"). 
InnerVolumeSpecName "kube-api-access-xbdgs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:16:29.064549 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.064508 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-dshm" (OuterVolumeSpecName: "dshm") pod "15554d41-66c7-4359-90e3-094b45c2da49" (UID: "15554d41-66c7-4359-90e3-094b45c2da49"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:29.072551 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.072369 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "15554d41-66c7-4359-90e3-094b45c2da49" (UID: "15554d41-66c7-4359-90e3-094b45c2da49"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:29.105849 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.105820 2580 scope.go:117] "RemoveContainer" containerID="6e9dad323f1be8b63a468c04c01928998c15eb0a2f2f34bbe79307caf31bee3f" Apr 16 14:16:29.106198 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:16:29.106177 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e9dad323f1be8b63a468c04c01928998c15eb0a2f2f34bbe79307caf31bee3f\": container with ID starting with 6e9dad323f1be8b63a468c04c01928998c15eb0a2f2f34bbe79307caf31bee3f not found: ID does not exist" containerID="6e9dad323f1be8b63a468c04c01928998c15eb0a2f2f34bbe79307caf31bee3f" Apr 16 14:16:29.106262 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.106212 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e9dad323f1be8b63a468c04c01928998c15eb0a2f2f34bbe79307caf31bee3f"} err="failed to get container status 
\"6e9dad323f1be8b63a468c04c01928998c15eb0a2f2f34bbe79307caf31bee3f\": rpc error: code = NotFound desc = could not find container \"6e9dad323f1be8b63a468c04c01928998c15eb0a2f2f34bbe79307caf31bee3f\": container with ID starting with 6e9dad323f1be8b63a468c04c01928998c15eb0a2f2f34bbe79307caf31bee3f not found: ID does not exist" Apr 16 14:16:29.106262 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.106232 2580 scope.go:117] "RemoveContainer" containerID="8004466a8cc7dbd45dfcc5754860d0ac7fee9798a12053383f11cf0d8c2abcf0" Apr 16 14:16:29.106529 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:16:29.106507 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8004466a8cc7dbd45dfcc5754860d0ac7fee9798a12053383f11cf0d8c2abcf0\": container with ID starting with 8004466a8cc7dbd45dfcc5754860d0ac7fee9798a12053383f11cf0d8c2abcf0 not found: ID does not exist" containerID="8004466a8cc7dbd45dfcc5754860d0ac7fee9798a12053383f11cf0d8c2abcf0" Apr 16 14:16:29.106577 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.106538 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8004466a8cc7dbd45dfcc5754860d0ac7fee9798a12053383f11cf0d8c2abcf0"} err="failed to get container status \"8004466a8cc7dbd45dfcc5754860d0ac7fee9798a12053383f11cf0d8c2abcf0\": rpc error: code = NotFound desc = could not find container \"8004466a8cc7dbd45dfcc5754860d0ac7fee9798a12053383f11cf0d8c2abcf0\": container with ID starting with 8004466a8cc7dbd45dfcc5754860d0ac7fee9798a12053383f11cf0d8c2abcf0 not found: ID does not exist" Apr 16 14:16:29.162294 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.162247 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-model-cache\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:16:29.162391 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.162298 
2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-dshm\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:16:29.162391 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.162312 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15554d41-66c7-4359-90e3-094b45c2da49-tls-certs\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:16:29.162391 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.162321 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-kserve-provision-location\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:16:29.162391 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.162331 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xbdgs\" (UniqueName: \"kubernetes.io/projected/15554d41-66c7-4359-90e3-094b45c2da49-kube-api-access-xbdgs\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:16:29.162391 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.162340 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/15554d41-66c7-4359-90e3-094b45c2da49-home\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:16:29.356383 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.356351 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765"] Apr 16 14:16:29.360118 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:29.360092 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-6b679d9774k7765"] Apr 16 14:16:30.529781 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:30.529731 2580 prober.go:120] "Probe 
failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" podUID="05862ae4-795b-4d25-9f4d-23586bfe097d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 14:16:31.155900 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:31.155855 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15554d41-66c7-4359-90e3-094b45c2da49" path="/var/lib/kubelet/pods/15554d41-66c7-4359-90e3-094b45c2da49/volumes" Apr 16 14:16:37.027726 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:37.027683 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" podUID="04b87acd-bc56-443f-b89a-d3c9843f3771" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8000/health\": dial tcp 10.134.0.49:8000: connect: connection refused" Apr 16 14:16:40.529357 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:40.529308 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" podUID="05862ae4-795b-4d25-9f4d-23586bfe097d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 14:16:47.036804 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:47.036770 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:16:47.044823 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:47.044798 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:16:47.997577 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:47.997544 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r"] 
Apr 16 14:16:48.119388 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:48.119345 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" podUID="04b87acd-bc56-443f-b89a-d3c9843f3771" containerName="main" containerID="cri-o://4e1d8e0db4b92b115d1198e46c6540b9211629610c728b15379ec0d9ffe6309e" gracePeriod=30 Apr 16 14:16:50.530320 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:50.530276 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" podUID="05862ae4-795b-4d25-9f4d-23586bfe097d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 14:16:58.024459 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.024408 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb"] Apr 16 14:16:58.024932 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.024890 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15554d41-66c7-4359-90e3-094b45c2da49" containerName="main" Apr 16 14:16:58.024932 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.024906 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="15554d41-66c7-4359-90e3-094b45c2da49" containerName="main" Apr 16 14:16:58.024932 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.024934 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15554d41-66c7-4359-90e3-094b45c2da49" containerName="storage-initializer" Apr 16 14:16:58.025104 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.024939 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="15554d41-66c7-4359-90e3-094b45c2da49" containerName="storage-initializer" Apr 16 14:16:58.025104 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.024994 2580 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="15554d41-66c7-4359-90e3-094b45c2da49" containerName="main" Apr 16 14:16:58.030191 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.030169 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:16:58.036977 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.036948 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb"] Apr 16 14:16:58.125307 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.125253 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-model-cache\") pod \"stop-feature-test-kserve-6d89d568-grzzb\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:16:58.125478 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.125352 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-dshm\") pod \"stop-feature-test-kserve-6d89d568-grzzb\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:16:58.125478 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.125462 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-home\") pod \"stop-feature-test-kserve-6d89d568-grzzb\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:16:58.125561 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.125501 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-kserve-provision-location\") pod \"stop-feature-test-kserve-6d89d568-grzzb\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:16:58.125608 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.125560 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqwck\" (UniqueName: \"kubernetes.io/projected/24a88095-6ade-4d31-8892-caa3a85d2ff2-kube-api-access-bqwck\") pod \"stop-feature-test-kserve-6d89d568-grzzb\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:16:58.125608 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.125596 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/24a88095-6ade-4d31-8892-caa3a85d2ff2-tls-certs\") pod \"stop-feature-test-kserve-6d89d568-grzzb\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:16:58.226779 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.226747 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-model-cache\") pod \"stop-feature-test-kserve-6d89d568-grzzb\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:16:58.226979 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.226799 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-dshm\") pod \"stop-feature-test-kserve-6d89d568-grzzb\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") 
" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:16:58.226979 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.226847 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-home\") pod \"stop-feature-test-kserve-6d89d568-grzzb\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:16:58.226979 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.226868 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-kserve-provision-location\") pod \"stop-feature-test-kserve-6d89d568-grzzb\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:16:58.226979 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.226900 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqwck\" (UniqueName: \"kubernetes.io/projected/24a88095-6ade-4d31-8892-caa3a85d2ff2-kube-api-access-bqwck\") pod \"stop-feature-test-kserve-6d89d568-grzzb\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:16:58.227186 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.227029 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/24a88095-6ade-4d31-8892-caa3a85d2ff2-tls-certs\") pod \"stop-feature-test-kserve-6d89d568-grzzb\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:16:58.227186 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.227159 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-model-cache\") pod \"stop-feature-test-kserve-6d89d568-grzzb\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:16:58.227328 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.227232 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-kserve-provision-location\") pod \"stop-feature-test-kserve-6d89d568-grzzb\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:16:58.227328 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.227312 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-home\") pod \"stop-feature-test-kserve-6d89d568-grzzb\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:16:58.229278 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.229249 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-dshm\") pod \"stop-feature-test-kserve-6d89d568-grzzb\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:16:58.229627 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.229611 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/24a88095-6ade-4d31-8892-caa3a85d2ff2-tls-certs\") pod \"stop-feature-test-kserve-6d89d568-grzzb\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:16:58.235194 ip-10-0-129-3 kubenswrapper[2580]: I0416 
14:16:58.235174 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqwck\" (UniqueName: \"kubernetes.io/projected/24a88095-6ade-4d31-8892-caa3a85d2ff2-kube-api-access-bqwck\") pod \"stop-feature-test-kserve-6d89d568-grzzb\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:16:58.344974 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.344876 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:16:58.496496 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.496449 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb"] Apr 16 14:16:58.496835 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:16:58.496804 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24a88095_6ade_4d31_8892_caa3a85d2ff2.slice/crio-abeb8f3280da521ccc67e62d4fdb8f458958ecef2e113c84759e172489832174 WatchSource:0}: Error finding container abeb8f3280da521ccc67e62d4fdb8f458958ecef2e113c84759e172489832174: Status 404 returned error can't find the container with id abeb8f3280da521ccc67e62d4fdb8f458958ecef2e113c84759e172489832174 Apr 16 14:16:58.499158 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:58.499138 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:16:59.162227 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:59.162184 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" event={"ID":"24a88095-6ade-4d31-8892-caa3a85d2ff2","Type":"ContainerStarted","Data":"dac18a23d3646e37eaf6ae7d1ab513b02823c4f8371a80e2c4f4c0b419d1ad27"} Apr 16 14:16:59.162227 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:16:59.162227 2580 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" event={"ID":"24a88095-6ade-4d31-8892-caa3a85d2ff2","Type":"ContainerStarted","Data":"abeb8f3280da521ccc67e62d4fdb8f458958ecef2e113c84759e172489832174"} Apr 16 14:17:00.529602 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:00.529542 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" podUID="05862ae4-795b-4d25-9f4d-23586bfe097d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 14:17:03.188822 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:03.188785 2580 generic.go:358] "Generic (PLEG): container finished" podID="24a88095-6ade-4d31-8892-caa3a85d2ff2" containerID="dac18a23d3646e37eaf6ae7d1ab513b02823c4f8371a80e2c4f4c0b419d1ad27" exitCode=0 Apr 16 14:17:03.189208 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:03.188837 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" event={"ID":"24a88095-6ade-4d31-8892-caa3a85d2ff2","Type":"ContainerDied","Data":"dac18a23d3646e37eaf6ae7d1ab513b02823c4f8371a80e2c4f4c0b419d1ad27"} Apr 16 14:17:04.195491 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:04.195448 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" event={"ID":"24a88095-6ade-4d31-8892-caa3a85d2ff2","Type":"ContainerStarted","Data":"8dac90c7be356b05e7dc1dc70d24675eea720002d7118f81c061256788c8528e"} Apr 16 14:17:04.217941 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:04.217883 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" podStartSLOduration=6.217865599 podStartE2EDuration="6.217865599s" podCreationTimestamp="2026-04-16 14:16:58 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:17:04.214413359 +0000 UTC m=+1047.650579088" watchObservedRunningTime="2026-04-16 14:17:04.217865599 +0000 UTC m=+1047.654031321" Apr 16 14:17:08.345412 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:08.345367 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:17:08.345913 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:08.345423 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:17:08.346851 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:08.346820 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" podUID="24a88095-6ade-4d31-8892-caa3a85d2ff2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 16 14:17:10.530339 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:10.530291 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" podUID="05862ae4-795b-4d25-9f4d-23586bfe097d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 14:17:18.345686 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.345640 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" podUID="24a88095-6ade-4d31-8892-caa3a85d2ff2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 16 14:17:18.545873 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.545850 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-6d89d568-drk6r_04b87acd-bc56-443f-b89a-d3c9843f3771/main/0.log" Apr 16 14:17:18.546290 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.546254 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:17:18.610484 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.610452 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-home\") pod \"04b87acd-bc56-443f-b89a-d3c9843f3771\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " Apr 16 14:17:18.610664 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.610499 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-dshm\") pod \"04b87acd-bc56-443f-b89a-d3c9843f3771\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " Apr 16 14:17:18.610892 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.610861 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-home" (OuterVolumeSpecName: "home") pod "04b87acd-bc56-443f-b89a-d3c9843f3771" (UID: "04b87acd-bc56-443f-b89a-d3c9843f3771"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:17:18.612821 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.612791 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-dshm" (OuterVolumeSpecName: "dshm") pod "04b87acd-bc56-443f-b89a-d3c9843f3771" (UID: "04b87acd-bc56-443f-b89a-d3c9843f3771"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:17:18.711581 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.711477 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-model-cache\") pod \"04b87acd-bc56-443f-b89a-d3c9843f3771\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " Apr 16 14:17:18.711581 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.711537 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04b87acd-bc56-443f-b89a-d3c9843f3771-tls-certs\") pod \"04b87acd-bc56-443f-b89a-d3c9843f3771\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " Apr 16 14:17:18.711847 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.711658 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdhhj\" (UniqueName: \"kubernetes.io/projected/04b87acd-bc56-443f-b89a-d3c9843f3771-kube-api-access-zdhhj\") pod \"04b87acd-bc56-443f-b89a-d3c9843f3771\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " Apr 16 14:17:18.711847 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.711691 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-kserve-provision-location\") pod \"04b87acd-bc56-443f-b89a-d3c9843f3771\" (UID: \"04b87acd-bc56-443f-b89a-d3c9843f3771\") " Apr 16 14:17:18.711847 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.711772 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-model-cache" (OuterVolumeSpecName: "model-cache") pod "04b87acd-bc56-443f-b89a-d3c9843f3771" (UID: "04b87acd-bc56-443f-b89a-d3c9843f3771"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:17:18.712014 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.711982 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-home\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:17:18.712014 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.712001 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-dshm\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:17:18.712125 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.712015 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-model-cache\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:17:18.714367 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.714341 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b87acd-bc56-443f-b89a-d3c9843f3771-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "04b87acd-bc56-443f-b89a-d3c9843f3771" (UID: "04b87acd-bc56-443f-b89a-d3c9843f3771"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:17:18.714835 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.714806 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b87acd-bc56-443f-b89a-d3c9843f3771-kube-api-access-zdhhj" (OuterVolumeSpecName: "kube-api-access-zdhhj") pod "04b87acd-bc56-443f-b89a-d3c9843f3771" (UID: "04b87acd-bc56-443f-b89a-d3c9843f3771"). InnerVolumeSpecName "kube-api-access-zdhhj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:17:18.739592 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.739532 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "04b87acd-bc56-443f-b89a-d3c9843f3771" (UID: "04b87acd-bc56-443f-b89a-d3c9843f3771"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:17:18.812505 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.812465 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zdhhj\" (UniqueName: \"kubernetes.io/projected/04b87acd-bc56-443f-b89a-d3c9843f3771-kube-api-access-zdhhj\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:17:18.812505 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.812494 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04b87acd-bc56-443f-b89a-d3c9843f3771-kserve-provision-location\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:17:18.812505 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:18.812503 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04b87acd-bc56-443f-b89a-d3c9843f3771-tls-certs\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:17:19.259991 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:19.259943 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-6d89d568-drk6r_04b87acd-bc56-443f-b89a-d3c9843f3771/main/0.log" Apr 16 14:17:19.260355 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:19.260329 2580 generic.go:358] "Generic (PLEG): container finished" podID="04b87acd-bc56-443f-b89a-d3c9843f3771" 
containerID="4e1d8e0db4b92b115d1198e46c6540b9211629610c728b15379ec0d9ffe6309e" exitCode=137 Apr 16 14:17:19.260441 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:19.260390 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" event={"ID":"04b87acd-bc56-443f-b89a-d3c9843f3771","Type":"ContainerDied","Data":"4e1d8e0db4b92b115d1198e46c6540b9211629610c728b15379ec0d9ffe6309e"} Apr 16 14:17:19.260441 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:19.260416 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" Apr 16 14:17:19.260548 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:19.260443 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r" event={"ID":"04b87acd-bc56-443f-b89a-d3c9843f3771","Type":"ContainerDied","Data":"a5bb6b6f9f6beb07f49607af6f456a68d85abe1237b5d3135ab25406420aae69"} Apr 16 14:17:19.260548 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:19.260461 2580 scope.go:117] "RemoveContainer" containerID="4e1d8e0db4b92b115d1198e46c6540b9211629610c728b15379ec0d9ffe6309e" Apr 16 14:17:19.281975 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:19.281938 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r"] Apr 16 14:17:19.287592 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:19.287562 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-drk6r"] Apr 16 14:17:19.291224 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:19.291204 2580 scope.go:117] "RemoveContainer" containerID="995787a1ce0d241e6db28e5511701428d312bb6141453edb14e4a6be2af676e6" Apr 16 14:17:19.319925 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:19.319897 2580 scope.go:117] "RemoveContainer" 
containerID="4e1d8e0db4b92b115d1198e46c6540b9211629610c728b15379ec0d9ffe6309e" Apr 16 14:17:19.320484 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:17:19.320459 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e1d8e0db4b92b115d1198e46c6540b9211629610c728b15379ec0d9ffe6309e\": container with ID starting with 4e1d8e0db4b92b115d1198e46c6540b9211629610c728b15379ec0d9ffe6309e not found: ID does not exist" containerID="4e1d8e0db4b92b115d1198e46c6540b9211629610c728b15379ec0d9ffe6309e" Apr 16 14:17:19.320612 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:19.320493 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e1d8e0db4b92b115d1198e46c6540b9211629610c728b15379ec0d9ffe6309e"} err="failed to get container status \"4e1d8e0db4b92b115d1198e46c6540b9211629610c728b15379ec0d9ffe6309e\": rpc error: code = NotFound desc = could not find container \"4e1d8e0db4b92b115d1198e46c6540b9211629610c728b15379ec0d9ffe6309e\": container with ID starting with 4e1d8e0db4b92b115d1198e46c6540b9211629610c728b15379ec0d9ffe6309e not found: ID does not exist" Apr 16 14:17:19.320612 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:19.320513 2580 scope.go:117] "RemoveContainer" containerID="995787a1ce0d241e6db28e5511701428d312bb6141453edb14e4a6be2af676e6" Apr 16 14:17:19.320849 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:17:19.320832 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"995787a1ce0d241e6db28e5511701428d312bb6141453edb14e4a6be2af676e6\": container with ID starting with 995787a1ce0d241e6db28e5511701428d312bb6141453edb14e4a6be2af676e6 not found: ID does not exist" containerID="995787a1ce0d241e6db28e5511701428d312bb6141453edb14e4a6be2af676e6" Apr 16 14:17:19.320905 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:19.320855 2580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"995787a1ce0d241e6db28e5511701428d312bb6141453edb14e4a6be2af676e6"} err="failed to get container status \"995787a1ce0d241e6db28e5511701428d312bb6141453edb14e4a6be2af676e6\": rpc error: code = NotFound desc = could not find container \"995787a1ce0d241e6db28e5511701428d312bb6141453edb14e4a6be2af676e6\": container with ID starting with 995787a1ce0d241e6db28e5511701428d312bb6141453edb14e4a6be2af676e6 not found: ID does not exist" Apr 16 14:17:20.530286 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:20.530223 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" podUID="05862ae4-795b-4d25-9f4d-23586bfe097d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 14:17:21.154091 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:21.154051 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04b87acd-bc56-443f-b89a-d3c9843f3771" path="/var/lib/kubelet/pods/04b87acd-bc56-443f-b89a-d3c9843f3771/volumes" Apr 16 14:17:28.345446 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:28.345343 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" podUID="24a88095-6ade-4d31-8892-caa3a85d2ff2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 16 14:17:30.530118 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:30.530071 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" podUID="05862ae4-795b-4d25-9f4d-23586bfe097d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 14:17:38.346125 ip-10-0-129-3 kubenswrapper[2580]: I0416 
14:17:38.346077 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" podUID="24a88095-6ade-4d31-8892-caa3a85d2ff2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 16 14:17:40.529736 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:40.529684 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" podUID="05862ae4-795b-4d25-9f4d-23586bfe097d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 14:17:48.345718 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:48.345669 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" podUID="24a88095-6ade-4d31-8892-caa3a85d2ff2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 16 14:17:50.529534 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:50.529482 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" podUID="05862ae4-795b-4d25-9f4d-23586bfe097d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 16 14:17:58.346282 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:17:58.346230 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" podUID="24a88095-6ade-4d31-8892-caa3a85d2ff2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 16 14:18:00.539504 ip-10-0-129-3 kubenswrapper[2580]: I0416 
14:18:00.539472 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf"
Apr 16 14:18:00.547502 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:00.547478 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf"
Apr 16 14:18:08.345817 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:08.345767 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" podUID="24a88095-6ade-4d31-8892-caa3a85d2ff2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 16 14:18:10.628026 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:10.627987 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf"]
Apr 16 14:18:10.628515 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:10.628299 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" podUID="05862ae4-795b-4d25-9f4d-23586bfe097d" containerName="main" containerID="cri-o://c6d07c7db7ed76e6c0311aca30e6d883cf836e44a7ec23240e4dfa41bc8eef42" gracePeriod=30
Apr 16 14:18:16.526977 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.526943 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"]
Apr 16 14:18:16.527394 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.527348 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04b87acd-bc56-443f-b89a-d3c9843f3771" containerName="main"
Apr 16 14:18:16.527394 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.527361 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b87acd-bc56-443f-b89a-d3c9843f3771" containerName="main"
Apr 16 14:18:16.527394 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.527375 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04b87acd-bc56-443f-b89a-d3c9843f3771" containerName="storage-initializer"
Apr 16 14:18:16.527394 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.527381 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b87acd-bc56-443f-b89a-d3c9843f3771" containerName="storage-initializer"
Apr 16 14:18:16.527540 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.527454 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="04b87acd-bc56-443f-b89a-d3c9843f3771" containerName="main"
Apr 16 14:18:16.531934 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.531917 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:16.534442 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.534420 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\""
Apr 16 14:18:16.541852 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.541818 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"]
Apr 16 14:18:16.657795 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.657747 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-model-cache\") pod \"router-with-refs-test-kserve-575d96cb88-zfchz\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:16.657989 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.657831 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/03b5da37-52cc-4b28-aeda-ef529e65711d-tls-certs\") pod \"router-with-refs-test-kserve-575d96cb88-zfchz\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:16.657989 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.657867 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-kserve-provision-location\") pod \"router-with-refs-test-kserve-575d96cb88-zfchz\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:16.657989 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.657896 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-dshm\") pod \"router-with-refs-test-kserve-575d96cb88-zfchz\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:16.657989 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.657952 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl7hw\" (UniqueName: \"kubernetes.io/projected/03b5da37-52cc-4b28-aeda-ef529e65711d-kube-api-access-fl7hw\") pod \"router-with-refs-test-kserve-575d96cb88-zfchz\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:16.658138 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.658023 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-home\") pod \"router-with-refs-test-kserve-575d96cb88-zfchz\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:16.759545 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.759497 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-model-cache\") pod \"router-with-refs-test-kserve-575d96cb88-zfchz\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:16.759750 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.759589 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/03b5da37-52cc-4b28-aeda-ef529e65711d-tls-certs\") pod \"router-with-refs-test-kserve-575d96cb88-zfchz\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:16.759750 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.759623 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-kserve-provision-location\") pod \"router-with-refs-test-kserve-575d96cb88-zfchz\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:16.759750 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.759652 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-dshm\") pod \"router-with-refs-test-kserve-575d96cb88-zfchz\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:16.759750 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.759679 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fl7hw\" (UniqueName: \"kubernetes.io/projected/03b5da37-52cc-4b28-aeda-ef529e65711d-kube-api-access-fl7hw\") pod \"router-with-refs-test-kserve-575d96cb88-zfchz\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:16.759750 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.759741 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-home\") pod \"router-with-refs-test-kserve-575d96cb88-zfchz\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:16.760043 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.759971 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-model-cache\") pod \"router-with-refs-test-kserve-575d96cb88-zfchz\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:16.760100 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.760049 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-kserve-provision-location\") pod \"router-with-refs-test-kserve-575d96cb88-zfchz\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:16.760213 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.760182 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-home\") pod \"router-with-refs-test-kserve-575d96cb88-zfchz\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:16.762115 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.762095 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-dshm\") pod \"router-with-refs-test-kserve-575d96cb88-zfchz\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:16.762422 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.762400 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/03b5da37-52cc-4b28-aeda-ef529e65711d-tls-certs\") pod \"router-with-refs-test-kserve-575d96cb88-zfchz\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:16.768700 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.768677 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl7hw\" (UniqueName: \"kubernetes.io/projected/03b5da37-52cc-4b28-aeda-ef529e65711d-kube-api-access-fl7hw\") pod \"router-with-refs-test-kserve-575d96cb88-zfchz\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:16.843392 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.843291 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:16.981348 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:16.981323 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"]
Apr 16 14:18:16.983622 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:18:16.983578 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03b5da37_52cc_4b28_aeda_ef529e65711d.slice/crio-b2b547b2e2f11d6f439adb3efa8a245dc0c29eef2e5681368dc0618c4b5160b8 WatchSource:0}: Error finding container b2b547b2e2f11d6f439adb3efa8a245dc0c29eef2e5681368dc0618c4b5160b8: Status 404 returned error can't find the container with id b2b547b2e2f11d6f439adb3efa8a245dc0c29eef2e5681368dc0618c4b5160b8
Apr 16 14:18:17.508017 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:17.507977 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" event={"ID":"03b5da37-52cc-4b28-aeda-ef529e65711d","Type":"ContainerStarted","Data":"6ba9ca6a151a21934d7bfde2ea674ef0aca7c78c0ea1d2194dc38d7eccd0e333"}
Apr 16 14:18:17.508017 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:17.508024 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" event={"ID":"03b5da37-52cc-4b28-aeda-ef529e65711d","Type":"ContainerStarted","Data":"b2b547b2e2f11d6f439adb3efa8a245dc0c29eef2e5681368dc0618c4b5160b8"}
Apr 16 14:18:18.345843 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:18.345800 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" podUID="24a88095-6ade-4d31-8892-caa3a85d2ff2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 16 14:18:22.531800 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:22.531767 2580 generic.go:358] "Generic (PLEG): container finished" podID="03b5da37-52cc-4b28-aeda-ef529e65711d" containerID="6ba9ca6a151a21934d7bfde2ea674ef0aca7c78c0ea1d2194dc38d7eccd0e333" exitCode=0
Apr 16 14:18:22.532201 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:22.531844 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" event={"ID":"03b5da37-52cc-4b28-aeda-ef529e65711d","Type":"ContainerDied","Data":"6ba9ca6a151a21934d7bfde2ea674ef0aca7c78c0ea1d2194dc38d7eccd0e333"}
Apr 16 14:18:23.538103 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:23.538065 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" event={"ID":"03b5da37-52cc-4b28-aeda-ef529e65711d","Type":"ContainerStarted","Data":"c75094b2ef589837f31c1d1d5fa372c8a904c08a8d75e21abb5d548aabc75d6f"}
Apr 16 14:18:23.560009 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:23.559953 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" podStartSLOduration=7.559938187 podStartE2EDuration="7.559938187s" podCreationTimestamp="2026-04-16 14:18:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:18:23.55654637 +0000 UTC m=+1126.992712100" watchObservedRunningTime="2026-04-16 14:18:23.559938187 +0000 UTC m=+1126.996103974"
Apr 16 14:18:26.844131 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:26.844096 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:26.844851 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:26.844147 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"
Apr 16 14:18:26.846007 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:26.845977 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" podUID="03b5da37-52cc-4b28-aeda-ef529e65711d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused"
Apr 16 14:18:28.346024 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:28.345977 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" podUID="24a88095-6ade-4d31-8892-caa3a85d2ff2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 16 14:18:36.844057 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:36.843999 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" podUID="03b5da37-52cc-4b28-aeda-ef529e65711d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused"
Apr 16 14:18:38.345966 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:38.345919 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" podUID="24a88095-6ade-4d31-8892-caa3a85d2ff2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused"
Apr 16 14:18:40.920375 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:40.920340 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-9bf4d99d8-mm8cf_05862ae4-795b-4d25-9f4d-23586bfe097d/main/0.log"
Apr 16 14:18:40.920767 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:40.920750 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf"
Apr 16 14:18:41.096602 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.096561 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-model-cache\") pod \"05862ae4-795b-4d25-9f4d-23586bfe097d\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") "
Apr 16 14:18:41.096800 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.096614 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-kserve-provision-location\") pod \"05862ae4-795b-4d25-9f4d-23586bfe097d\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") "
Apr 16 14:18:41.096800 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.096705 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdtxj\" (UniqueName: \"kubernetes.io/projected/05862ae4-795b-4d25-9f4d-23586bfe097d-kube-api-access-sdtxj\") pod \"05862ae4-795b-4d25-9f4d-23586bfe097d\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") "
Apr 16 14:18:41.096800 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.096732 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05862ae4-795b-4d25-9f4d-23586bfe097d-tls-certs\") pod \"05862ae4-795b-4d25-9f4d-23586bfe097d\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") "
Apr 16 14:18:41.096984 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.096827 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-dshm\") pod \"05862ae4-795b-4d25-9f4d-23586bfe097d\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") "
Apr 16 14:18:41.096984 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.096867 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-home\") pod \"05862ae4-795b-4d25-9f4d-23586bfe097d\" (UID: \"05862ae4-795b-4d25-9f4d-23586bfe097d\") "
Apr 16 14:18:41.097213 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.097155 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-model-cache" (OuterVolumeSpecName: "model-cache") pod "05862ae4-795b-4d25-9f4d-23586bfe097d" (UID: "05862ae4-795b-4d25-9f4d-23586bfe097d"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:18:41.097565 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.097508 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-home" (OuterVolumeSpecName: "home") pod "05862ae4-795b-4d25-9f4d-23586bfe097d" (UID: "05862ae4-795b-4d25-9f4d-23586bfe097d"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:18:41.099136 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.099100 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-dshm" (OuterVolumeSpecName: "dshm") pod "05862ae4-795b-4d25-9f4d-23586bfe097d" (UID: "05862ae4-795b-4d25-9f4d-23586bfe097d"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:18:41.099316 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.099292 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05862ae4-795b-4d25-9f4d-23586bfe097d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "05862ae4-795b-4d25-9f4d-23586bfe097d" (UID: "05862ae4-795b-4d25-9f4d-23586bfe097d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:18:41.099700 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.099678 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05862ae4-795b-4d25-9f4d-23586bfe097d-kube-api-access-sdtxj" (OuterVolumeSpecName: "kube-api-access-sdtxj") pod "05862ae4-795b-4d25-9f4d-23586bfe097d" (UID: "05862ae4-795b-4d25-9f4d-23586bfe097d"). InnerVolumeSpecName "kube-api-access-sdtxj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:18:41.166932 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.166843 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "05862ae4-795b-4d25-9f4d-23586bfe097d" (UID: "05862ae4-795b-4d25-9f4d-23586bfe097d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:18:41.197818 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.197770 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-dshm\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:18:41.197818 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.197818 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-home\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:18:41.198014 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.197833 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-model-cache\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:18:41.198014 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.197850 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05862ae4-795b-4d25-9f4d-23586bfe097d-kserve-provision-location\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:18:41.198014 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.197865 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sdtxj\" (UniqueName: \"kubernetes.io/projected/05862ae4-795b-4d25-9f4d-23586bfe097d-kube-api-access-sdtxj\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:18:41.198014 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.197882 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05862ae4-795b-4d25-9f4d-23586bfe097d-tls-certs\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:18:41.616422 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.616399 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-9bf4d99d8-mm8cf_05862ae4-795b-4d25-9f4d-23586bfe097d/main/0.log"
Apr 16 14:18:41.616790 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.616763 2580 generic.go:358] "Generic (PLEG): container finished" podID="05862ae4-795b-4d25-9f4d-23586bfe097d" containerID="c6d07c7db7ed76e6c0311aca30e6d883cf836e44a7ec23240e4dfa41bc8eef42" exitCode=137
Apr 16 14:18:41.616876 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.616857 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf"
Apr 16 14:18:41.616876 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.616848 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" event={"ID":"05862ae4-795b-4d25-9f4d-23586bfe097d","Type":"ContainerDied","Data":"c6d07c7db7ed76e6c0311aca30e6d883cf836e44a7ec23240e4dfa41bc8eef42"}
Apr 16 14:18:41.616986 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.616902 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf" event={"ID":"05862ae4-795b-4d25-9f4d-23586bfe097d","Type":"ContainerDied","Data":"fb723e02c04990170ebd0c4f84f9871dd06aa4c818f41cc6de2a797029986b1e"}
Apr 16 14:18:41.616986 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.616925 2580 scope.go:117] "RemoveContainer" containerID="c6d07c7db7ed76e6c0311aca30e6d883cf836e44a7ec23240e4dfa41bc8eef42"
Apr 16 14:18:41.646117 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.646085 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf"]
Apr 16 14:18:41.650545 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.650489 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-9bf4d99d8-mm8cf"]
Apr 16 14:18:41.650900 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.650789 2580 scope.go:117] "RemoveContainer" containerID="f12b4f5af6124ef26734f909e9b62b41676f6864e65912359bbf0b0977cc661b"
Apr 16 14:18:41.718564 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.718539 2580 scope.go:117] "RemoveContainer" containerID="c6d07c7db7ed76e6c0311aca30e6d883cf836e44a7ec23240e4dfa41bc8eef42"
Apr 16 14:18:41.718929 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:18:41.718910 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6d07c7db7ed76e6c0311aca30e6d883cf836e44a7ec23240e4dfa41bc8eef42\": container with ID starting with c6d07c7db7ed76e6c0311aca30e6d883cf836e44a7ec23240e4dfa41bc8eef42 not found: ID does not exist" containerID="c6d07c7db7ed76e6c0311aca30e6d883cf836e44a7ec23240e4dfa41bc8eef42"
Apr 16 14:18:41.718992 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.718939 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6d07c7db7ed76e6c0311aca30e6d883cf836e44a7ec23240e4dfa41bc8eef42"} err="failed to get container status \"c6d07c7db7ed76e6c0311aca30e6d883cf836e44a7ec23240e4dfa41bc8eef42\": rpc error: code = NotFound desc = could not find container \"c6d07c7db7ed76e6c0311aca30e6d883cf836e44a7ec23240e4dfa41bc8eef42\": container with ID starting with c6d07c7db7ed76e6c0311aca30e6d883cf836e44a7ec23240e4dfa41bc8eef42 not found: ID does not exist"
Apr 16 14:18:41.718992 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.718959 2580 scope.go:117] "RemoveContainer" containerID="f12b4f5af6124ef26734f909e9b62b41676f6864e65912359bbf0b0977cc661b"
Apr 16 14:18:41.719287 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:18:41.719247 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f12b4f5af6124ef26734f909e9b62b41676f6864e65912359bbf0b0977cc661b\": container with ID starting with f12b4f5af6124ef26734f909e9b62b41676f6864e65912359bbf0b0977cc661b not found: ID does not exist" containerID="f12b4f5af6124ef26734f909e9b62b41676f6864e65912359bbf0b0977cc661b"
Apr 16 14:18:41.719369 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:41.719297 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f12b4f5af6124ef26734f909e9b62b41676f6864e65912359bbf0b0977cc661b"} err="failed to get container status \"f12b4f5af6124ef26734f909e9b62b41676f6864e65912359bbf0b0977cc661b\": rpc error: code = NotFound desc = could not find container \"f12b4f5af6124ef26734f909e9b62b41676f6864e65912359bbf0b0977cc661b\": container with ID starting with f12b4f5af6124ef26734f909e9b62b41676f6864e65912359bbf0b0977cc661b not found: ID does not exist"
Apr 16 14:18:43.152937 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:43.152906 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05862ae4-795b-4d25-9f4d-23586bfe097d" path="/var/lib/kubelet/pods/05862ae4-795b-4d25-9f4d-23586bfe097d/volumes"
Apr 16 14:18:46.844479 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:46.844422 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" podUID="03b5da37-52cc-4b28-aeda-ef529e65711d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused"
Apr 16 14:18:48.356091 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:48.356059 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb"
Apr 16 14:18:48.363946 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:48.363917 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb"
Apr 16 14:18:50.516603 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:50.516562 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb"]
Apr 16 14:18:50.517631 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:50.517579 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" podUID="24a88095-6ade-4d31-8892-caa3a85d2ff2" containerName="main" containerID="cri-o://8dac90c7be356b05e7dc1dc70d24675eea720002d7118f81c061256788c8528e" gracePeriod=30
Apr 16 14:18:56.844067 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:18:56.843969 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" podUID="03b5da37-52cc-4b28-aeda-ef529e65711d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused"
Apr 16 14:19:06.843857 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:06.843809 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" podUID="03b5da37-52cc-4b28-aeda-ef529e65711d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused"
Apr 16 14:19:16.844606 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:16.844561 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" podUID="03b5da37-52cc-4b28-aeda-ef529e65711d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused"
Apr 16 14:19:20.796040 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:20.795955 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-6d89d568-grzzb_24a88095-6ade-4d31-8892-caa3a85d2ff2/main/0.log"
Apr 16 14:19:20.796557 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:20.796349 2580 generic.go:358] "Generic (PLEG): container finished" podID="24a88095-6ade-4d31-8892-caa3a85d2ff2" containerID="8dac90c7be356b05e7dc1dc70d24675eea720002d7118f81c061256788c8528e" exitCode=137
Apr 16 14:19:20.796557 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:20.796438 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" event={"ID":"24a88095-6ade-4d31-8892-caa3a85d2ff2","Type":"ContainerDied","Data":"8dac90c7be356b05e7dc1dc70d24675eea720002d7118f81c061256788c8528e"}
Apr 16 14:19:20.827882 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:20.827861 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-6d89d568-grzzb_24a88095-6ade-4d31-8892-caa3a85d2ff2/main/0.log"
Apr 16 14:19:20.828286 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:20.828248 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb"
Apr 16 14:19:20.957999 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:20.957956 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqwck\" (UniqueName: \"kubernetes.io/projected/24a88095-6ade-4d31-8892-caa3a85d2ff2-kube-api-access-bqwck\") pod \"24a88095-6ade-4d31-8892-caa3a85d2ff2\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") "
Apr 16 14:19:20.958194 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:20.958009 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-kserve-provision-location\") pod \"24a88095-6ade-4d31-8892-caa3a85d2ff2\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") "
Apr 16 14:19:20.958194 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:20.958033 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/24a88095-6ade-4d31-8892-caa3a85d2ff2-tls-certs\") pod \"24a88095-6ade-4d31-8892-caa3a85d2ff2\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") "
Apr 16 14:19:20.958194 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:20.958110 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-dshm\") pod \"24a88095-6ade-4d31-8892-caa3a85d2ff2\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") "
Apr 16 14:19:20.958194 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:20.958167 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-home\") pod \"24a88095-6ade-4d31-8892-caa3a85d2ff2\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") "
Apr 16 14:19:20.958194 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:20.958194 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-model-cache\") pod \"24a88095-6ade-4d31-8892-caa3a85d2ff2\" (UID: \"24a88095-6ade-4d31-8892-caa3a85d2ff2\") "
Apr 16 14:19:20.958643 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:20.958616 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-model-cache" (OuterVolumeSpecName: "model-cache") pod "24a88095-6ade-4d31-8892-caa3a85d2ff2" (UID: "24a88095-6ade-4d31-8892-caa3a85d2ff2"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:19:20.958812 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:20.958634 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-home" (OuterVolumeSpecName: "home") pod "24a88095-6ade-4d31-8892-caa3a85d2ff2" (UID: "24a88095-6ade-4d31-8892-caa3a85d2ff2"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:19:20.960935 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:20.960898 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-dshm" (OuterVolumeSpecName: "dshm") pod "24a88095-6ade-4d31-8892-caa3a85d2ff2" (UID: "24a88095-6ade-4d31-8892-caa3a85d2ff2"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:19:20.961046 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:20.960972 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a88095-6ade-4d31-8892-caa3a85d2ff2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "24a88095-6ade-4d31-8892-caa3a85d2ff2" (UID: "24a88095-6ade-4d31-8892-caa3a85d2ff2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:19:20.961046 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:20.960985 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a88095-6ade-4d31-8892-caa3a85d2ff2-kube-api-access-bqwck" (OuterVolumeSpecName: "kube-api-access-bqwck") pod "24a88095-6ade-4d31-8892-caa3a85d2ff2" (UID: "24a88095-6ade-4d31-8892-caa3a85d2ff2"). InnerVolumeSpecName "kube-api-access-bqwck".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:19:21.013503 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:21.013461 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "24a88095-6ade-4d31-8892-caa3a85d2ff2" (UID: "24a88095-6ade-4d31-8892-caa3a85d2ff2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:19:21.059373 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:21.059288 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-dshm\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:19:21.059373 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:21.059318 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-home\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:19:21.059373 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:21.059328 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-model-cache\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:19:21.059373 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:21.059339 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bqwck\" (UniqueName: \"kubernetes.io/projected/24a88095-6ade-4d31-8892-caa3a85d2ff2-kube-api-access-bqwck\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:19:21.059373 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:21.059350 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/24a88095-6ade-4d31-8892-caa3a85d2ff2-kserve-provision-location\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:19:21.059373 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:21.059363 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/24a88095-6ade-4d31-8892-caa3a85d2ff2-tls-certs\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:19:21.803109 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:21.803073 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-6d89d568-grzzb_24a88095-6ade-4d31-8892-caa3a85d2ff2/main/0.log" Apr 16 14:19:21.803647 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:21.803492 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" event={"ID":"24a88095-6ade-4d31-8892-caa3a85d2ff2","Type":"ContainerDied","Data":"abeb8f3280da521ccc67e62d4fdb8f458958ecef2e113c84759e172489832174"} Apr 16 14:19:21.803647 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:21.803543 2580 scope.go:117] "RemoveContainer" containerID="8dac90c7be356b05e7dc1dc70d24675eea720002d7118f81c061256788c8528e" Apr 16 14:19:21.803647 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:21.803546 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb" Apr 16 14:19:21.824299 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:21.824261 2580 scope.go:117] "RemoveContainer" containerID="dac18a23d3646e37eaf6ae7d1ab513b02823c4f8371a80e2c4f4c0b419d1ad27" Apr 16 14:19:21.824937 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:21.824913 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb"] Apr 16 14:19:21.828943 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:21.828913 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6d89d568-grzzb"] Apr 16 14:19:23.153221 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:23.153188 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24a88095-6ade-4d31-8892-caa3a85d2ff2" path="/var/lib/kubelet/pods/24a88095-6ade-4d31-8892-caa3a85d2ff2/volumes" Apr 16 14:19:26.844121 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:26.844071 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" podUID="03b5da37-52cc-4b28-aeda-ef529e65711d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 16 14:19:36.844702 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:36.844655 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" podUID="03b5da37-52cc-4b28-aeda-ef529e65711d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 16 14:19:37.131497 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:37.131469 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-mqmmd_49030659-7d98-49ee-844f-41ff4d22d449/console-operator/1.log" Apr 16 14:19:37.131701 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:37.131535 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-mqmmd_49030659-7d98-49ee-844f-41ff4d22d449/console-operator/1.log" Apr 16 14:19:46.843966 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:46.843922 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" podUID="03b5da37-52cc-4b28-aeda-ef529e65711d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 16 14:19:56.844393 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:19:56.844342 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" podUID="03b5da37-52cc-4b28-aeda-ef529e65711d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 16 14:20:00.085694 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.085656 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd"] Apr 16 14:20:00.086380 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.086113 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24a88095-6ade-4d31-8892-caa3a85d2ff2" containerName="main" Apr 16 14:20:00.086380 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.086131 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a88095-6ade-4d31-8892-caa3a85d2ff2" containerName="main" Apr 16 14:20:00.086380 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.086145 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="24a88095-6ade-4d31-8892-caa3a85d2ff2" containerName="storage-initializer" Apr 16 14:20:00.086380 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.086156 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a88095-6ade-4d31-8892-caa3a85d2ff2" containerName="storage-initializer" Apr 16 14:20:00.086380 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.086176 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05862ae4-795b-4d25-9f4d-23586bfe097d" containerName="storage-initializer" Apr 16 14:20:00.086380 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.086184 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="05862ae4-795b-4d25-9f4d-23586bfe097d" containerName="storage-initializer" Apr 16 14:20:00.086380 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.086209 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05862ae4-795b-4d25-9f4d-23586bfe097d" containerName="main" Apr 16 14:20:00.086380 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.086217 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="05862ae4-795b-4d25-9f4d-23586bfe097d" containerName="main" Apr 16 14:20:00.086380 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.086319 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="05862ae4-795b-4d25-9f4d-23586bfe097d" containerName="main" Apr 16 14:20:00.086380 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.086334 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="24a88095-6ade-4d31-8892-caa3a85d2ff2" containerName="main" Apr 16 14:20:00.089839 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.089818 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:00.092215 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.092194 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\"" Apr 16 14:20:00.099628 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.099601 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd"] Apr 16 14:20:00.208600 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.208552 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:00.208600 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.208600 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:00.208829 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.208629 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:00.208829 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.208654 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:00.208829 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.208734 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:00.208829 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.208785 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgswc\" (UniqueName: \"kubernetes.io/projected/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-kube-api-access-fgswc\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:00.309316 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.309282 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:00.309519 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.309335 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:00.309519 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.309368 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:00.309519 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.309390 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:00.309519 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.309449 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:00.309519 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.309506 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgswc\" (UniqueName: \"kubernetes.io/projected/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-kube-api-access-fgswc\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:00.309814 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.309658 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:00.309814 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.309775 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:00.309920 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.309895 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:00.311845 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.311819 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:00.312617 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.312596 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:00.317815 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.317790 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgswc\" (UniqueName: \"kubernetes.io/projected/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-kube-api-access-fgswc\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:00.402760 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.402718 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:00.549225 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.549191 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd"] Apr 16 14:20:00.552693 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:20:00.552546 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ce643ef_ffb2_4bae_8b9f_03a76bcddada.slice/crio-fcf5819b3fcc3d82ae0bf9a14ece382ddbcc1e6f78d1b7365f4da55e323f4e6e WatchSource:0}: Error finding container fcf5819b3fcc3d82ae0bf9a14ece382ddbcc1e6f78d1b7365f4da55e323f4e6e: Status 404 returned error can't find the container with id fcf5819b3fcc3d82ae0bf9a14ece382ddbcc1e6f78d1b7365f4da55e323f4e6e Apr 16 14:20:00.971325 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.971260 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" event={"ID":"5ce643ef-ffb2-4bae-8b9f-03a76bcddada","Type":"ContainerStarted","Data":"10c631fc2cacb8d5bb6f4da0a1492e0056dd4c908a56dbcdff16c3be9c7f7b8b"} Apr 16 14:20:00.971522 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:00.971335 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" event={"ID":"5ce643ef-ffb2-4bae-8b9f-03a76bcddada","Type":"ContainerStarted","Data":"fcf5819b3fcc3d82ae0bf9a14ece382ddbcc1e6f78d1b7365f4da55e323f4e6e"} Apr 16 14:20:05.993336 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:05.993301 2580 generic.go:358] "Generic (PLEG): container finished" podID="5ce643ef-ffb2-4bae-8b9f-03a76bcddada" containerID="10c631fc2cacb8d5bb6f4da0a1492e0056dd4c908a56dbcdff16c3be9c7f7b8b" exitCode=0 Apr 16 14:20:05.993735 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:05.993364 
2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" event={"ID":"5ce643ef-ffb2-4bae-8b9f-03a76bcddada","Type":"ContainerDied","Data":"10c631fc2cacb8d5bb6f4da0a1492e0056dd4c908a56dbcdff16c3be9c7f7b8b"} Apr 16 14:20:06.858609 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:06.858577 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" Apr 16 14:20:06.866667 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:06.866636 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" Apr 16 14:20:07.000811 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:07.000774 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" event={"ID":"5ce643ef-ffb2-4bae-8b9f-03a76bcddada","Type":"ContainerStarted","Data":"8e764cfe6e761b131784422b1eaef736964abc607ca87255f89ee167e1ef3c6d"} Apr 16 14:20:07.023550 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:07.023497 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" podStartSLOduration=7.023483299 podStartE2EDuration="7.023483299s" podCreationTimestamp="2026-04-16 14:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:20:07.021861905 +0000 UTC m=+1230.458027648" watchObservedRunningTime="2026-04-16 14:20:07.023483299 +0000 UTC m=+1230.459649027" Apr 16 14:20:10.403769 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:10.403717 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 
16 14:20:10.403769 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:10.403772 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:20:10.405400 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:10.405369 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" podUID="5ce643ef-ffb2-4bae-8b9f-03a76bcddada" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 16 14:20:20.403326 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:20.403254 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" podUID="5ce643ef-ffb2-4bae-8b9f-03a76bcddada" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 16 14:20:30.404078 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:30.403981 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" podUID="5ce643ef-ffb2-4bae-8b9f-03a76bcddada" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 16 14:20:37.182380 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:37.182339 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"] Apr 16 14:20:37.182979 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:37.182749 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" podUID="03b5da37-52cc-4b28-aeda-ef529e65711d" containerName="main" 
containerID="cri-o://c75094b2ef589837f31c1d1d5fa372c8a904c08a8d75e21abb5d548aabc75d6f" gracePeriod=30 Apr 16 14:20:40.403386 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:40.403326 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" podUID="5ce643ef-ffb2-4bae-8b9f-03a76bcddada" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 16 14:20:46.618946 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.618913 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6"] Apr 16 14:20:46.623924 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.623905 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:46.626800 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.626776 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 16 14:20:46.626917 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.626835 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-j4jks\"" Apr 16 14:20:46.644613 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.644572 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6"] Apr 16 14:20:46.655335 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.655299 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz"] Apr 16 14:20:46.660152 ip-10-0-129-3 kubenswrapper[2580]: I0416 
14:20:46.660126 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz"] Apr 16 14:20:46.660364 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.660338 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:46.723185 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.723145 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:46.723185 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.723190 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:46.723512 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.723220 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:46.723512 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.723243 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8b077235-88b5-477f-b8ec-a4237199000b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:46.723512 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.723302 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:46.723512 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.723328 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q686z\" (UniqueName: \"kubernetes.io/projected/bc1a5e65-b225-4602-8640-f730e5adfa23-kube-api-access-q686z\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:46.723512 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.723355 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjkmj\" (UniqueName: \"kubernetes.io/projected/8b077235-88b5-477f-b8ec-a4237199000b-kube-api-access-mjkmj\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:46.723512 ip-10-0-129-3 kubenswrapper[2580]: 
I0416 14:20:46.723377 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:46.723512 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.723433 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bc1a5e65-b225-4602-8640-f730e5adfa23-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:46.723512 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.723457 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:46.723512 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.723482 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:46.723927 ip-10-0-129-3 kubenswrapper[2580]: 
I0416 14:20:46.723534 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:46.824703 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.824665 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:46.824901 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.824715 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:46.824901 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.824743 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8b077235-88b5-477f-b8ec-a4237199000b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:46.824901 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.824774 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:46.824901 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.824803 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q686z\" (UniqueName: \"kubernetes.io/projected/bc1a5e65-b225-4602-8640-f730e5adfa23-kube-api-access-q686z\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:46.824901 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.824831 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjkmj\" (UniqueName: \"kubernetes.io/projected/8b077235-88b5-477f-b8ec-a4237199000b-kube-api-access-mjkmj\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:46.825218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.825075 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:46.825218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.825116 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" 
(UniqueName: \"kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:46.825218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.825130 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bc1a5e65-b225-4602-8640-f730e5adfa23-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:46.825218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.825159 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:46.825218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.825190 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:46.825218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.825209 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:46.825588 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.825224 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:46.825588 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.825385 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:46.825588 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.825420 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:46.825762 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.825554 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-model-cache\") pod 
\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:46.825762 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.825700 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:46.826073 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.826046 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:46.827891 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.827860 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8b077235-88b5-477f-b8ec-a4237199000b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:46.827984 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.827927 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:46.828160 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.828135 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bc1a5e65-b225-4602-8640-f730e5adfa23-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:46.828301 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.828195 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:46.833041 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.833011 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjkmj\" (UniqueName: \"kubernetes.io/projected/8b077235-88b5-477f-b8ec-a4237199000b-kube-api-access-mjkmj\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:46.833041 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.833032 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q686z\" (UniqueName: \"kubernetes.io/projected/bc1a5e65-b225-4602-8640-f730e5adfa23-kube-api-access-q686z\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 
14:20:46.934033 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.933949 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:46.975758 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:46.974850 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:47.091703 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:20:47.091667 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b077235_88b5_477f_b8ec_a4237199000b.slice/crio-cbc5f6f922071c91397c8d971d7960be46a1854795d8e5b05ac6772192590401 WatchSource:0}: Error finding container cbc5f6f922071c91397c8d971d7960be46a1854795d8e5b05ac6772192590401: Status 404 returned error can't find the container with id cbc5f6f922071c91397c8d971d7960be46a1854795d8e5b05ac6772192590401 Apr 16 14:20:47.092218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:47.092191 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6"] Apr 16 14:20:47.136804 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:47.136780 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz"] Apr 16 14:20:47.139379 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:20:47.139350 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc1a5e65_b225_4602_8640_f730e5adfa23.slice/crio-6ae543c4a1c4ea910bc4d2bc79a7080fdd28852b6592b755c50fd24f89271841 WatchSource:0}: Error finding container 6ae543c4a1c4ea910bc4d2bc79a7080fdd28852b6592b755c50fd24f89271841: Status 404 returned error can't find the container with id 
6ae543c4a1c4ea910bc4d2bc79a7080fdd28852b6592b755c50fd24f89271841 Apr 16 14:20:47.165346 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:47.165299 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" event={"ID":"8b077235-88b5-477f-b8ec-a4237199000b","Type":"ContainerStarted","Data":"cbc5f6f922071c91397c8d971d7960be46a1854795d8e5b05ac6772192590401"} Apr 16 14:20:47.166486 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:47.166457 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" event={"ID":"bc1a5e65-b225-4602-8640-f730e5adfa23","Type":"ContainerStarted","Data":"6ae543c4a1c4ea910bc4d2bc79a7080fdd28852b6592b755c50fd24f89271841"} Apr 16 14:20:48.172692 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:48.172651 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" event={"ID":"8b077235-88b5-477f-b8ec-a4237199000b","Type":"ContainerStarted","Data":"4656ad66ec5d1bfe5d3ed464c6454715ab09483e8ea6a855673ec548fab9c349"} Apr 16 14:20:48.173160 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:48.172784 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:48.174239 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:48.174214 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" event={"ID":"bc1a5e65-b225-4602-8640-f730e5adfa23","Type":"ContainerStarted","Data":"dd2862d48adf40136d459b440b3e051cc4ddf3f04f49ba685b91d679380cbc8f"} Apr 16 14:20:49.182639 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:49.182587 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" event={"ID":"8b077235-88b5-477f-b8ec-a4237199000b","Type":"ContainerStarted","Data":"07df028b4e334787098882b214bfa432788f95d375aff024daf7984ba65b1ed5"} Apr 16 14:20:50.403527 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:50.403475 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" podUID="5ce643ef-ffb2-4bae-8b9f-03a76bcddada" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 16 14:20:52.203099 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:52.203062 2580 generic.go:358] "Generic (PLEG): container finished" podID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerID="dd2862d48adf40136d459b440b3e051cc4ddf3f04f49ba685b91d679380cbc8f" exitCode=0 Apr 16 14:20:52.203508 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:52.203155 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" event={"ID":"bc1a5e65-b225-4602-8640-f730e5adfa23","Type":"ContainerDied","Data":"dd2862d48adf40136d459b440b3e051cc4ddf3f04f49ba685b91d679380cbc8f"} Apr 16 14:20:53.209806 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:53.209765 2580 generic.go:358] "Generic (PLEG): container finished" podID="8b077235-88b5-477f-b8ec-a4237199000b" containerID="07df028b4e334787098882b214bfa432788f95d375aff024daf7984ba65b1ed5" exitCode=0 Apr 16 14:20:53.210315 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:53.209837 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" event={"ID":"8b077235-88b5-477f-b8ec-a4237199000b","Type":"ContainerDied","Data":"07df028b4e334787098882b214bfa432788f95d375aff024daf7984ba65b1ed5"} Apr 16 14:20:53.211800 ip-10-0-129-3 
kubenswrapper[2580]: I0416 14:20:53.211776 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" event={"ID":"bc1a5e65-b225-4602-8640-f730e5adfa23","Type":"ContainerStarted","Data":"82e408eb5828bfa87f8db849f48e9ae5602fb742df754635a99a14efc69c4e17"} Apr 16 14:20:53.254482 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:53.254425 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" podStartSLOduration=7.254405906 podStartE2EDuration="7.254405906s" podCreationTimestamp="2026-04-16 14:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:20:53.250498887 +0000 UTC m=+1276.686664617" watchObservedRunningTime="2026-04-16 14:20:53.254405906 +0000 UTC m=+1276.690571637" Apr 16 14:20:54.218977 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:54.218931 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" event={"ID":"8b077235-88b5-477f-b8ec-a4237199000b","Type":"ContainerStarted","Data":"4cd3fb7222d6f0fe05d1e24254c73bd421dc5d31adf484b0c0d7c8757e554fa3"} Apr 16 14:20:54.244021 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:54.243796 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podStartSLOduration=7.277414415 podStartE2EDuration="8.243777882s" podCreationTimestamp="2026-04-16 14:20:46 +0000 UTC" firstStartedPulling="2026-04-16 14:20:47.094334996 +0000 UTC m=+1270.530500716" lastFinishedPulling="2026-04-16 14:20:48.060698474 +0000 UTC m=+1271.496864183" observedRunningTime="2026-04-16 14:20:54.241342479 +0000 UTC m=+1277.677508209" watchObservedRunningTime="2026-04-16 
14:20:54.243777882 +0000 UTC m=+1277.679943610" Apr 16 14:20:56.934390 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:56.934342 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:56.934862 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:56.934414 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:20:56.936134 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:56.936094 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 16 14:20:56.975605 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:56.975567 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:56.975605 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:56.975613 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:20:56.977487 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:20:56.977450 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 16 14:21:00.403536 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:00.403470 2580 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" podUID="5ce643ef-ffb2-4bae-8b9f-03a76bcddada" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 16 14:21:06.934904 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:06.934851 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 16 14:21:06.976783 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:06.976734 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 16 14:21:07.250237 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:07.250199 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:21:07.603681 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:07.603654 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-575d96cb88-zfchz_03b5da37-52cc-4b28-aeda-ef529e65711d/main/0.log" Apr 16 14:21:07.604090 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:07.604071 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" Apr 16 14:21:07.735324 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:07.735214 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-kserve-provision-location\") pod \"03b5da37-52cc-4b28-aeda-ef529e65711d\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " Apr 16 14:21:07.735324 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:07.735309 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/03b5da37-52cc-4b28-aeda-ef529e65711d-tls-certs\") pod \"03b5da37-52cc-4b28-aeda-ef529e65711d\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " Apr 16 14:21:07.735553 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:07.735379 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl7hw\" (UniqueName: \"kubernetes.io/projected/03b5da37-52cc-4b28-aeda-ef529e65711d-kube-api-access-fl7hw\") pod \"03b5da37-52cc-4b28-aeda-ef529e65711d\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " Apr 16 14:21:07.735553 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:07.735440 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-dshm\") pod \"03b5da37-52cc-4b28-aeda-ef529e65711d\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " Apr 16 14:21:07.735553 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:07.735473 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-model-cache\") pod \"03b5da37-52cc-4b28-aeda-ef529e65711d\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " Apr 16 14:21:07.735553 ip-10-0-129-3 
kubenswrapper[2580]: I0416 14:21:07.735501 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-home\") pod \"03b5da37-52cc-4b28-aeda-ef529e65711d\" (UID: \"03b5da37-52cc-4b28-aeda-ef529e65711d\") " Apr 16 14:21:07.736323 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:07.735952 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-model-cache" (OuterVolumeSpecName: "model-cache") pod "03b5da37-52cc-4b28-aeda-ef529e65711d" (UID: "03b5da37-52cc-4b28-aeda-ef529e65711d"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:21:07.736323 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:07.736250 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-home" (OuterVolumeSpecName: "home") pod "03b5da37-52cc-4b28-aeda-ef529e65711d" (UID: "03b5da37-52cc-4b28-aeda-ef529e65711d"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:21:07.738442 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:07.738408 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03b5da37-52cc-4b28-aeda-ef529e65711d-kube-api-access-fl7hw" (OuterVolumeSpecName: "kube-api-access-fl7hw") pod "03b5da37-52cc-4b28-aeda-ef529e65711d" (UID: "03b5da37-52cc-4b28-aeda-ef529e65711d"). InnerVolumeSpecName "kube-api-access-fl7hw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:21:07.738544 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:07.738435 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-dshm" (OuterVolumeSpecName: "dshm") pod "03b5da37-52cc-4b28-aeda-ef529e65711d" (UID: "03b5da37-52cc-4b28-aeda-ef529e65711d"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:21:07.738544 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:07.738491 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b5da37-52cc-4b28-aeda-ef529e65711d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "03b5da37-52cc-4b28-aeda-ef529e65711d" (UID: "03b5da37-52cc-4b28-aeda-ef529e65711d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:21:07.781904 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:07.781864 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "03b5da37-52cc-4b28-aeda-ef529e65711d" (UID: "03b5da37-52cc-4b28-aeda-ef529e65711d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:21:07.836459 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:07.836417 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/03b5da37-52cc-4b28-aeda-ef529e65711d-tls-certs\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:21:07.836459 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:07.836454 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fl7hw\" (UniqueName: \"kubernetes.io/projected/03b5da37-52cc-4b28-aeda-ef529e65711d-kube-api-access-fl7hw\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:21:07.836459 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:07.836466 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-dshm\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:21:07.836774 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:07.836475 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-model-cache\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:21:07.836774 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:07.836484 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-home\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:21:07.836774 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:07.836493 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03b5da37-52cc-4b28-aeda-ef529e65711d-kserve-provision-location\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:21:08.288610 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:08.288577 2580 log.go:25] "Finished parsing log 
file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-575d96cb88-zfchz_03b5da37-52cc-4b28-aeda-ef529e65711d/main/0.log" Apr 16 14:21:08.289083 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:08.288998 2580 generic.go:358] "Generic (PLEG): container finished" podID="03b5da37-52cc-4b28-aeda-ef529e65711d" containerID="c75094b2ef589837f31c1d1d5fa372c8a904c08a8d75e21abb5d548aabc75d6f" exitCode=137 Apr 16 14:21:08.289147 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:08.289096 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" Apr 16 14:21:08.289191 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:08.289090 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" event={"ID":"03b5da37-52cc-4b28-aeda-ef529e65711d","Type":"ContainerDied","Data":"c75094b2ef589837f31c1d1d5fa372c8a904c08a8d75e21abb5d548aabc75d6f"} Apr 16 14:21:08.289236 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:08.289218 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz" event={"ID":"03b5da37-52cc-4b28-aeda-ef529e65711d","Type":"ContainerDied","Data":"b2b547b2e2f11d6f439adb3efa8a245dc0c29eef2e5681368dc0618c4b5160b8"} Apr 16 14:21:08.289302 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:08.289242 2580 scope.go:117] "RemoveContainer" containerID="c75094b2ef589837f31c1d1d5fa372c8a904c08a8d75e21abb5d548aabc75d6f" Apr 16 14:21:08.315175 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:08.315149 2580 scope.go:117] "RemoveContainer" containerID="6ba9ca6a151a21934d7bfde2ea674ef0aca7c78c0ea1d2194dc38d7eccd0e333" Apr 16 14:21:08.320125 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:08.320062 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"] Apr 16 14:21:08.324026 ip-10-0-129-3 
kubenswrapper[2580]: I0416 14:21:08.324000 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-575d96cb88-zfchz"] Apr 16 14:21:08.362818 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:08.362791 2580 scope.go:117] "RemoveContainer" containerID="c75094b2ef589837f31c1d1d5fa372c8a904c08a8d75e21abb5d548aabc75d6f" Apr 16 14:21:08.363223 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:21:08.363199 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c75094b2ef589837f31c1d1d5fa372c8a904c08a8d75e21abb5d548aabc75d6f\": container with ID starting with c75094b2ef589837f31c1d1d5fa372c8a904c08a8d75e21abb5d548aabc75d6f not found: ID does not exist" containerID="c75094b2ef589837f31c1d1d5fa372c8a904c08a8d75e21abb5d548aabc75d6f" Apr 16 14:21:08.363393 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:08.363231 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c75094b2ef589837f31c1d1d5fa372c8a904c08a8d75e21abb5d548aabc75d6f"} err="failed to get container status \"c75094b2ef589837f31c1d1d5fa372c8a904c08a8d75e21abb5d548aabc75d6f\": rpc error: code = NotFound desc = could not find container \"c75094b2ef589837f31c1d1d5fa372c8a904c08a8d75e21abb5d548aabc75d6f\": container with ID starting with c75094b2ef589837f31c1d1d5fa372c8a904c08a8d75e21abb5d548aabc75d6f not found: ID does not exist" Apr 16 14:21:08.363393 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:08.363252 2580 scope.go:117] "RemoveContainer" containerID="6ba9ca6a151a21934d7bfde2ea674ef0aca7c78c0ea1d2194dc38d7eccd0e333" Apr 16 14:21:08.363603 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:21:08.363578 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ba9ca6a151a21934d7bfde2ea674ef0aca7c78c0ea1d2194dc38d7eccd0e333\": container with ID starting with 
6ba9ca6a151a21934d7bfde2ea674ef0aca7c78c0ea1d2194dc38d7eccd0e333 not found: ID does not exist" containerID="6ba9ca6a151a21934d7bfde2ea674ef0aca7c78c0ea1d2194dc38d7eccd0e333" Apr 16 14:21:08.363734 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:08.363609 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ba9ca6a151a21934d7bfde2ea674ef0aca7c78c0ea1d2194dc38d7eccd0e333"} err="failed to get container status \"6ba9ca6a151a21934d7bfde2ea674ef0aca7c78c0ea1d2194dc38d7eccd0e333\": rpc error: code = NotFound desc = could not find container \"6ba9ca6a151a21934d7bfde2ea674ef0aca7c78c0ea1d2194dc38d7eccd0e333\": container with ID starting with 6ba9ca6a151a21934d7bfde2ea674ef0aca7c78c0ea1d2194dc38d7eccd0e333 not found: ID does not exist" Apr 16 14:21:09.154254 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:09.154209 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03b5da37-52cc-4b28-aeda-ef529e65711d" path="/var/lib/kubelet/pods/03b5da37-52cc-4b28-aeda-ef529e65711d/volumes" Apr 16 14:21:10.403805 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:10.403743 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" podUID="5ce643ef-ffb2-4bae-8b9f-03a76bcddada" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 16 14:21:16.934995 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:16.934945 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 16 14:21:16.976446 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:16.976393 2580 prober.go:120] "Probe 
failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 16 14:21:20.403490 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:20.403445 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" podUID="5ce643ef-ffb2-4bae-8b9f-03a76bcddada" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 16 14:21:26.934788 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:26.934715 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 16 14:21:26.975952 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:26.975901 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 16 14:21:30.403670 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:30.403616 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" podUID="5ce643ef-ffb2-4bae-8b9f-03a76bcddada" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 16 
14:21:36.934832 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:36.934781 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 16 14:21:36.975771 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:36.975729 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 16 14:21:40.403917 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:40.403870 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" podUID="5ce643ef-ffb2-4bae-8b9f-03a76bcddada" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 16 14:21:46.934891 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:46.934831 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 16 14:21:46.976443 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:46.976404 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 16 14:21:50.413525 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:50.413488 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:21:50.421420 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:50.421391 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:21:56.935415 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:56.935305 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 16 14:21:56.975927 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:21:56.975892 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 16 14:22:02.592463 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:02.592428 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd"] Apr 16 14:22:02.592875 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:02.592750 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" podUID="5ce643ef-ffb2-4bae-8b9f-03a76bcddada" containerName="main" 
containerID="cri-o://8e764cfe6e761b131784422b1eaef736964abc607ca87255f89ee167e1ef3c6d" gracePeriod=30 Apr 16 14:22:06.935226 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:06.935166 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 16 14:22:06.976701 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:06.976661 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 16 14:22:14.495221 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.495179 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 14:22:14.496308 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.496252 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03b5da37-52cc-4b28-aeda-ef529e65711d" containerName="main" Apr 16 14:22:14.496308 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.496308 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b5da37-52cc-4b28-aeda-ef529e65711d" containerName="main" Apr 16 14:22:14.496534 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.496325 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03b5da37-52cc-4b28-aeda-ef529e65711d" containerName="storage-initializer" Apr 16 14:22:14.496534 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.496333 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b5da37-52cc-4b28-aeda-ef529e65711d" 
containerName="storage-initializer" Apr 16 14:22:14.496534 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.496477 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="03b5da37-52cc-4b28-aeda-ef529e65711d" containerName="main" Apr 16 14:22:14.501093 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.501072 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:14.505021 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.504997 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-x4fdp\"" Apr 16 14:22:14.505154 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.505001 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 16 14:22:14.514092 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.512253 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 14:22:14.570152 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.570114 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96c57d03-1532-408f-b76c-1e1d69d61c23-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:14.570324 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.570189 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7d8v\" (UniqueName: \"kubernetes.io/projected/96c57d03-1532-408f-b76c-1e1d69d61c23-kube-api-access-b7d8v\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:14.570324 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.570209 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:14.570324 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.570232 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:14.570435 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.570342 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:14.570435 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.570385 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: 
\"96c57d03-1532-408f-b76c-1e1d69d61c23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:14.671631 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.671588 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:14.671821 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.671646 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96c57d03-1532-408f-b76c-1e1d69d61c23-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:14.671821 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.671694 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7d8v\" (UniqueName: \"kubernetes.io/projected/96c57d03-1532-408f-b76c-1e1d69d61c23-kube-api-access-b7d8v\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:14.671821 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.671718 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:14.671821 ip-10-0-129-3 
kubenswrapper[2580]: I0416 14:22:14.671748 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:14.671821 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.671806 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:14.672116 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.672089 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:14.672188 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.672093 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:14.672251 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.672229 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:14.674226 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.674199 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:14.674361 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.674315 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96c57d03-1532-408f-b76c-1e1d69d61c23-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:14.679633 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.679605 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7d8v\" (UniqueName: \"kubernetes.io/projected/96c57d03-1532-408f-b76c-1e1d69d61c23-kube-api-access-b7d8v\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:14.817036 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.816942 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:14.958632 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.958596 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 14:22:14.962300 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:22:14.962254 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96c57d03_1532_408f_b76c_1e1d69d61c23.slice/crio-1d04ae9ea6cda030b9d6b02c4de3795dde8007af385678ce99101920734a7533 WatchSource:0}: Error finding container 1d04ae9ea6cda030b9d6b02c4de3795dde8007af385678ce99101920734a7533: Status 404 returned error can't find the container with id 1d04ae9ea6cda030b9d6b02c4de3795dde8007af385678ce99101920734a7533 Apr 16 14:22:14.964910 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:14.964892 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:22:15.600565 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:15.600531 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"96c57d03-1532-408f-b76c-1e1d69d61c23","Type":"ContainerStarted","Data":"02c0bccb0777ea36368b74a85e536e9ed1d00a2f342e5f1af8d83a9ca4aa0255"} Apr 16 14:22:15.600565 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:15.600570 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"96c57d03-1532-408f-b76c-1e1d69d61c23","Type":"ContainerStarted","Data":"1d04ae9ea6cda030b9d6b02c4de3795dde8007af385678ce99101920734a7533"} Apr 16 14:22:16.934946 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:16.934891 2580 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 16 14:22:16.975891 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:16.975844 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 16 14:22:19.623564 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:19.623525 2580 generic.go:358] "Generic (PLEG): container finished" podID="96c57d03-1532-408f-b76c-1e1d69d61c23" containerID="02c0bccb0777ea36368b74a85e536e9ed1d00a2f342e5f1af8d83a9ca4aa0255" exitCode=0 Apr 16 14:22:19.623990 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:19.623611 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"96c57d03-1532-408f-b76c-1e1d69d61c23","Type":"ContainerDied","Data":"02c0bccb0777ea36368b74a85e536e9ed1d00a2f342e5f1af8d83a9ca4aa0255"} Apr 16 14:22:20.630995 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:20.630959 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"96c57d03-1532-408f-b76c-1e1d69d61c23","Type":"ContainerStarted","Data":"e101dffb601c11b30af53057465327419472485fae79a10ab81051588395a299"} Apr 16 14:22:20.653472 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:20.653413 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=6.65339444 podStartE2EDuration="6.65339444s" 
podCreationTimestamp="2026-04-16 14:22:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:22:20.649003024 +0000 UTC m=+1364.085168753" watchObservedRunningTime="2026-04-16 14:22:20.65339444 +0000 UTC m=+1364.089560169" Apr 16 14:22:24.817115 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:24.817075 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:24.818912 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:24.818881 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="96c57d03-1532-408f-b76c-1e1d69d61c23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 16 14:22:26.935073 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:26.935002 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 16 14:22:26.976167 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:26.976118 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 16 14:22:33.418752 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.418729 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd_5ce643ef-ffb2-4bae-8b9f-03a76bcddada/main/0.log" Apr 16 14:22:33.419188 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.419171 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:22:33.556054 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.555961 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgswc\" (UniqueName: \"kubernetes.io/projected/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-kube-api-access-fgswc\") pod \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " Apr 16 14:22:33.556054 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.556018 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-dshm\") pod \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " Apr 16 14:22:33.556054 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.556054 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-home\") pod \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " Apr 16 14:22:33.556350 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.556122 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-tls-certs\") pod \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " Apr 16 14:22:33.556350 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.556170 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-model-cache\") pod \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " Apr 16 14:22:33.556350 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.556207 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-kserve-provision-location\") pod \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\" (UID: \"5ce643ef-ffb2-4bae-8b9f-03a76bcddada\") " Apr 16 14:22:33.556513 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.556483 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-model-cache" (OuterVolumeSpecName: "model-cache") pod "5ce643ef-ffb2-4bae-8b9f-03a76bcddada" (UID: "5ce643ef-ffb2-4bae-8b9f-03a76bcddada"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:22:33.556928 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.556889 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-home" (OuterVolumeSpecName: "home") pod "5ce643ef-ffb2-4bae-8b9f-03a76bcddada" (UID: "5ce643ef-ffb2-4bae-8b9f-03a76bcddada"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:22:33.556928 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.556914 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-model-cache\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:22:33.558955 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.558916 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-dshm" (OuterVolumeSpecName: "dshm") pod "5ce643ef-ffb2-4bae-8b9f-03a76bcddada" (UID: "5ce643ef-ffb2-4bae-8b9f-03a76bcddada"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:22:33.559089 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.559053 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-kube-api-access-fgswc" (OuterVolumeSpecName: "kube-api-access-fgswc") pod "5ce643ef-ffb2-4bae-8b9f-03a76bcddada" (UID: "5ce643ef-ffb2-4bae-8b9f-03a76bcddada"). InnerVolumeSpecName "kube-api-access-fgswc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:22:33.559211 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.559189 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5ce643ef-ffb2-4bae-8b9f-03a76bcddada" (UID: "5ce643ef-ffb2-4bae-8b9f-03a76bcddada"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:22:33.586597 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.586556 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5ce643ef-ffb2-4bae-8b9f-03a76bcddada" (UID: "5ce643ef-ffb2-4bae-8b9f-03a76bcddada"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:22:33.658164 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.658122 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-kserve-provision-location\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:22:33.658164 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.658165 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fgswc\" (UniqueName: \"kubernetes.io/projected/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-kube-api-access-fgswc\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:22:33.658389 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.658183 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-dshm\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:22:33.658389 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.658199 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-home\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:22:33.658389 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.658211 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5ce643ef-ffb2-4bae-8b9f-03a76bcddada-tls-certs\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:22:33.690284 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.690248 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd_5ce643ef-ffb2-4bae-8b9f-03a76bcddada/main/0.log" Apr 16 14:22:33.690709 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.690679 2580 generic.go:358] "Generic (PLEG): container finished" podID="5ce643ef-ffb2-4bae-8b9f-03a76bcddada" containerID="8e764cfe6e761b131784422b1eaef736964abc607ca87255f89ee167e1ef3c6d" exitCode=137 Apr 16 14:22:33.690855 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.690780 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" event={"ID":"5ce643ef-ffb2-4bae-8b9f-03a76bcddada","Type":"ContainerDied","Data":"8e764cfe6e761b131784422b1eaef736964abc607ca87255f89ee167e1ef3c6d"} Apr 16 14:22:33.690855 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.690800 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" Apr 16 14:22:33.690855 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.690832 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd" event={"ID":"5ce643ef-ffb2-4bae-8b9f-03a76bcddada","Type":"ContainerDied","Data":"fcf5819b3fcc3d82ae0bf9a14ece382ddbcc1e6f78d1b7365f4da55e323f4e6e"} Apr 16 14:22:33.690855 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.690854 2580 scope.go:117] "RemoveContainer" containerID="8e764cfe6e761b131784422b1eaef736964abc607ca87255f89ee167e1ef3c6d" Apr 16 14:22:33.715877 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.715838 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd"] Apr 16 14:22:33.719284 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.719241 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-8544f756bbgbkwd"] Apr 16 14:22:33.721345 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.721326 2580 scope.go:117] "RemoveContainer" containerID="10c631fc2cacb8d5bb6f4da0a1492e0056dd4c908a56dbcdff16c3be9c7f7b8b" Apr 16 14:22:33.757945 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.757925 2580 scope.go:117] "RemoveContainer" containerID="8e764cfe6e761b131784422b1eaef736964abc607ca87255f89ee167e1ef3c6d" Apr 16 14:22:33.758308 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:22:33.758258 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e764cfe6e761b131784422b1eaef736964abc607ca87255f89ee167e1ef3c6d\": container with ID starting with 8e764cfe6e761b131784422b1eaef736964abc607ca87255f89ee167e1ef3c6d not found: ID does not exist" 
containerID="8e764cfe6e761b131784422b1eaef736964abc607ca87255f89ee167e1ef3c6d" Apr 16 14:22:33.758413 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.758319 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e764cfe6e761b131784422b1eaef736964abc607ca87255f89ee167e1ef3c6d"} err="failed to get container status \"8e764cfe6e761b131784422b1eaef736964abc607ca87255f89ee167e1ef3c6d\": rpc error: code = NotFound desc = could not find container \"8e764cfe6e761b131784422b1eaef736964abc607ca87255f89ee167e1ef3c6d\": container with ID starting with 8e764cfe6e761b131784422b1eaef736964abc607ca87255f89ee167e1ef3c6d not found: ID does not exist" Apr 16 14:22:33.758413 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.758347 2580 scope.go:117] "RemoveContainer" containerID="10c631fc2cacb8d5bb6f4da0a1492e0056dd4c908a56dbcdff16c3be9c7f7b8b" Apr 16 14:22:33.758667 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:22:33.758644 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10c631fc2cacb8d5bb6f4da0a1492e0056dd4c908a56dbcdff16c3be9c7f7b8b\": container with ID starting with 10c631fc2cacb8d5bb6f4da0a1492e0056dd4c908a56dbcdff16c3be9c7f7b8b not found: ID does not exist" containerID="10c631fc2cacb8d5bb6f4da0a1492e0056dd4c908a56dbcdff16c3be9c7f7b8b" Apr 16 14:22:33.758764 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:33.758671 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10c631fc2cacb8d5bb6f4da0a1492e0056dd4c908a56dbcdff16c3be9c7f7b8b"} err="failed to get container status \"10c631fc2cacb8d5bb6f4da0a1492e0056dd4c908a56dbcdff16c3be9c7f7b8b\": rpc error: code = NotFound desc = could not find container \"10c631fc2cacb8d5bb6f4da0a1492e0056dd4c908a56dbcdff16c3be9c7f7b8b\": container with ID starting with 10c631fc2cacb8d5bb6f4da0a1492e0056dd4c908a56dbcdff16c3be9c7f7b8b not found: ID does not exist" Apr 16 14:22:34.817971 
ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:34.817924 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="96c57d03-1532-408f-b76c-1e1d69d61c23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 16 14:22:35.153379 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:35.153340 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ce643ef-ffb2-4bae-8b9f-03a76bcddada" path="/var/lib/kubelet/pods/5ce643ef-ffb2-4bae-8b9f-03a76bcddada/volumes" Apr 16 14:22:36.934501 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:36.934454 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 16 14:22:36.976523 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:36.976475 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 16 14:22:44.817981 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:44.817938 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:22:44.818416 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:44.818261 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="96c57d03-1532-408f-b76c-1e1d69d61c23" 
containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 16 14:22:46.934669 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:46.934619 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 16 14:22:46.975794 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:46.975749 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 16 14:22:54.817809 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:54.817761 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="96c57d03-1532-408f-b76c-1e1d69d61c23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 16 14:22:56.935344 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:56.935291 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 16 14:22:56.976199 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:22:56.976148 2580 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 16 14:23:04.817608 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:23:04.817563 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="96c57d03-1532-408f-b76c-1e1d69d61c23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 16 14:23:06.934866 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:23:06.934818 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 16 14:23:06.976759 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:23:06.976703 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 16 14:23:14.818114 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:23:14.818069 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="96c57d03-1532-408f-b76c-1e1d69d61c23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 16 14:23:16.935014 ip-10-0-129-3 kubenswrapper[2580]: I0416 
14:23:16.934957 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 16 14:23:16.976048 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:23:16.976010 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused" Apr 16 14:23:24.818477 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:23:24.818378 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="96c57d03-1532-408f-b76c-1e1d69d61c23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 16 14:23:26.934358 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:23:26.934309 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8001/health\": dial tcp 10.134.0.54:8001: connect: connection refused" Apr 16 14:23:26.976501 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:23:26.976443 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: 
connection refused" Apr 16 14:23:34.818184 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:23:34.818136 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="96c57d03-1532-408f-b76c-1e1d69d61c23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 16 14:23:36.944639 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:23:36.944607 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:23:36.961257 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:23:36.961232 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" Apr 16 14:23:36.986054 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:23:36.986019 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:23:36.995020 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:23:36.994977 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" Apr 16 14:23:44.817831 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:23:44.817780 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="96c57d03-1532-408f-b76c-1e1d69d61c23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 16 14:23:54.817823 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:23:54.817757 2580 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="96c57d03-1532-408f-b76c-1e1d69d61c23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused" Apr 16 14:23:56.828607 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:23:56.828558 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz"] Apr 16 14:23:56.829127 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:23:56.828968 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" containerID="cri-o://82e408eb5828bfa87f8db849f48e9ae5602fb742df754635a99a14efc69c4e17" gracePeriod=30 Apr 16 14:23:56.837730 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:23:56.837697 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6"] Apr 16 14:23:56.838212 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:23:56.838156 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" containerID="cri-o://4cd3fb7222d6f0fe05d1e24254c73bd421dc5d31adf484b0c0d7c8757e554fa3" gracePeriod=30 Apr 16 14:24:01.440356 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.440313 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n"] Apr 16 14:24:01.440803 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.440738 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ce643ef-ffb2-4bae-8b9f-03a76bcddada" containerName="main" Apr 16 
14:24:01.440803 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.440751 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce643ef-ffb2-4bae-8b9f-03a76bcddada" containerName="main" Apr 16 14:24:01.440803 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.440782 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ce643ef-ffb2-4bae-8b9f-03a76bcddada" containerName="storage-initializer" Apr 16 14:24:01.440803 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.440788 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce643ef-ffb2-4bae-8b9f-03a76bcddada" containerName="storage-initializer" Apr 16 14:24:01.440935 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.440856 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ce643ef-ffb2-4bae-8b9f-03a76bcddada" containerName="main" Apr 16 14:24:01.445434 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.445415 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" Apr 16 14:24:01.447826 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.447800 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 14:24:01.458042 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.458018 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n"] Apr 16 14:24:01.573554 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.573519 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" Apr 16 14:24:01.573755 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.573591 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" Apr 16 14:24:01.573755 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.573646 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" Apr 16 14:24:01.573755 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.573671 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2dee4b83-5bd8-4413-bb33-c80cd4852d01-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" Apr 16 14:24:01.573755 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.573695 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" Apr 16 14:24:01.573755 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.573744 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dz5k\" (UniqueName: \"kubernetes.io/projected/2dee4b83-5bd8-4413-bb33-c80cd4852d01-kube-api-access-8dz5k\") pod \"custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" Apr 16 14:24:01.674978 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.674935 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" Apr 16 14:24:01.675164 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.674997 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" Apr 16 14:24:01.675164 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.675052 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" Apr 16 14:24:01.675164 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.675079 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2dee4b83-5bd8-4413-bb33-c80cd4852d01-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" Apr 16 14:24:01.675415 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.675198 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" Apr 16 14:24:01.675415 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.675238 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dz5k\" (UniqueName: \"kubernetes.io/projected/2dee4b83-5bd8-4413-bb33-c80cd4852d01-kube-api-access-8dz5k\") pod \"custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" Apr 16 14:24:01.675524 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.675427 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" 
Apr 16 14:24:01.675524 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.675472 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n"
Apr 16 14:24:01.675524 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.675509 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n"
Apr 16 14:24:01.677577 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.677549 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n"
Apr 16 14:24:01.677690 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.677652 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2dee4b83-5bd8-4413-bb33-c80cd4852d01-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n"
Apr 16 14:24:01.689943 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.689909 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dz5k\" (UniqueName: \"kubernetes.io/projected/2dee4b83-5bd8-4413-bb33-c80cd4852d01-kube-api-access-8dz5k\") pod \"custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n"
Apr 16 14:24:01.757840 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.757749 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n"
Apr 16 14:24:01.893442 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:01.893400 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n"]
Apr 16 14:24:01.895368 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:24:01.895330 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dee4b83_5bd8_4413_bb33_c80cd4852d01.slice/crio-94d7d6b787f91923aaa36d170cbc69d711ec8ccde393b9b2ca988e0ca27bcb65 WatchSource:0}: Error finding container 94d7d6b787f91923aaa36d170cbc69d711ec8ccde393b9b2ca988e0ca27bcb65: Status 404 returned error can't find the container with id 94d7d6b787f91923aaa36d170cbc69d711ec8ccde393b9b2ca988e0ca27bcb65
Apr 16 14:24:02.068074 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:02.067965 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" event={"ID":"2dee4b83-5bd8-4413-bb33-c80cd4852d01","Type":"ContainerStarted","Data":"7313d811c58921110a717ad3480e9fc50212c0ec4799e17c64659f3edb68a14d"}
Apr 16 14:24:02.068074 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:02.068019 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" event={"ID":"2dee4b83-5bd8-4413-bb33-c80cd4852d01","Type":"ContainerStarted","Data":"94d7d6b787f91923aaa36d170cbc69d711ec8ccde393b9b2ca988e0ca27bcb65"}
Apr 16 14:24:04.817436 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:04.817392 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="96c57d03-1532-408f-b76c-1e1d69d61c23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused"
Apr 16 14:24:07.091189 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:07.091144 2580 generic.go:358] "Generic (PLEG): container finished" podID="2dee4b83-5bd8-4413-bb33-c80cd4852d01" containerID="7313d811c58921110a717ad3480e9fc50212c0ec4799e17c64659f3edb68a14d" exitCode=0
Apr 16 14:24:07.091715 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:07.091225 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" event={"ID":"2dee4b83-5bd8-4413-bb33-c80cd4852d01","Type":"ContainerDied","Data":"7313d811c58921110a717ad3480e9fc50212c0ec4799e17c64659f3edb68a14d"}
Apr 16 14:24:08.097342 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:08.097285 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" event={"ID":"2dee4b83-5bd8-4413-bb33-c80cd4852d01","Type":"ContainerStarted","Data":"8f43751db07593481018d3c558e02c48c1b599d8cf5ebe67715b2b156e4b27fd"}
Apr 16 14:24:08.122335 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:08.122241 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" podStartSLOduration=7.122222772 podStartE2EDuration="7.122222772s" podCreationTimestamp="2026-04-16 14:24:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:24:08.117741604 +0000 UTC m=+1471.553907331" watchObservedRunningTime="2026-04-16 14:24:08.122222772 +0000 UTC m=+1471.558388501"
Apr 16 14:24:11.758610 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:11.758561 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n"
Apr 16 14:24:11.758610 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:11.758614 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n"
Apr 16 14:24:11.760649 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:11.760611 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" podUID="2dee4b83-5bd8-4413-bb33-c80cd4852d01" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused"
Apr 16 14:24:14.817985 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:14.817935 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="96c57d03-1532-408f-b76c-1e1d69d61c23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.56:8000/health\": dial tcp 10.134.0.56:8000: connect: connection refused"
Apr 16 14:24:21.758481 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:21.758438 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" podUID="2dee4b83-5bd8-4413-bb33-c80cd4852d01" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused"
Apr 16 14:24:24.827126 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:24.827095 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 14:24:24.834961 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:24.834921 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 14:24:26.838454 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:26.838387 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="llm-d-routing-sidecar" containerID="cri-o://4656ad66ec5d1bfe5d3ed464c6454715ab09483e8ea6a855673ec548fab9c349" gracePeriod=2
Apr 16 14:24:26.935068 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:26.935028 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused"
Apr 16 14:24:26.946210 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:26.946172 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused"
Apr 16 14:24:26.987127 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:26.987069 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" probeResult="failure" output="Get \"https://10.134.0.55:8000/health\": dial tcp 10.134.0.55:8000: connect: connection refused"
Apr 16 14:24:27.128557 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.128533 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz"
Apr 16 14:24:27.181929 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.181900 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6_8b077235-88b5-477f-b8ec-a4237199000b/main/0.log"
Apr 16 14:24:27.182572 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.182543 2580 generic.go:358] "Generic (PLEG): container finished" podID="8b077235-88b5-477f-b8ec-a4237199000b" containerID="4cd3fb7222d6f0fe05d1e24254c73bd421dc5d31adf484b0c0d7c8757e554fa3" exitCode=137
Apr 16 14:24:27.182572 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.182569 2580 generic.go:358] "Generic (PLEG): container finished" podID="8b077235-88b5-477f-b8ec-a4237199000b" containerID="4656ad66ec5d1bfe5d3ed464c6454715ab09483e8ea6a855673ec548fab9c349" exitCode=0
Apr 16 14:24:27.182771 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.182577 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" event={"ID":"8b077235-88b5-477f-b8ec-a4237199000b","Type":"ContainerDied","Data":"4cd3fb7222d6f0fe05d1e24254c73bd421dc5d31adf484b0c0d7c8757e554fa3"}
Apr 16 14:24:27.182771 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.182628 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" event={"ID":"8b077235-88b5-477f-b8ec-a4237199000b","Type":"ContainerDied","Data":"4656ad66ec5d1bfe5d3ed464c6454715ab09483e8ea6a855673ec548fab9c349"}
Apr 16 14:24:27.184364 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.184342 2580 generic.go:358] "Generic (PLEG): container finished" podID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerID="82e408eb5828bfa87f8db849f48e9ae5602fb742df754635a99a14efc69c4e17" exitCode=137
Apr 16 14:24:27.184493 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.184455 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz"
Apr 16 14:24:27.184493 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.184470 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" event={"ID":"bc1a5e65-b225-4602-8640-f730e5adfa23","Type":"ContainerDied","Data":"82e408eb5828bfa87f8db849f48e9ae5602fb742df754635a99a14efc69c4e17"}
Apr 16 14:24:27.184590 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.184504 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz" event={"ID":"bc1a5e65-b225-4602-8640-f730e5adfa23","Type":"ContainerDied","Data":"6ae543c4a1c4ea910bc4d2bc79a7080fdd28852b6592b755c50fd24f89271841"}
Apr 16 14:24:27.184590 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.184522 2580 scope.go:117] "RemoveContainer" containerID="82e408eb5828bfa87f8db849f48e9ae5602fb742df754635a99a14efc69c4e17"
Apr 16 14:24:27.210175 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.210143 2580 scope.go:117] "RemoveContainer" containerID="dd2862d48adf40136d459b440b3e051cc4ddf3f04f49ba685b91d679380cbc8f"
Apr 16 14:24:27.221350 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.221319 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-home\") pod \"bc1a5e65-b225-4602-8640-f730e5adfa23\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") "
Apr 16 14:24:27.221512 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.221369 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-model-cache\") pod \"bc1a5e65-b225-4602-8640-f730e5adfa23\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") "
Apr 16 14:24:27.221512 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.221398 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bc1a5e65-b225-4602-8640-f730e5adfa23-tls-certs\") pod \"bc1a5e65-b225-4602-8640-f730e5adfa23\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") "
Apr 16 14:24:27.221512 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.221422 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-dshm\") pod \"bc1a5e65-b225-4602-8640-f730e5adfa23\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") "
Apr 16 14:24:27.221512 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.221481 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q686z\" (UniqueName: \"kubernetes.io/projected/bc1a5e65-b225-4602-8640-f730e5adfa23-kube-api-access-q686z\") pod \"bc1a5e65-b225-4602-8640-f730e5adfa23\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") "
Apr 16 14:24:27.221740 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.221520 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-kserve-provision-location\") pod \"bc1a5e65-b225-4602-8640-f730e5adfa23\" (UID: \"bc1a5e65-b225-4602-8640-f730e5adfa23\") "
Apr 16 14:24:27.221740 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.221687 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-model-cache" (OuterVolumeSpecName: "model-cache") pod "bc1a5e65-b225-4602-8640-f730e5adfa23" (UID: "bc1a5e65-b225-4602-8640-f730e5adfa23"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:24:27.221740 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.221704 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-home" (OuterVolumeSpecName: "home") pod "bc1a5e65-b225-4602-8640-f730e5adfa23" (UID: "bc1a5e65-b225-4602-8640-f730e5adfa23"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:24:27.221894 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.221795 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-home\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:24:27.221894 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.221814 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-model-cache\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:24:27.225201 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.225069 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-dshm" (OuterVolumeSpecName: "dshm") pod "bc1a5e65-b225-4602-8640-f730e5adfa23" (UID: "bc1a5e65-b225-4602-8640-f730e5adfa23"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:24:27.225201 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.225096 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc1a5e65-b225-4602-8640-f730e5adfa23-kube-api-access-q686z" (OuterVolumeSpecName: "kube-api-access-q686z") pod "bc1a5e65-b225-4602-8640-f730e5adfa23" (UID: "bc1a5e65-b225-4602-8640-f730e5adfa23"). InnerVolumeSpecName "kube-api-access-q686z". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:24:27.225201 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.225175 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc1a5e65-b225-4602-8640-f730e5adfa23-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "bc1a5e65-b225-4602-8640-f730e5adfa23" (UID: "bc1a5e65-b225-4602-8640-f730e5adfa23"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:24:27.267342 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.267302 2580 scope.go:117] "RemoveContainer" containerID="82e408eb5828bfa87f8db849f48e9ae5602fb742df754635a99a14efc69c4e17"
Apr 16 14:24:27.267727 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:24:27.267682 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82e408eb5828bfa87f8db849f48e9ae5602fb742df754635a99a14efc69c4e17\": container with ID starting with 82e408eb5828bfa87f8db849f48e9ae5602fb742df754635a99a14efc69c4e17 not found: ID does not exist" containerID="82e408eb5828bfa87f8db849f48e9ae5602fb742df754635a99a14efc69c4e17"
Apr 16 14:24:27.267844 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.267735 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82e408eb5828bfa87f8db849f48e9ae5602fb742df754635a99a14efc69c4e17"} err="failed to get container status \"82e408eb5828bfa87f8db849f48e9ae5602fb742df754635a99a14efc69c4e17\": rpc error: code = NotFound desc = could not find container \"82e408eb5828bfa87f8db849f48e9ae5602fb742df754635a99a14efc69c4e17\": container with ID starting with 82e408eb5828bfa87f8db849f48e9ae5602fb742df754635a99a14efc69c4e17 not found: ID does not exist"
Apr 16 14:24:27.267844 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.267761 2580 scope.go:117] "RemoveContainer" containerID="dd2862d48adf40136d459b440b3e051cc4ddf3f04f49ba685b91d679380cbc8f"
Apr 16 14:24:27.268072 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:24:27.268050 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd2862d48adf40136d459b440b3e051cc4ddf3f04f49ba685b91d679380cbc8f\": container with ID starting with dd2862d48adf40136d459b440b3e051cc4ddf3f04f49ba685b91d679380cbc8f not found: ID does not exist" containerID="dd2862d48adf40136d459b440b3e051cc4ddf3f04f49ba685b91d679380cbc8f"
Apr 16 14:24:27.268146 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.268086 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd2862d48adf40136d459b440b3e051cc4ddf3f04f49ba685b91d679380cbc8f"} err="failed to get container status \"dd2862d48adf40136d459b440b3e051cc4ddf3f04f49ba685b91d679380cbc8f\": rpc error: code = NotFound desc = could not find container \"dd2862d48adf40136d459b440b3e051cc4ddf3f04f49ba685b91d679380cbc8f\": container with ID starting with dd2862d48adf40136d459b440b3e051cc4ddf3f04f49ba685b91d679380cbc8f not found: ID does not exist"
Apr 16 14:24:27.270677 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.270646 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bc1a5e65-b225-4602-8640-f730e5adfa23" (UID: "bc1a5e65-b225-4602-8640-f730e5adfa23"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:24:27.275459 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.275440 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6_8b077235-88b5-477f-b8ec-a4237199000b/main/0.log"
Apr 16 14:24:27.276120 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.276104 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6"
Apr 16 14:24:27.323391 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.323356 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bc1a5e65-b225-4602-8640-f730e5adfa23-tls-certs\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:24:27.323391 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.323390 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-dshm\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:24:27.323391 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.323403 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q686z\" (UniqueName: \"kubernetes.io/projected/bc1a5e65-b225-4602-8640-f730e5adfa23-kube-api-access-q686z\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:24:27.323707 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.323417 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc1a5e65-b225-4602-8640-f730e5adfa23-kserve-provision-location\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:24:27.423890 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.423852 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-kserve-provision-location\") pod \"8b077235-88b5-477f-b8ec-a4237199000b\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") "
Apr 16 14:24:27.424088 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.423926 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-dshm\") pod \"8b077235-88b5-477f-b8ec-a4237199000b\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") "
Apr 16 14:24:27.424088 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.423983 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjkmj\" (UniqueName: \"kubernetes.io/projected/8b077235-88b5-477f-b8ec-a4237199000b-kube-api-access-mjkmj\") pod \"8b077235-88b5-477f-b8ec-a4237199000b\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") "
Apr 16 14:24:27.424088 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.424021 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-model-cache\") pod \"8b077235-88b5-477f-b8ec-a4237199000b\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") "
Apr 16 14:24:27.424292 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.424086 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-home\") pod \"8b077235-88b5-477f-b8ec-a4237199000b\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") "
Apr 16 14:24:27.424292 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.424129 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8b077235-88b5-477f-b8ec-a4237199000b-tls-certs\") pod \"8b077235-88b5-477f-b8ec-a4237199000b\" (UID: \"8b077235-88b5-477f-b8ec-a4237199000b\") "
Apr 16 14:24:27.424845 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.424630 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-home" (OuterVolumeSpecName: "home") pod "8b077235-88b5-477f-b8ec-a4237199000b" (UID: "8b077235-88b5-477f-b8ec-a4237199000b"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:24:27.425079 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.425052 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-model-cache" (OuterVolumeSpecName: "model-cache") pod "8b077235-88b5-477f-b8ec-a4237199000b" (UID: "8b077235-88b5-477f-b8ec-a4237199000b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:24:27.426380 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.426355 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-dshm" (OuterVolumeSpecName: "dshm") pod "8b077235-88b5-477f-b8ec-a4237199000b" (UID: "8b077235-88b5-477f-b8ec-a4237199000b"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:24:27.426677 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.426661 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b077235-88b5-477f-b8ec-a4237199000b-kube-api-access-mjkmj" (OuterVolumeSpecName: "kube-api-access-mjkmj") pod "8b077235-88b5-477f-b8ec-a4237199000b" (UID: "8b077235-88b5-477f-b8ec-a4237199000b"). InnerVolumeSpecName "kube-api-access-mjkmj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:24:27.426840 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.426827 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b077235-88b5-477f-b8ec-a4237199000b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8b077235-88b5-477f-b8ec-a4237199000b" (UID: "8b077235-88b5-477f-b8ec-a4237199000b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:24:27.435384 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.435355 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8b077235-88b5-477f-b8ec-a4237199000b" (UID: "8b077235-88b5-477f-b8ec-a4237199000b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:24:27.507890 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.507860 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz"]
Apr 16 14:24:27.514331 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.514304 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-86mzbqz"]
Apr 16 14:24:27.525286 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.525252 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mjkmj\" (UniqueName: \"kubernetes.io/projected/8b077235-88b5-477f-b8ec-a4237199000b-kube-api-access-mjkmj\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:24:27.525412 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.525291 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-model-cache\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:24:27.525412 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.525306 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-home\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:24:27.525412 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.525318 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8b077235-88b5-477f-b8ec-a4237199000b-tls-certs\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:24:27.525412 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.525330 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-kserve-provision-location\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:24:27.525412 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:27.525344 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8b077235-88b5-477f-b8ec-a4237199000b-dshm\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:24:28.190535 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:28.190448 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6_8b077235-88b5-477f-b8ec-a4237199000b/main/0.log"
Apr 16 14:24:28.191203 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:28.191179 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" event={"ID":"8b077235-88b5-477f-b8ec-a4237199000b","Type":"ContainerDied","Data":"cbc5f6f922071c91397c8d971d7960be46a1854795d8e5b05ac6772192590401"}
Apr 16 14:24:28.191318 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:28.191225 2580 scope.go:117] "RemoveContainer" containerID="4cd3fb7222d6f0fe05d1e24254c73bd421dc5d31adf484b0c0d7c8757e554fa3"
Apr 16 14:24:28.191318 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:28.191225 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6"
Apr 16 14:24:28.219716 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:28.219683 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6"]
Apr 16 14:24:28.220934 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:28.220912 2580 scope.go:117] "RemoveContainer" containerID="07df028b4e334787098882b214bfa432788f95d375aff024daf7984ba65b1ed5"
Apr 16 14:24:28.224302 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:28.224261 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6"]
Apr 16 14:24:28.242609 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:28.242586 2580 scope.go:117] "RemoveContainer" containerID="4656ad66ec5d1bfe5d3ed464c6454715ab09483e8ea6a855673ec548fab9c349"
Apr 16 14:24:29.153916 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:29.153883 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b077235-88b5-477f-b8ec-a4237199000b" path="/var/lib/kubelet/pods/8b077235-88b5-477f-b8ec-a4237199000b/volumes"
Apr 16 14:24:29.154464 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:29.154444 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" path="/var/lib/kubelet/pods/bc1a5e65-b225-4602-8640-f730e5adfa23/volumes"
Apr 16 14:24:31.758566 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:31.758519 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" podUID="2dee4b83-5bd8-4413-bb33-c80cd4852d01" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused"
Apr 16 14:24:32.233097 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:32.233045 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-75b6667c779shh6" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": context deadline exceeded"
Apr 16 14:24:37.175794 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:37.175767 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-mqmmd_49030659-7d98-49ee-844f-41ff4d22d449/console-operator/1.log"
Apr 16 14:24:37.179114 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:37.179092 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-mqmmd_49030659-7d98-49ee-844f-41ff4d22d449/console-operator/1.log"
Apr 16 14:24:41.758532 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:41.758486 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" podUID="2dee4b83-5bd8-4413-bb33-c80cd4852d01" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused"
Apr 16 14:24:47.533773 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:47.533721 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 14:24:47.534184 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:47.534033 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="96c57d03-1532-408f-b76c-1e1d69d61c23" containerName="main" containerID="cri-o://e101dffb601c11b30af53057465327419472485fae79a10ab81051588395a299" gracePeriod=30
Apr 16 14:24:48.481921 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:48.481898 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 14:24:48.520096 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:48.520002 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-dshm\") pod \"96c57d03-1532-408f-b76c-1e1d69d61c23\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") "
Apr 16 14:24:48.520096 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:48.520060 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-model-cache\") pod \"96c57d03-1532-408f-b76c-1e1d69d61c23\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") "
Apr 16 14:24:48.520353 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:48.520113 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-kserve-provision-location\") pod \"96c57d03-1532-408f-b76c-1e1d69d61c23\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") "
Apr 16 14:24:48.520353 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:48.520151 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7d8v\" (UniqueName: \"kubernetes.io/projected/96c57d03-1532-408f-b76c-1e1d69d61c23-kube-api-access-b7d8v\") pod \"96c57d03-1532-408f-b76c-1e1d69d61c23\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") "
Apr 16 14:24:48.520353 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:48.520180 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-home\") pod \"96c57d03-1532-408f-b76c-1e1d69d61c23\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") "
Apr 16 14:24:48.520353 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:48.520215 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96c57d03-1532-408f-b76c-1e1d69d61c23-tls-certs\") pod \"96c57d03-1532-408f-b76c-1e1d69d61c23\" (UID: \"96c57d03-1532-408f-b76c-1e1d69d61c23\") "
Apr 16 14:24:48.520353 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:48.520341 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-model-cache" (OuterVolumeSpecName: "model-cache") pod "96c57d03-1532-408f-b76c-1e1d69d61c23" (UID: "96c57d03-1532-408f-b76c-1e1d69d61c23"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:24:48.520627 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:48.520528 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-model-cache\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:24:48.521020 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:48.520977 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-home" (OuterVolumeSpecName: "home") pod "96c57d03-1532-408f-b76c-1e1d69d61c23" (UID: "96c57d03-1532-408f-b76c-1e1d69d61c23"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:24:48.522661 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:48.522607 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-dshm" (OuterVolumeSpecName: "dshm") pod "96c57d03-1532-408f-b76c-1e1d69d61c23" (UID: "96c57d03-1532-408f-b76c-1e1d69d61c23"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:24:48.523214 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:48.523184 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c57d03-1532-408f-b76c-1e1d69d61c23-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "96c57d03-1532-408f-b76c-1e1d69d61c23" (UID: "96c57d03-1532-408f-b76c-1e1d69d61c23"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:24:48.523214 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:48.523199 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96c57d03-1532-408f-b76c-1e1d69d61c23-kube-api-access-b7d8v" (OuterVolumeSpecName: "kube-api-access-b7d8v") pod "96c57d03-1532-408f-b76c-1e1d69d61c23" (UID: "96c57d03-1532-408f-b76c-1e1d69d61c23"). InnerVolumeSpecName "kube-api-access-b7d8v". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:24:48.563743 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:48.563691 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "96c57d03-1532-408f-b76c-1e1d69d61c23" (UID: "96c57d03-1532-408f-b76c-1e1d69d61c23"). InnerVolumeSpecName "kserve-provision-location".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:24:48.621210 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:48.621165 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-dshm\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:24:48.621210 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:48.621195 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-kserve-provision-location\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:24:48.621210 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:48.621206 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b7d8v\" (UniqueName: \"kubernetes.io/projected/96c57d03-1532-408f-b76c-1e1d69d61c23-kube-api-access-b7d8v\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:24:48.621210 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:48.621214 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96c57d03-1532-408f-b76c-1e1d69d61c23-home\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:24:48.621210 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:48.621223 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96c57d03-1532-408f-b76c-1e1d69d61c23-tls-certs\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:24:49.283583 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:49.283541 2580 generic.go:358] "Generic (PLEG): container finished" podID="96c57d03-1532-408f-b76c-1e1d69d61c23" containerID="e101dffb601c11b30af53057465327419472485fae79a10ab81051588395a299" exitCode=0 Apr 16 14:24:49.283775 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:49.283616 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:24:49.283775 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:49.283629 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"96c57d03-1532-408f-b76c-1e1d69d61c23","Type":"ContainerDied","Data":"e101dffb601c11b30af53057465327419472485fae79a10ab81051588395a299"} Apr 16 14:24:49.283775 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:49.283679 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"96c57d03-1532-408f-b76c-1e1d69d61c23","Type":"ContainerDied","Data":"1d04ae9ea6cda030b9d6b02c4de3795dde8007af385678ce99101920734a7533"} Apr 16 14:24:49.283775 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:49.283696 2580 scope.go:117] "RemoveContainer" containerID="e101dffb601c11b30af53057465327419472485fae79a10ab81051588395a299" Apr 16 14:24:49.303571 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:49.303531 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 14:24:49.308095 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:49.308074 2580 scope.go:117] "RemoveContainer" containerID="02c0bccb0777ea36368b74a85e536e9ed1d00a2f342e5f1af8d83a9ca4aa0255" Apr 16 14:24:49.308319 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:49.308296 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 14:24:49.351508 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:49.351479 2580 scope.go:117] "RemoveContainer" containerID="e101dffb601c11b30af53057465327419472485fae79a10ab81051588395a299" Apr 16 14:24:49.351844 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:24:49.351818 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"e101dffb601c11b30af53057465327419472485fae79a10ab81051588395a299\": container with ID starting with e101dffb601c11b30af53057465327419472485fae79a10ab81051588395a299 not found: ID does not exist" containerID="e101dffb601c11b30af53057465327419472485fae79a10ab81051588395a299" Apr 16 14:24:49.351920 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:49.351863 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e101dffb601c11b30af53057465327419472485fae79a10ab81051588395a299"} err="failed to get container status \"e101dffb601c11b30af53057465327419472485fae79a10ab81051588395a299\": rpc error: code = NotFound desc = could not find container \"e101dffb601c11b30af53057465327419472485fae79a10ab81051588395a299\": container with ID starting with e101dffb601c11b30af53057465327419472485fae79a10ab81051588395a299 not found: ID does not exist" Apr 16 14:24:49.351920 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:49.351891 2580 scope.go:117] "RemoveContainer" containerID="02c0bccb0777ea36368b74a85e536e9ed1d00a2f342e5f1af8d83a9ca4aa0255" Apr 16 14:24:49.352219 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:24:49.352199 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02c0bccb0777ea36368b74a85e536e9ed1d00a2f342e5f1af8d83a9ca4aa0255\": container with ID starting with 02c0bccb0777ea36368b74a85e536e9ed1d00a2f342e5f1af8d83a9ca4aa0255 not found: ID does not exist" containerID="02c0bccb0777ea36368b74a85e536e9ed1d00a2f342e5f1af8d83a9ca4aa0255" Apr 16 14:24:49.352296 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:49.352225 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02c0bccb0777ea36368b74a85e536e9ed1d00a2f342e5f1af8d83a9ca4aa0255"} err="failed to get container status \"02c0bccb0777ea36368b74a85e536e9ed1d00a2f342e5f1af8d83a9ca4aa0255\": rpc error: code = NotFound 
desc = could not find container \"02c0bccb0777ea36368b74a85e536e9ed1d00a2f342e5f1af8d83a9ca4aa0255\": container with ID starting with 02c0bccb0777ea36368b74a85e536e9ed1d00a2f342e5f1af8d83a9ca4aa0255 not found: ID does not exist" Apr 16 14:24:51.154534 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:51.154491 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96c57d03-1532-408f-b76c-1e1d69d61c23" path="/var/lib/kubelet/pods/96c57d03-1532-408f-b76c-1e1d69d61c23/volumes" Apr 16 14:24:51.758411 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:51.758365 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" podUID="2dee4b83-5bd8-4413-bb33-c80cd4852d01" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 14:24:52.505247 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.505206 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx"] Apr 16 14:24:52.505746 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.505622 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="storage-initializer" Apr 16 14:24:52.505746 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.505635 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="storage-initializer" Apr 16 14:24:52.505746 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.505649 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="storage-initializer" Apr 16 14:24:52.505746 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.505658 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b077235-88b5-477f-b8ec-a4237199000b" 
containerName="storage-initializer" Apr 16 14:24:52.505746 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.505673 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96c57d03-1532-408f-b76c-1e1d69d61c23" containerName="main" Apr 16 14:24:52.505746 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.505678 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c57d03-1532-408f-b76c-1e1d69d61c23" containerName="main" Apr 16 14:24:52.505746 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.505686 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" Apr 16 14:24:52.505746 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.505691 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" Apr 16 14:24:52.505746 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.505700 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="llm-d-routing-sidecar" Apr 16 14:24:52.505746 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.505705 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="llm-d-routing-sidecar" Apr 16 14:24:52.505746 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.505712 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" Apr 16 14:24:52.505746 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.505717 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" Apr 16 14:24:52.505746 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.505727 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96c57d03-1532-408f-b76c-1e1d69d61c23" 
containerName="storage-initializer" Apr 16 14:24:52.505746 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.505737 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c57d03-1532-408f-b76c-1e1d69d61c23" containerName="storage-initializer" Apr 16 14:24:52.506500 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.505792 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="main" Apr 16 14:24:52.506500 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.505803 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="8b077235-88b5-477f-b8ec-a4237199000b" containerName="llm-d-routing-sidecar" Apr 16 14:24:52.506500 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.505812 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc1a5e65-b225-4602-8640-f730e5adfa23" containerName="main" Apr 16 14:24:52.506500 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.505821 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="96c57d03-1532-408f-b76c-1e1d69d61c23" containerName="main" Apr 16 14:24:52.511143 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.511119 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:24:52.513906 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.513881 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 16 14:24:52.520475 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.520449 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx"] Apr 16 14:24:52.549309 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.549252 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:24:52.549309 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.549303 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlmqb\" (UniqueName: \"kubernetes.io/projected/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-kube-api-access-jlmqb\") pod \"scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:24:52.549567 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.549368 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-model-cache\") pod \"scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:24:52.549567 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.549398 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-tls-certs\") pod \"scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:24:52.549567 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.549419 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-dshm\") pod \"scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:24:52.549567 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.549441 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-home\") pod \"scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:24:52.650109 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.650062 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-tls-certs\") pod \"scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:24:52.650428 ip-10-0-129-3 kubenswrapper[2580]: I0416 
14:24:52.650144 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-dshm\") pod \"scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:24:52.650428 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.650182 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-home\") pod \"scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:24:52.650428 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.650223 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:24:52.650428 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.650358 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jlmqb\" (UniqueName: \"kubernetes.io/projected/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-kube-api-access-jlmqb\") pod \"scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:24:52.650683 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.650483 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-model-cache\") pod \"scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:24:52.650683 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.650670 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-home\") pod \"scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:24:52.650790 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.650706 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:24:52.650843 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.650816 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-model-cache\") pod \"scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:24:52.652618 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.652596 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-dshm\") pod \"scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:24:52.652867 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.652843 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-tls-certs\") pod \"scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:24:52.660223 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.660194 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlmqb\" (UniqueName: \"kubernetes.io/projected/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-kube-api-access-jlmqb\") pod \"scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:24:52.823683 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.823585 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:24:52.960473 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:52.960448 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx"] Apr 16 14:24:52.963144 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:24:52.963118 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec4f773d_b3b2_4efa_8a6d_5541c6b964bd.slice/crio-522a77db85e0bed8592f1967ac41d738a34b2ee86c9ef1211ad02b1e8c541a2d WatchSource:0}: Error finding container 522a77db85e0bed8592f1967ac41d738a34b2ee86c9ef1211ad02b1e8c541a2d: Status 404 returned error can't find the container with id 522a77db85e0bed8592f1967ac41d738a34b2ee86c9ef1211ad02b1e8c541a2d Apr 16 14:24:53.301946 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:53.301906 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" event={"ID":"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd","Type":"ContainerStarted","Data":"032d68799011d9c2a883316f3706b0fefd7955ffcef48114c0c95687eb2692bd"} Apr 16 14:24:53.301946 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:53.301947 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" event={"ID":"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd","Type":"ContainerStarted","Data":"522a77db85e0bed8592f1967ac41d738a34b2ee86c9ef1211ad02b1e8c541a2d"} Apr 16 14:24:58.326020 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:58.325984 2580 generic.go:358] "Generic (PLEG): container finished" podID="ec4f773d-b3b2-4efa-8a6d-5541c6b964bd" containerID="032d68799011d9c2a883316f3706b0fefd7955ffcef48114c0c95687eb2692bd" exitCode=0 Apr 16 14:24:58.326020 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:58.326022 2580 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" event={"ID":"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd","Type":"ContainerDied","Data":"032d68799011d9c2a883316f3706b0fefd7955ffcef48114c0c95687eb2692bd"} Apr 16 14:24:59.331842 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:59.331807 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" event={"ID":"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd","Type":"ContainerStarted","Data":"f0d426c543466fb788f99608b5ad41cdc6dc5117595fefb3cb71c0b506b724e6"} Apr 16 14:24:59.352577 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:24:59.352526 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" podStartSLOduration=7.352510472 podStartE2EDuration="7.352510472s" podCreationTimestamp="2026-04-16 14:24:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:24:59.349775935 +0000 UTC m=+1522.785941663" watchObservedRunningTime="2026-04-16 14:24:59.352510472 +0000 UTC m=+1522.788676251" Apr 16 14:25:01.759284 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:01.759213 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" podUID="2dee4b83-5bd8-4413-bb33-c80cd4852d01" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 14:25:02.824377 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:02.824339 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:25:02.824377 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:02.824389 2580 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:25:02.837348 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:02.837316 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:25:03.358077 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:03.358044 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:25:11.758444 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:11.758403 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" podUID="2dee4b83-5bd8-4413-bb33-c80cd4852d01" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 14:25:21.759119 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:21.759076 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" podUID="2dee4b83-5bd8-4413-bb33-c80cd4852d01" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 14:25:26.295902 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.294651 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx"] Apr 16 14:25:26.295902 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.295437 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" podUID="ec4f773d-b3b2-4efa-8a6d-5541c6b964bd" containerName="main" 
containerID="cri-o://f0d426c543466fb788f99608b5ad41cdc6dc5117595fefb3cb71c0b506b724e6" gracePeriod=30 Apr 16 14:25:26.438772 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.438735 2580 generic.go:358] "Generic (PLEG): container finished" podID="ec4f773d-b3b2-4efa-8a6d-5541c6b964bd" containerID="f0d426c543466fb788f99608b5ad41cdc6dc5117595fefb3cb71c0b506b724e6" exitCode=0 Apr 16 14:25:26.438772 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.438788 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" event={"ID":"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd","Type":"ContainerDied","Data":"f0d426c543466fb788f99608b5ad41cdc6dc5117595fefb3cb71c0b506b724e6"} Apr 16 14:25:26.558100 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.558023 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:25:26.650051 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.650003 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-dshm\") pod \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " Apr 16 14:25:26.650291 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.650090 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-kserve-provision-location\") pod \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " Apr 16 14:25:26.650291 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.650144 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlmqb\" (UniqueName: 
\"kubernetes.io/projected/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-kube-api-access-jlmqb\") pod \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " Apr 16 14:25:26.650291 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.650207 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-home\") pod \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " Apr 16 14:25:26.650291 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.650237 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-model-cache\") pod \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " Apr 16 14:25:26.650537 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.650444 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-tls-certs\") pod \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\" (UID: \"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd\") " Apr 16 14:25:26.650537 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.650511 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-home" (OuterVolumeSpecName: "home") pod "ec4f773d-b3b2-4efa-8a6d-5541c6b964bd" (UID: "ec4f773d-b3b2-4efa-8a6d-5541c6b964bd"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:25:26.650649 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.650530 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-model-cache" (OuterVolumeSpecName: "model-cache") pod "ec4f773d-b3b2-4efa-8a6d-5541c6b964bd" (UID: "ec4f773d-b3b2-4efa-8a6d-5541c6b964bd"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:25:26.650771 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.650751 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-home\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:25:26.650834 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.650780 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-model-cache\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:25:26.652698 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.652661 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ec4f773d-b3b2-4efa-8a6d-5541c6b964bd" (UID: "ec4f773d-b3b2-4efa-8a6d-5541c6b964bd"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:25:26.652911 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.652893 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-dshm" (OuterVolumeSpecName: "dshm") pod "ec4f773d-b3b2-4efa-8a6d-5541c6b964bd" (UID: "ec4f773d-b3b2-4efa-8a6d-5541c6b964bd"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:25:26.652981 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.652948 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-kube-api-access-jlmqb" (OuterVolumeSpecName: "kube-api-access-jlmqb") pod "ec4f773d-b3b2-4efa-8a6d-5541c6b964bd" (UID: "ec4f773d-b3b2-4efa-8a6d-5541c6b964bd"). InnerVolumeSpecName "kube-api-access-jlmqb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:25:26.726173 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.726112 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ec4f773d-b3b2-4efa-8a6d-5541c6b964bd" (UID: "ec4f773d-b3b2-4efa-8a6d-5541c6b964bd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:25:26.751725 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.751688 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-dshm\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:25:26.751725 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.751719 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-kserve-provision-location\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:25:26.751725 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.751730 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jlmqb\" (UniqueName: \"kubernetes.io/projected/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-kube-api-access-jlmqb\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:25:26.751963 
ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:26.751740 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd-tls-certs\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:25:27.443838 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:27.443751 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" event={"ID":"ec4f773d-b3b2-4efa-8a6d-5541c6b964bd","Type":"ContainerDied","Data":"522a77db85e0bed8592f1967ac41d738a34b2ee86c9ef1211ad02b1e8c541a2d"} Apr 16 14:25:27.443838 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:27.443772 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx" Apr 16 14:25:27.443838 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:27.443796 2580 scope.go:117] "RemoveContainer" containerID="f0d426c543466fb788f99608b5ad41cdc6dc5117595fefb3cb71c0b506b724e6" Apr 16 14:25:27.453125 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:27.453095 2580 scope.go:117] "RemoveContainer" containerID="032d68799011d9c2a883316f3706b0fefd7955ffcef48114c0c95687eb2692bd" Apr 16 14:25:27.462141 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:27.462113 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx"] Apr 16 14:25:27.464247 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:27.464226 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f8b7c69b9-8m8jx"] Apr 16 14:25:29.152612 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:29.152573 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec4f773d-b3b2-4efa-8a6d-5541c6b964bd" path="/var/lib/kubelet/pods/ec4f773d-b3b2-4efa-8a6d-5541c6b964bd/volumes" Apr 16 14:25:31.758944 ip-10-0-129-3 
kubenswrapper[2580]: I0416 14:25:31.758898 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" podUID="2dee4b83-5bd8-4413-bb33-c80cd4852d01" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 14:25:41.759012 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:41.758966 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" podUID="2dee4b83-5bd8-4413-bb33-c80cd4852d01" containerName="main" probeResult="failure" output="Get \"https://10.134.0.57:8000/health\": dial tcp 10.134.0.57:8000: connect: connection refused" Apr 16 14:25:51.768720 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:51.768687 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" Apr 16 14:25:51.776871 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:25:51.776846 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" Apr 16 14:26:04.368338 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.368296 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n"] Apr 16 14:26:04.368897 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.368626 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" podUID="2dee4b83-5bd8-4413-bb33-c80cd4852d01" containerName="main" containerID="cri-o://8f43751db07593481018d3c558e02c48c1b599d8cf5ebe67715b2b156e4b27fd" gracePeriod=30 Apr 16 14:26:04.387186 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.387151 2580 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65"] Apr 16 14:26:04.387851 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.387825 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec4f773d-b3b2-4efa-8a6d-5541c6b964bd" containerName="storage-initializer" Apr 16 14:26:04.387851 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.387851 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec4f773d-b3b2-4efa-8a6d-5541c6b964bd" containerName="storage-initializer" Apr 16 14:26:04.388052 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.387871 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec4f773d-b3b2-4efa-8a6d-5541c6b964bd" containerName="main" Apr 16 14:26:04.388052 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.387878 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec4f773d-b3b2-4efa-8a6d-5541c6b964bd" containerName="main" Apr 16 14:26:04.388052 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.387968 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec4f773d-b3b2-4efa-8a6d-5541c6b964bd" containerName="main" Apr 16 14:26:04.391321 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.391296 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.400711 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.400676 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-7hltt\"" Apr 16 14:26:04.401835 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.401808 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 16 14:26:04.410516 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.410486 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65"] Apr 16 14:26:04.498917 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.498877 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ce99772a-304e-4452-bff7-1497e94435cd-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.498917 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.498926 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ce99772a-304e-4452-bff7-1497e94435cd-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.499136 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.498978 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/ce99772a-304e-4452-bff7-1497e94435cd-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.499136 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.498995 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ce99772a-304e-4452-bff7-1497e94435cd-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.499136 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.499016 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xq6g\" (UniqueName: \"kubernetes.io/projected/ce99772a-304e-4452-bff7-1497e94435cd-kube-api-access-2xq6g\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.499136 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.499088 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/ce99772a-304e-4452-bff7-1497e94435cd-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.499136 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.499132 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/ce99772a-304e-4452-bff7-1497e94435cd-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.499336 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.499149 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/ce99772a-304e-4452-bff7-1497e94435cd-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.499336 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.499218 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/ce99772a-304e-4452-bff7-1497e94435cd-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.599979 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.599942 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ce99772a-304e-4452-bff7-1497e94435cd-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.600136 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.599995 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/ce99772a-304e-4452-bff7-1497e94435cd-istio-envoy\") pod 
\"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.600136 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.600120 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ce99772a-304e-4452-bff7-1497e94435cd-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.600309 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.600165 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xq6g\" (UniqueName: \"kubernetes.io/projected/ce99772a-304e-4452-bff7-1497e94435cd-kube-api-access-2xq6g\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.600309 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.600201 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/ce99772a-304e-4452-bff7-1497e94435cd-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.600309 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.600241 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/ce99772a-304e-4452-bff7-1497e94435cd-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.600309 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.600293 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/ce99772a-304e-4452-bff7-1497e94435cd-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.600514 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.600371 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/ce99772a-304e-4452-bff7-1497e94435cd-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.600514 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.600423 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ce99772a-304e-4452-bff7-1497e94435cd-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.600645 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.600618 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/ce99772a-304e-4452-bff7-1497e94435cd-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.600771 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.600749 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/ce99772a-304e-4452-bff7-1497e94435cd-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.600888 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.600867 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ce99772a-304e-4452-bff7-1497e94435cd-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.601026 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.601007 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/ce99772a-304e-4452-bff7-1497e94435cd-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.601067 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.601021 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/ce99772a-304e-4452-bff7-1497e94435cd-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.602333 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.602311 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/ce99772a-304e-4452-bff7-1497e94435cd-istio-envoy\") pod 
\"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.602765 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.602742 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ce99772a-304e-4452-bff7-1497e94435cd-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.608307 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.608286 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xq6g\" (UniqueName: \"kubernetes.io/projected/ce99772a-304e-4452-bff7-1497e94435cd-kube-api-access-2xq6g\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.608399 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.608293 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ce99772a-304e-4452-bff7-1497e94435cd-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-wbg65\" (UID: \"ce99772a-304e-4452-bff7-1497e94435cd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.710308 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.710211 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:04.846189 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:04.846159 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65"] Apr 16 14:26:04.847300 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:26:04.847255 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce99772a_304e_4452_bff7_1497e94435cd.slice/crio-6e0ea6ccc2b52499b3c3b87393de06209d43238147cda896ed9195fb583fa373 WatchSource:0}: Error finding container 6e0ea6ccc2b52499b3c3b87393de06209d43238147cda896ed9195fb583fa373: Status 404 returned error can't find the container with id 6e0ea6ccc2b52499b3c3b87393de06209d43238147cda896ed9195fb583fa373 Apr 16 14:26:05.603571 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:05.603533 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" event={"ID":"ce99772a-304e-4452-bff7-1497e94435cd","Type":"ContainerStarted","Data":"6e0ea6ccc2b52499b3c3b87393de06209d43238147cda896ed9195fb583fa373"} Apr 16 14:26:07.609491 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:07.609443 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 14:26:07.609832 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:07.609523 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 14:26:07.609832 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:07.609558 2580 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 14:26:08.615772 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:08.615738 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" event={"ID":"ce99772a-304e-4452-bff7-1497e94435cd","Type":"ContainerStarted","Data":"e208ba0f15f125b13fa8a881c60f8d75e75fd0187cfd4e60a3f79b99679f2146"} Apr 16 14:26:08.638704 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:08.638658 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" podStartSLOduration=1.8791099679999999 podStartE2EDuration="4.638643743s" podCreationTimestamp="2026-04-16 14:26:04 +0000 UTC" firstStartedPulling="2026-04-16 14:26:04.849637772 +0000 UTC m=+1588.285803478" lastFinishedPulling="2026-04-16 14:26:07.609171543 +0000 UTC m=+1591.045337253" observedRunningTime="2026-04-16 14:26:08.635799723 +0000 UTC m=+1592.071965451" watchObservedRunningTime="2026-04-16 14:26:08.638643743 +0000 UTC m=+1592.074809468" Apr 16 14:26:08.711175 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:08.711126 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:08.712712 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:08.712677 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" podUID="ce99772a-304e-4452-bff7-1497e94435cd" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.59:15021/healthz/ready\": dial tcp 10.134.0.59:15021: connect: connection refused" Apr 16 14:26:09.710695 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:09.710654 2580 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" podUID="ce99772a-304e-4452-bff7-1497e94435cd" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.59:15021/healthz/ready\": dial tcp 10.134.0.59:15021: connect: connection refused" Apr 16 14:26:10.711483 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:10.711439 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" podUID="ce99772a-304e-4452-bff7-1497e94435cd" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.59:15021/healthz/ready\": dial tcp 10.134.0.59:15021: connect: connection refused" Apr 16 14:26:11.714735 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:11.714704 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:11.715232 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:11.715209 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:11.716039 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:11.716023 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-wbg65" Apr 16 14:26:27.421030 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.420942 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"] Apr 16 14:26:27.430694 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.430675 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" Apr 16 14:26:27.433400 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.433372 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 14:26:27.433994 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.433976 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-gz97h\"" Apr 16 14:26:27.442733 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.442702 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"] Apr 16 14:26:27.447125 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.447102 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"] Apr 16 14:26:27.447247 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.447235 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:27.458867 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.458840 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"]
Apr 16 14:26:27.505558 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.505521 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-dshm\") pod \"router-with-refs-pd-test-kserve-559fccd7cb-kwfkq\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:27.505763 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.505580 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-home\") pod \"router-with-refs-pd-test-kserve-559fccd7cb-kwfkq\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:27.505763 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.505699 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9e845b36-1515-4741-8a77-993cd22eb17b-tls-certs\") pod \"router-with-refs-pd-test-kserve-559fccd7cb-kwfkq\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:27.505763 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.505758 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwjtj\" (UniqueName: \"kubernetes.io/projected/9e845b36-1515-4741-8a77-993cd22eb17b-kube-api-access-xwjtj\") pod \"router-with-refs-pd-test-kserve-559fccd7cb-kwfkq\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:27.505937 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.505801 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-559fccd7cb-kwfkq\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:27.505937 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.505836 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-model-cache\") pod \"router-with-refs-pd-test-kserve-559fccd7cb-kwfkq\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:27.607130 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.607075 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:27.607130 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.607141 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-dshm\") pod \"router-with-refs-pd-test-kserve-559fccd7cb-kwfkq\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:27.607455 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.607164 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/035ca082-6c53-4429-ba8f-c54e6cbfbc94-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:27.607455 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.607201 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-home\") pod \"router-with-refs-pd-test-kserve-559fccd7cb-kwfkq\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:27.607455 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.607231 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9e845b36-1515-4741-8a77-993cd22eb17b-tls-certs\") pod \"router-with-refs-pd-test-kserve-559fccd7cb-kwfkq\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:27.607455 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.607258 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwjtj\" (UniqueName: \"kubernetes.io/projected/9e845b36-1515-4741-8a77-993cd22eb17b-kube-api-access-xwjtj\") pod \"router-with-refs-pd-test-kserve-559fccd7cb-kwfkq\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:27.607455 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.607310 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-home\") pod \"router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:27.607455 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.607329 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:27.607455 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.607356 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-559fccd7cb-kwfkq\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:27.607455 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.607417 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:27.607816 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.607495 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-model-cache\") pod \"router-with-refs-pd-test-kserve-559fccd7cb-kwfkq\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:27.607816 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.607562 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lqfs\" (UniqueName: \"kubernetes.io/projected/035ca082-6c53-4429-ba8f-c54e6cbfbc94-kube-api-access-5lqfs\") pod \"router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:27.607816 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.607665 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-home\") pod \"router-with-refs-pd-test-kserve-559fccd7cb-kwfkq\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:27.607816 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.607706 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-559fccd7cb-kwfkq\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:27.607950 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.607819 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-model-cache\") pod \"router-with-refs-pd-test-kserve-559fccd7cb-kwfkq\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:27.609757 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.609737 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-dshm\") pod \"router-with-refs-pd-test-kserve-559fccd7cb-kwfkq\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:27.609922 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.609906 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9e845b36-1515-4741-8a77-993cd22eb17b-tls-certs\") pod \"router-with-refs-pd-test-kserve-559fccd7cb-kwfkq\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:27.616136 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.616111 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwjtj\" (UniqueName: \"kubernetes.io/projected/9e845b36-1515-4741-8a77-993cd22eb17b-kube-api-access-xwjtj\") pod \"router-with-refs-pd-test-kserve-559fccd7cb-kwfkq\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:27.709016 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.708927 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/035ca082-6c53-4429-ba8f-c54e6cbfbc94-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:27.709016 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.708994 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-home\") pod \"router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:27.709016 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.709010 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:27.709343 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.709030 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:27.709343 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.709063 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lqfs\" (UniqueName: \"kubernetes.io/projected/035ca082-6c53-4429-ba8f-c54e6cbfbc94-kube-api-access-5lqfs\") pod \"router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:27.709343 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.709085 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:27.709527 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.709509 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:27.709567 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.709506 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-home\") pod \"router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:27.709610 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.709594 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:27.711494 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.711469 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:27.711747 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.711728 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/035ca082-6c53-4429-ba8f-c54e6cbfbc94-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:27.716802 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.716781 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lqfs\" (UniqueName: \"kubernetes.io/projected/035ca082-6c53-4429-ba8f-c54e6cbfbc94-kube-api-access-5lqfs\") pod \"router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:27.742670 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.742640 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:27.758204 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.758174 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:27.884805 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.884776 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"]
Apr 16 14:26:27.886734 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:26:27.886683 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e845b36_1515_4741_8a77_993cd22eb17b.slice/crio-d89814b19d4e17fab07d59d62dda3b9473c307b1cc9879f511b4b225245498b3 WatchSource:0}: Error finding container d89814b19d4e17fab07d59d62dda3b9473c307b1cc9879f511b4b225245498b3: Status 404 returned error can't find the container with id d89814b19d4e17fab07d59d62dda3b9473c307b1cc9879f511b4b225245498b3
Apr 16 14:26:27.915772 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:27.915746 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"]
Apr 16 14:26:27.918410 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:26:27.918383 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod035ca082_6c53_4429_ba8f_c54e6cbfbc94.slice/crio-0cd52135be85d473059db4028a36dd4c2bd449e962cf5127871e0bf98bbdeba5 WatchSource:0}: Error finding container 0cd52135be85d473059db4028a36dd4c2bd449e962cf5127871e0bf98bbdeba5: Status 404 returned error can't find the container with id 0cd52135be85d473059db4028a36dd4c2bd449e962cf5127871e0bf98bbdeba5
Apr 16 14:26:28.697745 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:28.697705 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" event={"ID":"035ca082-6c53-4429-ba8f-c54e6cbfbc94","Type":"ContainerStarted","Data":"26d9a4067d610c3019af3ca84e302a3621f8e60c3852a2fb58d4a767cfa8b008"}
Apr 16 14:26:28.698182 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:28.697753 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" event={"ID":"035ca082-6c53-4429-ba8f-c54e6cbfbc94","Type":"ContainerStarted","Data":"0cd52135be85d473059db4028a36dd4c2bd449e962cf5127871e0bf98bbdeba5"}
Apr 16 14:26:28.699069 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:28.699044 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" event={"ID":"9e845b36-1515-4741-8a77-993cd22eb17b","Type":"ContainerStarted","Data":"c0ca5e3249cea969ec1829d8d646e3020ce684adcca1be6aa374138041f9a452"}
Apr 16 14:26:28.699159 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:28.699074 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" event={"ID":"9e845b36-1515-4741-8a77-993cd22eb17b","Type":"ContainerStarted","Data":"d89814b19d4e17fab07d59d62dda3b9473c307b1cc9879f511b4b225245498b3"}
Apr 16 14:26:28.699218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:28.699159 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:29.705223 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:29.705154 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" event={"ID":"9e845b36-1515-4741-8a77-993cd22eb17b","Type":"ContainerStarted","Data":"36e9be3ab6226725f4aa608c7c9217437015547a8335db28208350681746da80"}
Apr 16 14:26:32.720828 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:32.720789 2580 generic.go:358] "Generic (PLEG): container finished" podID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerID="26d9a4067d610c3019af3ca84e302a3621f8e60c3852a2fb58d4a767cfa8b008" exitCode=0
Apr 16 14:26:32.721174 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:32.720871 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" event={"ID":"035ca082-6c53-4429-ba8f-c54e6cbfbc94","Type":"ContainerDied","Data":"26d9a4067d610c3019af3ca84e302a3621f8e60c3852a2fb58d4a767cfa8b008"}
Apr 16 14:26:33.726523 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:33.726484 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" event={"ID":"035ca082-6c53-4429-ba8f-c54e6cbfbc94","Type":"ContainerStarted","Data":"8ecd65ab6626f2fc261884b8190ff150b2f4702e5e6151ee57696d7ea883dac9"}
Apr 16 14:26:33.728161 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:33.728137 2580 generic.go:358] "Generic (PLEG): container finished" podID="9e845b36-1515-4741-8a77-993cd22eb17b" containerID="36e9be3ab6226725f4aa608c7c9217437015547a8335db28208350681746da80" exitCode=0
Apr 16 14:26:33.728311 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:33.728188 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" event={"ID":"9e845b36-1515-4741-8a77-993cd22eb17b","Type":"ContainerDied","Data":"36e9be3ab6226725f4aa608c7c9217437015547a8335db28208350681746da80"}
Apr 16 14:26:33.749547 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:33.749496 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" podStartSLOduration=6.749476487 podStartE2EDuration="6.749476487s" podCreationTimestamp="2026-04-16 14:26:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:26:33.747010557 +0000 UTC m=+1617.183176285" watchObservedRunningTime="2026-04-16 14:26:33.749476487 +0000 UTC m=+1617.185642216"
Apr 16 14:26:34.685579 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.685557 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n"
Apr 16 14:26:34.735942 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.735909 2580 generic.go:358] "Generic (PLEG): container finished" podID="2dee4b83-5bd8-4413-bb33-c80cd4852d01" containerID="8f43751db07593481018d3c558e02c48c1b599d8cf5ebe67715b2b156e4b27fd" exitCode=137
Apr 16 14:26:34.736406 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.736016 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" event={"ID":"2dee4b83-5bd8-4413-bb33-c80cd4852d01","Type":"ContainerDied","Data":"8f43751db07593481018d3c558e02c48c1b599d8cf5ebe67715b2b156e4b27fd"}
Apr 16 14:26:34.736406 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.736047 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n" event={"ID":"2dee4b83-5bd8-4413-bb33-c80cd4852d01","Type":"ContainerDied","Data":"94d7d6b787f91923aaa36d170cbc69d711ec8ccde393b9b2ca988e0ca27bcb65"}
Apr 16 14:26:34.736406 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.736057 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n"
Apr 16 14:26:34.736406 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.736090 2580 scope.go:117] "RemoveContainer" containerID="8f43751db07593481018d3c558e02c48c1b599d8cf5ebe67715b2b156e4b27fd"
Apr 16 14:26:34.738903 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.738874 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" event={"ID":"9e845b36-1515-4741-8a77-993cd22eb17b","Type":"ContainerStarted","Data":"21760630dd46ed09ed07d7d1f26a946c9c40ec7c87726c516a548115251cb2a7"}
Apr 16 14:26:34.763702 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.762436 2580 scope.go:117] "RemoveContainer" containerID="7313d811c58921110a717ad3480e9fc50212c0ec4799e17c64659f3edb68a14d"
Apr 16 14:26:34.763930 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.763836 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" podStartSLOduration=7.763818621 podStartE2EDuration="7.763818621s" podCreationTimestamp="2026-04-16 14:26:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:26:34.760847881 +0000 UTC m=+1618.197013610" watchObservedRunningTime="2026-04-16 14:26:34.763818621 +0000 UTC m=+1618.199984349"
Apr 16 14:26:34.781092 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.781051 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dz5k\" (UniqueName: \"kubernetes.io/projected/2dee4b83-5bd8-4413-bb33-c80cd4852d01-kube-api-access-8dz5k\") pod \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") "
Apr 16 14:26:34.781256 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.781110 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-dshm\") pod \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") "
Apr 16 14:26:34.781256 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.781145 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-home\") pod \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") "
Apr 16 14:26:34.781256 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.781180 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2dee4b83-5bd8-4413-bb33-c80cd4852d01-tls-certs\") pod \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") "
Apr 16 14:26:34.781256 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.781210 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-kserve-provision-location\") pod \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") "
Apr 16 14:26:34.781879 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.781262 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-model-cache\") pod \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\" (UID: \"2dee4b83-5bd8-4413-bb33-c80cd4852d01\") "
Apr 16 14:26:34.781879 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.781561 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-home" (OuterVolumeSpecName: "home") pod "2dee4b83-5bd8-4413-bb33-c80cd4852d01" (UID: "2dee4b83-5bd8-4413-bb33-c80cd4852d01"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:26:34.781879 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.781648 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-model-cache" (OuterVolumeSpecName: "model-cache") pod "2dee4b83-5bd8-4413-bb33-c80cd4852d01" (UID: "2dee4b83-5bd8-4413-bb33-c80cd4852d01"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:26:34.784328 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.784285 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-dshm" (OuterVolumeSpecName: "dshm") pod "2dee4b83-5bd8-4413-bb33-c80cd4852d01" (UID: "2dee4b83-5bd8-4413-bb33-c80cd4852d01"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:26:34.784328 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.784297 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dee4b83-5bd8-4413-bb33-c80cd4852d01-kube-api-access-8dz5k" (OuterVolumeSpecName: "kube-api-access-8dz5k") pod "2dee4b83-5bd8-4413-bb33-c80cd4852d01" (UID: "2dee4b83-5bd8-4413-bb33-c80cd4852d01"). InnerVolumeSpecName "kube-api-access-8dz5k". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:26:34.796104 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.796074 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dee4b83-5bd8-4413-bb33-c80cd4852d01-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2dee4b83-5bd8-4413-bb33-c80cd4852d01" (UID: "2dee4b83-5bd8-4413-bb33-c80cd4852d01"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:26:34.851570 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.851541 2580 scope.go:117] "RemoveContainer" containerID="8f43751db07593481018d3c558e02c48c1b599d8cf5ebe67715b2b156e4b27fd"
Apr 16 14:26:34.851901 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:26:34.851872 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f43751db07593481018d3c558e02c48c1b599d8cf5ebe67715b2b156e4b27fd\": container with ID starting with 8f43751db07593481018d3c558e02c48c1b599d8cf5ebe67715b2b156e4b27fd not found: ID does not exist" containerID="8f43751db07593481018d3c558e02c48c1b599d8cf5ebe67715b2b156e4b27fd"
Apr 16 14:26:34.852028 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.851909 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f43751db07593481018d3c558e02c48c1b599d8cf5ebe67715b2b156e4b27fd"} err="failed to get container status \"8f43751db07593481018d3c558e02c48c1b599d8cf5ebe67715b2b156e4b27fd\": rpc error: code = NotFound desc = could not find container \"8f43751db07593481018d3c558e02c48c1b599d8cf5ebe67715b2b156e4b27fd\": container with ID starting with 8f43751db07593481018d3c558e02c48c1b599d8cf5ebe67715b2b156e4b27fd not found: ID does not exist"
Apr 16 14:26:34.852028 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.851935 2580 scope.go:117] "RemoveContainer" containerID="7313d811c58921110a717ad3480e9fc50212c0ec4799e17c64659f3edb68a14d"
Apr 16 14:26:34.852238 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:26:34.852199 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7313d811c58921110a717ad3480e9fc50212c0ec4799e17c64659f3edb68a14d\": container with ID starting with 7313d811c58921110a717ad3480e9fc50212c0ec4799e17c64659f3edb68a14d not found: ID does not exist" containerID="7313d811c58921110a717ad3480e9fc50212c0ec4799e17c64659f3edb68a14d"
Apr 16 14:26:34.852336 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.852247 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7313d811c58921110a717ad3480e9fc50212c0ec4799e17c64659f3edb68a14d"} err="failed to get container status \"7313d811c58921110a717ad3480e9fc50212c0ec4799e17c64659f3edb68a14d\": rpc error: code = NotFound desc = could not find container \"7313d811c58921110a717ad3480e9fc50212c0ec4799e17c64659f3edb68a14d\": container with ID starting with 7313d811c58921110a717ad3480e9fc50212c0ec4799e17c64659f3edb68a14d not found: ID does not exist"
Apr 16 14:26:34.871981 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.871940 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2dee4b83-5bd8-4413-bb33-c80cd4852d01" (UID: "2dee4b83-5bd8-4413-bb33-c80cd4852d01"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:26:34.882222 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.882181 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-dshm\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:26:34.882222 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.882205 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-home\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:26:34.882222 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.882214 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2dee4b83-5bd8-4413-bb33-c80cd4852d01-tls-certs\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:26:34.882222 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.882223 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-kserve-provision-location\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:26:34.882577 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.882233 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2dee4b83-5bd8-4413-bb33-c80cd4852d01-model-cache\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:26:34.882577 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:34.882246 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8dz5k\" (UniqueName: \"kubernetes.io/projected/2dee4b83-5bd8-4413-bb33-c80cd4852d01-kube-api-access-8dz5k\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\""
Apr 16 14:26:35.064440 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:35.064399 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n"]
Apr 16 14:26:35.068351 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:35.068325 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-77999bd997-djn6n"]
Apr 16 14:26:35.155938 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:35.155901 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dee4b83-5bd8-4413-bb33-c80cd4852d01" path="/var/lib/kubelet/pods/2dee4b83-5bd8-4413-bb33-c80cd4852d01/volumes"
Apr 16 14:26:37.743528 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:37.743453 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:37.744347 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:37.744323 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:37.744652 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:37.744615 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused"
Apr 16 14:26:37.758948 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:37.758912 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:37.759122 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:37.758962 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"
Apr 16 14:26:37.760382 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:37.760352 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused"
Apr 16 14:26:37.765132 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:37.765102 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"
Apr 16 14:26:47.743949 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:47.743888 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused"
Apr 16 14:26:47.759604 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:47.759564 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused"
Apr 16 14:26:57.743325 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:57.743256 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused"
Apr 16 14:26:57.758959 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:26:57.758920 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused"
Apr 16 14:27:07.744006 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:27:07.743957 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused"
Apr 16 14:27:07.759148 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:27:07.759084 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused"
Apr 16 14:27:17.743508 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:27:17.743447 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused"
Apr 16 14:27:17.758685 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:27:17.758646 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused"
Apr 16 14:27:27.743179 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:27:27.743129 2580
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 14:27:27.759866 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:27:27.759823 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 14:27:37.743202 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:27:37.743149 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 14:27:37.758689 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:27:37.758650 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 14:27:47.743826 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:27:47.743770 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 14:27:47.759372 ip-10-0-129-3 
kubenswrapper[2580]: I0416 14:27:47.759337 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 14:27:57.744090 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:27:57.743985 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 14:27:57.759431 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:27:57.759391 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 14:28:07.743138 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:28:07.743089 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 14:28:07.759133 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:28:07.759094 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection 
refused" Apr 16 14:28:17.743830 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:28:17.743780 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 14:28:17.759561 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:28:17.759525 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 14:28:27.743339 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:28:27.743286 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 14:28:27.758986 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:28:27.758948 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 14:28:37.743044 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:28:37.742992 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 
10.134.0.60:8001: connect: connection refused" Apr 16 14:28:37.759861 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:28:37.759821 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 14:28:47.743969 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:28:47.743918 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 14:28:47.758891 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:28:47.758853 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 14:28:57.743770 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:28:57.743715 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.60:8001/health\": dial tcp 10.134.0.60:8001: connect: connection refused" Apr 16 14:28:57.759197 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:28:57.759155 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.61:8000/health\": dial tcp 10.134.0.61:8000: connect: connection refused" Apr 16 14:29:07.752567 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:07.752536 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" Apr 16 14:29:07.764528 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:07.764502 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" Apr 16 14:29:07.768949 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:07.768919 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" Apr 16 14:29:07.777234 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:07.777209 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" Apr 16 14:29:20.466580 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:20.466543 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"] Apr 16 14:29:20.467052 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:20.466810 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="main" containerID="cri-o://8ecd65ab6626f2fc261884b8190ff150b2f4702e5e6151ee57696d7ea883dac9" gracePeriod=30 Apr 16 14:29:20.474736 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:20.474708 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"] Apr 16 14:29:20.475134 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:20.475077 2580 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="main" containerID="cri-o://21760630dd46ed09ed07d7d1f26a946c9c40ec7c87726c516a548115251cb2a7" gracePeriod=30 Apr 16 14:29:36.752873 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:36.752837 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wbg65_ce99772a-304e-4452-bff7-1497e94435cd/istio-proxy/0.log" Apr 16 14:29:36.789492 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:36.789460 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/main/0.log" Apr 16 14:29:36.796840 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:36.796815 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/llm-d-routing-sidecar/0.log" Apr 16 14:29:36.809420 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:36.809394 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/storage-initializer/0.log" Apr 16 14:29:36.829105 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:36.829075 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/main/0.log" Apr 16 14:29:36.838869 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:36.838842 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/storage-initializer/0.log" Apr 16 14:29:37.207670 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:37.207640 2580 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-mqmmd_49030659-7d98-49ee-844f-41ff4d22d449/console-operator/1.log" Apr 16 14:29:37.213450 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:37.213424 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-mqmmd_49030659-7d98-49ee-844f-41ff4d22d449/console-operator/1.log" Apr 16 14:29:37.846234 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:37.846205 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wbg65_ce99772a-304e-4452-bff7-1497e94435cd/istio-proxy/0.log" Apr 16 14:29:37.872608 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:37.872581 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/main/0.log" Apr 16 14:29:37.879506 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:37.879483 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/llm-d-routing-sidecar/0.log" Apr 16 14:29:37.890216 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:37.890200 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/storage-initializer/0.log" Apr 16 14:29:37.909520 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:37.909496 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/main/0.log" Apr 16 14:29:37.918539 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:37.918520 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/storage-initializer/0.log" Apr 16 14:29:38.916569 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:38.916540 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wbg65_ce99772a-304e-4452-bff7-1497e94435cd/istio-proxy/0.log" Apr 16 14:29:38.938554 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:38.938519 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/main/0.log" Apr 16 14:29:38.948365 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:38.948345 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/llm-d-routing-sidecar/0.log" Apr 16 14:29:38.960151 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:38.960120 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/storage-initializer/0.log" Apr 16 14:29:38.979142 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:38.979119 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/main/0.log" Apr 16 14:29:38.988691 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:38.988668 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/storage-initializer/0.log" Apr 16 14:29:39.955654 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:39.955615 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wbg65_ce99772a-304e-4452-bff7-1497e94435cd/istio-proxy/0.log" Apr 16 14:29:39.976645 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:39.976617 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/main/0.log" Apr 16 14:29:39.982810 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:39.982788 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/llm-d-routing-sidecar/0.log" Apr 16 14:29:39.992688 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:39.992667 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/storage-initializer/0.log" Apr 16 14:29:40.010475 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:40.010444 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/main/0.log" Apr 16 14:29:40.019002 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:40.018983 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/storage-initializer/0.log" Apr 16 14:29:40.987691 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:40.987663 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wbg65_ce99772a-304e-4452-bff7-1497e94435cd/istio-proxy/0.log" Apr 16 14:29:41.011844 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:41.011818 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/main/0.log" Apr 16 14:29:41.018430 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:41.018414 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/llm-d-routing-sidecar/0.log" Apr 16 14:29:41.029358 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:41.029342 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/storage-initializer/0.log" Apr 16 14:29:41.047564 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:41.047548 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/main/0.log" Apr 16 14:29:41.063862 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:41.063841 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/storage-initializer/0.log" Apr 16 14:29:42.027692 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:42.027667 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wbg65_ce99772a-304e-4452-bff7-1497e94435cd/istio-proxy/0.log" Apr 16 14:29:42.060713 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:42.060685 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/main/0.log" Apr 16 14:29:42.081954 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:42.081928 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/llm-d-routing-sidecar/0.log" Apr 16 14:29:42.101590 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:42.101549 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/storage-initializer/0.log" Apr 16 14:29:42.132106 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:42.132081 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/main/0.log" Apr 16 14:29:42.154763 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:42.154741 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/storage-initializer/0.log" Apr 16 14:29:43.151214 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:43.151170 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wbg65_ce99772a-304e-4452-bff7-1497e94435cd/istio-proxy/0.log" Apr 16 14:29:43.174500 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:43.174473 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/main/0.log" Apr 16 14:29:43.182690 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:43.182668 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/llm-d-routing-sidecar/0.log" Apr 16 14:29:43.193510 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:43.193488 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/storage-initializer/0.log" Apr 16 14:29:43.215632 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:43.215606 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/main/0.log" Apr 16 14:29:43.225260 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:43.225241 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/storage-initializer/0.log" Apr 16 14:29:44.219559 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:44.219532 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wbg65_ce99772a-304e-4452-bff7-1497e94435cd/istio-proxy/0.log" Apr 16 14:29:44.243121 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:44.243096 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/main/0.log" Apr 16 14:29:44.257224 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:44.257201 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/llm-d-routing-sidecar/0.log" Apr 16 14:29:44.273061 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:44.273004 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/storage-initializer/0.log" Apr 16 14:29:44.300163 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:44.300136 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/main/0.log" Apr 16 14:29:44.318639 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:44.318606 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/storage-initializer/0.log" Apr 16 14:29:45.343619 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:45.343595 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wbg65_ce99772a-304e-4452-bff7-1497e94435cd/istio-proxy/0.log" Apr 16 14:29:45.365318 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:45.365287 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/main/0.log" Apr 16 14:29:45.374483 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:45.374456 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/llm-d-routing-sidecar/0.log" Apr 16 14:29:45.385315 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:45.385295 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/storage-initializer/0.log" Apr 16 14:29:45.402806 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:45.402781 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/main/0.log" Apr 16 14:29:45.415511 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:45.415476 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/storage-initializer/0.log" Apr 16 14:29:46.375371 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:46.375338 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wbg65_ce99772a-304e-4452-bff7-1497e94435cd/istio-proxy/0.log" Apr 16 14:29:46.399205 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:46.399168 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/main/0.log" Apr 16 14:29:46.405594 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:46.405572 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/llm-d-routing-sidecar/0.log" Apr 16 14:29:46.416043 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:46.416023 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/storage-initializer/0.log" Apr 16 14:29:46.434557 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:46.434537 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/main/0.log" Apr 16 14:29:46.443290 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:46.443257 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/storage-initializer/0.log" Apr 16 14:29:47.408148 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:47.408118 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wbg65_ce99772a-304e-4452-bff7-1497e94435cd/istio-proxy/0.log" Apr 16 14:29:47.429309 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:47.429260 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/main/0.log" Apr 16 14:29:47.440009 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:47.439985 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/llm-d-routing-sidecar/0.log" Apr 16 14:29:47.455602 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:47.455577 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/storage-initializer/0.log" Apr 16 14:29:47.474846 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:47.474822 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/main/0.log" Apr 16 14:29:47.481758 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:47.481735 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/storage-initializer/0.log" Apr 16 14:29:48.490466 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:48.490433 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wbg65_ce99772a-304e-4452-bff7-1497e94435cd/istio-proxy/0.log" Apr 16 14:29:48.514895 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:48.514870 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/main/0.log" Apr 16 14:29:48.528227 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:48.528207 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/llm-d-routing-sidecar/0.log" Apr 16 14:29:48.538452 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:48.538419 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/storage-initializer/0.log" Apr 16 14:29:48.559466 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:48.559443 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/main/0.log" Apr 16 14:29:48.568449 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:48.568419 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/storage-initializer/0.log" Apr 16 14:29:49.598130 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:49.598100 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wbg65_ce99772a-304e-4452-bff7-1497e94435cd/istio-proxy/0.log" Apr 16 14:29:49.620713 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:49.620684 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/main/0.log" Apr 16 14:29:49.628223 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:49.628202 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/llm-d-routing-sidecar/0.log" Apr 16 14:29:49.638855 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:49.638832 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/storage-initializer/0.log" Apr 16 14:29:49.657252 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:49.657232 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/main/0.log" Apr 16 14:29:49.666958 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:49.666939 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/storage-initializer/0.log" Apr 16 14:29:50.475923 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.475849 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="llm-d-routing-sidecar" containerID="cri-o://c0ca5e3249cea969ec1829d8d646e3020ce684adcca1be6aa374138041f9a452" gracePeriod=2 Apr 16 14:29:50.644244 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.644211 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-wbg65_ce99772a-304e-4452-bff7-1497e94435cd/istio-proxy/0.log" Apr 16 14:29:50.665446 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.665420 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/main/0.log" Apr 16 14:29:50.671791 ip-10-0-129-3 kubenswrapper[2580]: I0416 
14:29:50.671768 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/llm-d-routing-sidecar/0.log" Apr 16 14:29:50.681674 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.681648 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/storage-initializer/0.log" Apr 16 14:29:50.700463 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.700439 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/main/0.log" Apr 16 14:29:50.709896 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.709876 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9_035ca082-6c53-4429-ba8f-c54e6cbfbc94/storage-initializer/0.log" Apr 16 14:29:50.894756 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.894732 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/main/0.log" Apr 16 14:29:50.895398 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.895381 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" Apr 16 14:29:50.898077 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.898062 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" Apr 16 14:29:50.988555 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.988456 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-model-cache\") pod \"9e845b36-1515-4741-8a77-993cd22eb17b\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " Apr 16 14:29:50.988555 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.988517 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-dshm\") pod \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " Apr 16 14:29:50.988555 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.988546 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-model-cache\") pod \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " Apr 16 14:29:50.988852 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.988567 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-dshm\") pod \"9e845b36-1515-4741-8a77-993cd22eb17b\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " Apr 16 14:29:50.988852 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.988591 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-home\") pod \"9e845b36-1515-4741-8a77-993cd22eb17b\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " Apr 16 14:29:50.988852 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.988609 2580 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9e845b36-1515-4741-8a77-993cd22eb17b-tls-certs\") pod \"9e845b36-1515-4741-8a77-993cd22eb17b\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " Apr 16 14:29:50.988852 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.988628 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwjtj\" (UniqueName: \"kubernetes.io/projected/9e845b36-1515-4741-8a77-993cd22eb17b-kube-api-access-xwjtj\") pod \"9e845b36-1515-4741-8a77-993cd22eb17b\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " Apr 16 14:29:50.988852 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.988659 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-kserve-provision-location\") pod \"9e845b36-1515-4741-8a77-993cd22eb17b\" (UID: \"9e845b36-1515-4741-8a77-993cd22eb17b\") " Apr 16 14:29:50.988852 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.988684 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-home\") pod \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " Apr 16 14:29:50.988852 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.988832 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-model-cache" (OuterVolumeSpecName: "model-cache") pod "9e845b36-1515-4741-8a77-993cd22eb17b" (UID: "9e845b36-1515-4741-8a77-993cd22eb17b"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:29:50.989218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.988865 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-model-cache" (OuterVolumeSpecName: "model-cache") pod "035ca082-6c53-4429-ba8f-c54e6cbfbc94" (UID: "035ca082-6c53-4429-ba8f-c54e6cbfbc94"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:29:50.989218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.989113 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lqfs\" (UniqueName: \"kubernetes.io/projected/035ca082-6c53-4429-ba8f-c54e6cbfbc94-kube-api-access-5lqfs\") pod \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " Apr 16 14:29:50.989218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.989128 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-home" (OuterVolumeSpecName: "home") pod "9e845b36-1515-4741-8a77-993cd22eb17b" (UID: "9e845b36-1515-4741-8a77-993cd22eb17b"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:29:50.989218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.989133 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-home" (OuterVolumeSpecName: "home") pod "035ca082-6c53-4429-ba8f-c54e6cbfbc94" (UID: "035ca082-6c53-4429-ba8f-c54e6cbfbc94"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:29:50.989218 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.989165 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/035ca082-6c53-4429-ba8f-c54e6cbfbc94-tls-certs\") pod \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " Apr 16 14:29:50.989523 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.989225 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-kserve-provision-location\") pod \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\" (UID: \"035ca082-6c53-4429-ba8f-c54e6cbfbc94\") " Apr 16 14:29:50.989582 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.989542 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-model-cache\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:29:50.989582 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.989562 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-home\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:29:50.989582 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.989574 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-home\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:29:50.989733 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.989587 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-model-cache\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 
14:29:50.991718 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.991672 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-dshm" (OuterVolumeSpecName: "dshm") pod "035ca082-6c53-4429-ba8f-c54e6cbfbc94" (UID: "035ca082-6c53-4429-ba8f-c54e6cbfbc94"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:29:50.991843 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.991751 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/035ca082-6c53-4429-ba8f-c54e6cbfbc94-kube-api-access-5lqfs" (OuterVolumeSpecName: "kube-api-access-5lqfs") pod "035ca082-6c53-4429-ba8f-c54e6cbfbc94" (UID: "035ca082-6c53-4429-ba8f-c54e6cbfbc94"). InnerVolumeSpecName "kube-api-access-5lqfs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:29:50.991843 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.991786 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-dshm" (OuterVolumeSpecName: "dshm") pod "9e845b36-1515-4741-8a77-993cd22eb17b" (UID: "9e845b36-1515-4741-8a77-993cd22eb17b"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:29:50.992084 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.992067 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e845b36-1515-4741-8a77-993cd22eb17b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9e845b36-1515-4741-8a77-993cd22eb17b" (UID: "9e845b36-1515-4741-8a77-993cd22eb17b"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:29:50.992135 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.992089 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/035ca082-6c53-4429-ba8f-c54e6cbfbc94-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "035ca082-6c53-4429-ba8f-c54e6cbfbc94" (UID: "035ca082-6c53-4429-ba8f-c54e6cbfbc94"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:29:50.992490 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:50.992467 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e845b36-1515-4741-8a77-993cd22eb17b-kube-api-access-xwjtj" (OuterVolumeSpecName: "kube-api-access-xwjtj") pod "9e845b36-1515-4741-8a77-993cd22eb17b" (UID: "9e845b36-1515-4741-8a77-993cd22eb17b"). InnerVolumeSpecName "kube-api-access-xwjtj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:29:51.058536 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.058483 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9e845b36-1515-4741-8a77-993cd22eb17b" (UID: "9e845b36-1515-4741-8a77-993cd22eb17b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:29:51.064488 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.064451 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "035ca082-6c53-4429-ba8f-c54e6cbfbc94" (UID: "035ca082-6c53-4429-ba8f-c54e6cbfbc94"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:29:51.090233 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.090196 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-kserve-provision-location\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:29:51.090233 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.090230 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/035ca082-6c53-4429-ba8f-c54e6cbfbc94-dshm\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:29:51.090233 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.090239 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-dshm\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:29:51.090474 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.090247 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9e845b36-1515-4741-8a77-993cd22eb17b-tls-certs\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:29:51.090474 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.090256 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xwjtj\" (UniqueName: \"kubernetes.io/projected/9e845b36-1515-4741-8a77-993cd22eb17b-kube-api-access-xwjtj\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:29:51.090474 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.090290 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e845b36-1515-4741-8a77-993cd22eb17b-kserve-provision-location\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:29:51.090474 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.090300 2580 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5lqfs\" (UniqueName: \"kubernetes.io/projected/035ca082-6c53-4429-ba8f-c54e6cbfbc94-kube-api-access-5lqfs\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:29:51.090474 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.090308 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/035ca082-6c53-4429-ba8f-c54e6cbfbc94-tls-certs\") on node \"ip-10-0-129-3.ec2.internal\" DevicePath \"\"" Apr 16 14:29:51.571433 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.571388 2580 generic.go:358] "Generic (PLEG): container finished" podID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerID="8ecd65ab6626f2fc261884b8190ff150b2f4702e5e6151ee57696d7ea883dac9" exitCode=137 Apr 16 14:29:51.571628 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.571489 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" Apr 16 14:29:51.571628 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.571512 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" event={"ID":"035ca082-6c53-4429-ba8f-c54e6cbfbc94","Type":"ContainerDied","Data":"8ecd65ab6626f2fc261884b8190ff150b2f4702e5e6151ee57696d7ea883dac9"} Apr 16 14:29:51.571628 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.571555 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9" event={"ID":"035ca082-6c53-4429-ba8f-c54e6cbfbc94","Type":"ContainerDied","Data":"0cd52135be85d473059db4028a36dd4c2bd449e962cf5127871e0bf98bbdeba5"} Apr 16 14:29:51.571628 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.571572 2580 scope.go:117] "RemoveContainer" containerID="8ecd65ab6626f2fc261884b8190ff150b2f4702e5e6151ee57696d7ea883dac9" Apr 16 
14:29:51.572922 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.572905 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-559fccd7cb-kwfkq_9e845b36-1515-4741-8a77-993cd22eb17b/main/0.log" Apr 16 14:29:51.573590 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.573564 2580 generic.go:358] "Generic (PLEG): container finished" podID="9e845b36-1515-4741-8a77-993cd22eb17b" containerID="21760630dd46ed09ed07d7d1f26a946c9c40ec7c87726c516a548115251cb2a7" exitCode=137 Apr 16 14:29:51.573590 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.573588 2580 generic.go:358] "Generic (PLEG): container finished" podID="9e845b36-1515-4741-8a77-993cd22eb17b" containerID="c0ca5e3249cea969ec1829d8d646e3020ce684adcca1be6aa374138041f9a452" exitCode=0 Apr 16 14:29:51.573782 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.573654 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" event={"ID":"9e845b36-1515-4741-8a77-993cd22eb17b","Type":"ContainerDied","Data":"21760630dd46ed09ed07d7d1f26a946c9c40ec7c87726c516a548115251cb2a7"} Apr 16 14:29:51.573782 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.573679 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" event={"ID":"9e845b36-1515-4741-8a77-993cd22eb17b","Type":"ContainerDied","Data":"c0ca5e3249cea969ec1829d8d646e3020ce684adcca1be6aa374138041f9a452"} Apr 16 14:29:51.573782 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.573696 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" event={"ID":"9e845b36-1515-4741-8a77-993cd22eb17b","Type":"ContainerDied","Data":"d89814b19d4e17fab07d59d62dda3b9473c307b1cc9879f511b4b225245498b3"} Apr 16 14:29:51.573782 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.573712 2580 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq" Apr 16 14:29:51.592625 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.592258 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"] Apr 16 14:29:51.592625 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.592467 2580 scope.go:117] "RemoveContainer" containerID="26d9a4067d610c3019af3ca84e302a3621f8e60c3852a2fb58d4a767cfa8b008" Apr 16 14:29:51.596396 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.596372 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-559fccd7cb-kwfkq"] Apr 16 14:29:51.606737 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.606706 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"] Apr 16 14:29:51.610146 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.610121 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7ff6579bf7-msls9"] Apr 16 14:29:51.665370 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.665345 2580 scope.go:117] "RemoveContainer" containerID="8ecd65ab6626f2fc261884b8190ff150b2f4702e5e6151ee57696d7ea883dac9" Apr 16 14:29:51.665712 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:29:51.665693 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ecd65ab6626f2fc261884b8190ff150b2f4702e5e6151ee57696d7ea883dac9\": container with ID starting with 8ecd65ab6626f2fc261884b8190ff150b2f4702e5e6151ee57696d7ea883dac9 not found: ID does not exist" containerID="8ecd65ab6626f2fc261884b8190ff150b2f4702e5e6151ee57696d7ea883dac9" Apr 16 14:29:51.665756 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.665723 2580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8ecd65ab6626f2fc261884b8190ff150b2f4702e5e6151ee57696d7ea883dac9"} err="failed to get container status \"8ecd65ab6626f2fc261884b8190ff150b2f4702e5e6151ee57696d7ea883dac9\": rpc error: code = NotFound desc = could not find container \"8ecd65ab6626f2fc261884b8190ff150b2f4702e5e6151ee57696d7ea883dac9\": container with ID starting with 8ecd65ab6626f2fc261884b8190ff150b2f4702e5e6151ee57696d7ea883dac9 not found: ID does not exist" Apr 16 14:29:51.665756 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.665746 2580 scope.go:117] "RemoveContainer" containerID="26d9a4067d610c3019af3ca84e302a3621f8e60c3852a2fb58d4a767cfa8b008" Apr 16 14:29:51.666043 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:29:51.666023 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d9a4067d610c3019af3ca84e302a3621f8e60c3852a2fb58d4a767cfa8b008\": container with ID starting with 26d9a4067d610c3019af3ca84e302a3621f8e60c3852a2fb58d4a767cfa8b008 not found: ID does not exist" containerID="26d9a4067d610c3019af3ca84e302a3621f8e60c3852a2fb58d4a767cfa8b008" Apr 16 14:29:51.666100 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.666050 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d9a4067d610c3019af3ca84e302a3621f8e60c3852a2fb58d4a767cfa8b008"} err="failed to get container status \"26d9a4067d610c3019af3ca84e302a3621f8e60c3852a2fb58d4a767cfa8b008\": rpc error: code = NotFound desc = could not find container \"26d9a4067d610c3019af3ca84e302a3621f8e60c3852a2fb58d4a767cfa8b008\": container with ID starting with 26d9a4067d610c3019af3ca84e302a3621f8e60c3852a2fb58d4a767cfa8b008 not found: ID does not exist" Apr 16 14:29:51.666100 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.666067 2580 scope.go:117] "RemoveContainer" containerID="21760630dd46ed09ed07d7d1f26a946c9c40ec7c87726c516a548115251cb2a7" Apr 16 14:29:51.686646 ip-10-0-129-3 
kubenswrapper[2580]: I0416 14:29:51.686626 2580 scope.go:117] "RemoveContainer" containerID="36e9be3ab6226725f4aa608c7c9217437015547a8335db28208350681746da80" Apr 16 14:29:51.756079 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.756056 2580 scope.go:117] "RemoveContainer" containerID="c0ca5e3249cea969ec1829d8d646e3020ce684adcca1be6aa374138041f9a452" Apr 16 14:29:51.764358 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.764326 2580 scope.go:117] "RemoveContainer" containerID="21760630dd46ed09ed07d7d1f26a946c9c40ec7c87726c516a548115251cb2a7" Apr 16 14:29:51.764664 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:29:51.764644 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21760630dd46ed09ed07d7d1f26a946c9c40ec7c87726c516a548115251cb2a7\": container with ID starting with 21760630dd46ed09ed07d7d1f26a946c9c40ec7c87726c516a548115251cb2a7 not found: ID does not exist" containerID="21760630dd46ed09ed07d7d1f26a946c9c40ec7c87726c516a548115251cb2a7" Apr 16 14:29:51.764745 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.764677 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21760630dd46ed09ed07d7d1f26a946c9c40ec7c87726c516a548115251cb2a7"} err="failed to get container status \"21760630dd46ed09ed07d7d1f26a946c9c40ec7c87726c516a548115251cb2a7\": rpc error: code = NotFound desc = could not find container \"21760630dd46ed09ed07d7d1f26a946c9c40ec7c87726c516a548115251cb2a7\": container with ID starting with 21760630dd46ed09ed07d7d1f26a946c9c40ec7c87726c516a548115251cb2a7 not found: ID does not exist" Apr 16 14:29:51.764745 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.764705 2580 scope.go:117] "RemoveContainer" containerID="36e9be3ab6226725f4aa608c7c9217437015547a8335db28208350681746da80" Apr 16 14:29:51.764959 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:29:51.764940 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"36e9be3ab6226725f4aa608c7c9217437015547a8335db28208350681746da80\": container with ID starting with 36e9be3ab6226725f4aa608c7c9217437015547a8335db28208350681746da80 not found: ID does not exist" containerID="36e9be3ab6226725f4aa608c7c9217437015547a8335db28208350681746da80"
Apr 16 14:29:51.765006 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.764966 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36e9be3ab6226725f4aa608c7c9217437015547a8335db28208350681746da80"} err="failed to get container status \"36e9be3ab6226725f4aa608c7c9217437015547a8335db28208350681746da80\": rpc error: code = NotFound desc = could not find container \"36e9be3ab6226725f4aa608c7c9217437015547a8335db28208350681746da80\": container with ID starting with 36e9be3ab6226725f4aa608c7c9217437015547a8335db28208350681746da80 not found: ID does not exist"
Apr 16 14:29:51.765006 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.764983 2580 scope.go:117] "RemoveContainer" containerID="c0ca5e3249cea969ec1829d8d646e3020ce684adcca1be6aa374138041f9a452"
Apr 16 14:29:51.765258 ip-10-0-129-3 kubenswrapper[2580]: E0416 14:29:51.765239 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0ca5e3249cea969ec1829d8d646e3020ce684adcca1be6aa374138041f9a452\": container with ID starting with c0ca5e3249cea969ec1829d8d646e3020ce684adcca1be6aa374138041f9a452 not found: ID does not exist" containerID="c0ca5e3249cea969ec1829d8d646e3020ce684adcca1be6aa374138041f9a452"
Apr 16 14:29:51.765356 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.765280 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0ca5e3249cea969ec1829d8d646e3020ce684adcca1be6aa374138041f9a452"} err="failed to get container status \"c0ca5e3249cea969ec1829d8d646e3020ce684adcca1be6aa374138041f9a452\": rpc error: code = NotFound desc = could not find container \"c0ca5e3249cea969ec1829d8d646e3020ce684adcca1be6aa374138041f9a452\": container with ID starting with c0ca5e3249cea969ec1829d8d646e3020ce684adcca1be6aa374138041f9a452 not found: ID does not exist"
Apr 16 14:29:51.765356 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.765300 2580 scope.go:117] "RemoveContainer" containerID="21760630dd46ed09ed07d7d1f26a946c9c40ec7c87726c516a548115251cb2a7"
Apr 16 14:29:51.765578 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.765529 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21760630dd46ed09ed07d7d1f26a946c9c40ec7c87726c516a548115251cb2a7"} err="failed to get container status \"21760630dd46ed09ed07d7d1f26a946c9c40ec7c87726c516a548115251cb2a7\": rpc error: code = NotFound desc = could not find container \"21760630dd46ed09ed07d7d1f26a946c9c40ec7c87726c516a548115251cb2a7\": container with ID starting with 21760630dd46ed09ed07d7d1f26a946c9c40ec7c87726c516a548115251cb2a7 not found: ID does not exist"
Apr 16 14:29:51.765578 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.765546 2580 scope.go:117] "RemoveContainer" containerID="36e9be3ab6226725f4aa608c7c9217437015547a8335db28208350681746da80"
Apr 16 14:29:51.765814 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.765789 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36e9be3ab6226725f4aa608c7c9217437015547a8335db28208350681746da80"} err="failed to get container status \"36e9be3ab6226725f4aa608c7c9217437015547a8335db28208350681746da80\": rpc error: code = NotFound desc = could not find container \"36e9be3ab6226725f4aa608c7c9217437015547a8335db28208350681746da80\": container with ID starting with 36e9be3ab6226725f4aa608c7c9217437015547a8335db28208350681746da80 not found: ID does not exist"
Apr 16 14:29:51.765888 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.765818 2580 scope.go:117] "RemoveContainer" containerID="c0ca5e3249cea969ec1829d8d646e3020ce684adcca1be6aa374138041f9a452"
Apr 16 14:29:51.766069 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:51.766047 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0ca5e3249cea969ec1829d8d646e3020ce684adcca1be6aa374138041f9a452"} err="failed to get container status \"c0ca5e3249cea969ec1829d8d646e3020ce684adcca1be6aa374138041f9a452\": rpc error: code = NotFound desc = could not find container \"c0ca5e3249cea969ec1829d8d646e3020ce684adcca1be6aa374138041f9a452\": container with ID starting with c0ca5e3249cea969ec1829d8d646e3020ce684adcca1be6aa374138041f9a452 not found: ID does not exist"
Apr 16 14:29:53.156898 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:53.156867 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" path="/var/lib/kubelet/pods/035ca082-6c53-4429-ba8f-c54e6cbfbc94/volumes"
Apr 16 14:29:53.157321 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:53.157304 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" path="/var/lib/kubelet/pods/9e845b36-1515-4741-8a77-993cd22eb17b/volumes"
Apr 16 14:29:53.467007 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:53.466916 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-st69q_8efcabe6-6814-420b-9c29-97a711033251/manager/0.log"
Apr 16 14:29:53.481873 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:53.481848 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-l48sb_68459a75-dbc4-442c-9ed9-f18277a1b21d/kuadrant-console-plugin/0.log"
Apr 16 14:29:53.557033 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:53.557005 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-m2pvj_69a494be-503e-4309-a889-aed428c35e00/limitador/0.log"
Apr 16 14:29:55.911179 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.911141 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8g7hp/must-gather-5cd7l"]
Apr 16 14:29:55.914299 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.912496 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="llm-d-routing-sidecar"
Apr 16 14:29:55.914299 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.912527 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="llm-d-routing-sidecar"
Apr 16 14:29:55.914299 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.912550 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="storage-initializer"
Apr 16 14:29:55.914299 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.912560 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="storage-initializer"
Apr 16 14:29:55.914299 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.912571 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="main"
Apr 16 14:29:55.914299 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.912580 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="main"
Apr 16 14:29:55.914299 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.912603 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="main"
Apr 16 14:29:55.914299 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.912611 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="main"
Apr 16 14:29:55.914299 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.912650 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2dee4b83-5bd8-4413-bb33-c80cd4852d01" containerName="storage-initializer"
Apr 16 14:29:55.914299 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.912659 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dee4b83-5bd8-4413-bb33-c80cd4852d01" containerName="storage-initializer"
Apr 16 14:29:55.914299 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.912669 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="storage-initializer"
Apr 16 14:29:55.914299 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.912677 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="storage-initializer"
Apr 16 14:29:55.914299 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.912710 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2dee4b83-5bd8-4413-bb33-c80cd4852d01" containerName="main"
Apr 16 14:29:55.914299 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.912718 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dee4b83-5bd8-4413-bb33-c80cd4852d01" containerName="main"
Apr 16 14:29:55.914299 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.912871 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="main"
Apr 16 14:29:55.914299 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.912893 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e845b36-1515-4741-8a77-993cd22eb17b" containerName="llm-d-routing-sidecar"
Apr 16 14:29:55.914299 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.912902 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="2dee4b83-5bd8-4413-bb33-c80cd4852d01" containerName="main"
Apr 16 14:29:55.914299 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.912923 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="035ca082-6c53-4429-ba8f-c54e6cbfbc94" containerName="main"
Apr 16 14:29:55.923182 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.923147 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8g7hp/must-gather-5cd7l"]
Apr 16 14:29:55.923334 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.923250 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8g7hp/must-gather-5cd7l"
Apr 16 14:29:55.925870 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.925848 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8g7hp\"/\"openshift-service-ca.crt\""
Apr 16 14:29:55.926718 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.926700 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8g7hp\"/\"kube-root-ca.crt\""
Apr 16 14:29:55.926836 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:55.926705 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8g7hp\"/\"default-dockercfg-9xm4f\""
Apr 16 14:29:56.034382 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:56.034342 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlvw9\" (UniqueName: \"kubernetes.io/projected/05037ad6-4ae7-4774-a17c-5790987e20cd-kube-api-access-nlvw9\") pod \"must-gather-5cd7l\" (UID: \"05037ad6-4ae7-4774-a17c-5790987e20cd\") " pod="openshift-must-gather-8g7hp/must-gather-5cd7l"
Apr 16 14:29:56.034561 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:56.034415 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/05037ad6-4ae7-4774-a17c-5790987e20cd-must-gather-output\") pod \"must-gather-5cd7l\" (UID: \"05037ad6-4ae7-4774-a17c-5790987e20cd\") " pod="openshift-must-gather-8g7hp/must-gather-5cd7l"
Apr 16 14:29:56.135036 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:56.134997 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nlvw9\" (UniqueName: \"kubernetes.io/projected/05037ad6-4ae7-4774-a17c-5790987e20cd-kube-api-access-nlvw9\") pod \"must-gather-5cd7l\" (UID: \"05037ad6-4ae7-4774-a17c-5790987e20cd\") " pod="openshift-must-gather-8g7hp/must-gather-5cd7l"
Apr 16 14:29:56.135241 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:56.135062 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/05037ad6-4ae7-4774-a17c-5790987e20cd-must-gather-output\") pod \"must-gather-5cd7l\" (UID: \"05037ad6-4ae7-4774-a17c-5790987e20cd\") " pod="openshift-must-gather-8g7hp/must-gather-5cd7l"
Apr 16 14:29:56.135447 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:56.135427 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/05037ad6-4ae7-4774-a17c-5790987e20cd-must-gather-output\") pod \"must-gather-5cd7l\" (UID: \"05037ad6-4ae7-4774-a17c-5790987e20cd\") " pod="openshift-must-gather-8g7hp/must-gather-5cd7l"
Apr 16 14:29:56.143490 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:56.143466 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlvw9\" (UniqueName: \"kubernetes.io/projected/05037ad6-4ae7-4774-a17c-5790987e20cd-kube-api-access-nlvw9\") pod \"must-gather-5cd7l\" (UID: \"05037ad6-4ae7-4774-a17c-5790987e20cd\") " pod="openshift-must-gather-8g7hp/must-gather-5cd7l"
Apr 16 14:29:56.233015 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:56.232926 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8g7hp/must-gather-5cd7l"
Apr 16 14:29:56.364361 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:56.364328 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8g7hp/must-gather-5cd7l"]
Apr 16 14:29:56.366102 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:29:56.366075 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05037ad6_4ae7_4774_a17c_5790987e20cd.slice/crio-49166ddc4f426f5907b14582b84aa90690272c6bc1f81796bed1a3ed029f948c WatchSource:0}: Error finding container 49166ddc4f426f5907b14582b84aa90690272c6bc1f81796bed1a3ed029f948c: Status 404 returned error can't find the container with id 49166ddc4f426f5907b14582b84aa90690272c6bc1f81796bed1a3ed029f948c
Apr 16 14:29:56.368306 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:56.368289 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:29:56.596833 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:56.596739 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8g7hp/must-gather-5cd7l" event={"ID":"05037ad6-4ae7-4774-a17c-5790987e20cd","Type":"ContainerStarted","Data":"49166ddc4f426f5907b14582b84aa90690272c6bc1f81796bed1a3ed029f948c"}
Apr 16 14:29:57.601699 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:57.601661 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8g7hp/must-gather-5cd7l" event={"ID":"05037ad6-4ae7-4774-a17c-5790987e20cd","Type":"ContainerStarted","Data":"392097664e242e60b751984257eb6f800b9f540889bccbd688b9fc680fe17f10"}
Apr 16 14:29:58.607949 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:58.607915 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8g7hp/must-gather-5cd7l" event={"ID":"05037ad6-4ae7-4774-a17c-5790987e20cd","Type":"ContainerStarted","Data":"4bfe58955371aaba156241fda2a6ec80cb4664add6a86544b0eec0b0ceb6b188"}
Apr 16 14:29:58.628503 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:29:58.628447 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8g7hp/must-gather-5cd7l" podStartSLOduration=2.569008369 podStartE2EDuration="3.628432543s" podCreationTimestamp="2026-04-16 14:29:55 +0000 UTC" firstStartedPulling="2026-04-16 14:29:56.368421627 +0000 UTC m=+1819.804587338" lastFinishedPulling="2026-04-16 14:29:57.427845797 +0000 UTC m=+1820.864011512" observedRunningTime="2026-04-16 14:29:58.626055851 +0000 UTC m=+1822.062221579" watchObservedRunningTime="2026-04-16 14:29:58.628432543 +0000 UTC m=+1822.064598271"
Apr 16 14:30:01.172462 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:01.172421 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-245wx_043c6d8a-c412-4df7-8745-09796830b9f1/global-pull-secret-syncer/0.log"
Apr 16 14:30:01.270322 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:01.270291 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-69zrg_4ae873fc-9131-409f-a02c-21eb56f20fed/konnectivity-agent/0.log"
Apr 16 14:30:01.372366 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:01.372340 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-3.ec2.internal_0a36747d36072480352ac0833ae0f93c/haproxy/0.log"
Apr 16 14:30:05.498211 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:05.498064 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-st69q_8efcabe6-6814-420b-9c29-97a711033251/manager/0.log"
Apr 16 14:30:05.532243 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:05.532208 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-l48sb_68459a75-dbc4-442c-9ed9-f18277a1b21d/kuadrant-console-plugin/0.log"
Apr 16 14:30:05.623135 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:05.623104 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-m2pvj_69a494be-503e-4309-a889-aed428c35e00/limitador/0.log"
Apr 16 14:30:06.836773 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:06.836733 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-ppndx_d9bf088f-26e6-41c0-bdc3-ea00c62c1255/cluster-monitoring-operator/0.log"
Apr 16 14:30:07.157042 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:07.157010 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qkptn_7127a2d6-0603-465d-a090-ca81178ba98d/node-exporter/0.log"
Apr 16 14:30:07.179939 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:07.179858 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qkptn_7127a2d6-0603-465d-a090-ca81178ba98d/kube-rbac-proxy/0.log"
Apr 16 14:30:07.203195 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:07.203171 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qkptn_7127a2d6-0603-465d-a090-ca81178ba98d/init-textfile/0.log"
Apr 16 14:30:07.238940 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:07.238891 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-gc9ff_48c8c7d4-4455-4581-866d-80f4d5c04319/kube-rbac-proxy-main/0.log"
Apr 16 14:30:07.266577 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:07.266549 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-gc9ff_48c8c7d4-4455-4581-866d-80f4d5c04319/kube-rbac-proxy-self/0.log"
Apr 16 14:30:07.289340 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:07.289315 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-gc9ff_48c8c7d4-4455-4581-866d-80f4d5c04319/openshift-state-metrics/0.log"
Apr 16 14:30:07.681831 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:07.681521 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8554b6c6b6-hkhrj_86145381-870a-416e-a665-1ef1232225b4/thanos-query/0.log"
Apr 16 14:30:07.703896 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:07.703871 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8554b6c6b6-hkhrj_86145381-870a-416e-a665-1ef1232225b4/kube-rbac-proxy-web/0.log"
Apr 16 14:30:07.730751 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:07.730727 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8554b6c6b6-hkhrj_86145381-870a-416e-a665-1ef1232225b4/kube-rbac-proxy/0.log"
Apr 16 14:30:07.764984 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:07.764938 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8554b6c6b6-hkhrj_86145381-870a-416e-a665-1ef1232225b4/prom-label-proxy/0.log"
Apr 16 14:30:07.800819 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:07.800758 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8554b6c6b6-hkhrj_86145381-870a-416e-a665-1ef1232225b4/kube-rbac-proxy-rules/0.log"
Apr 16 14:30:07.826654 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:07.826627 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8554b6c6b6-hkhrj_86145381-870a-416e-a665-1ef1232225b4/kube-rbac-proxy-metrics/0.log"
Apr 16 14:30:09.746937 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:09.746909 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-mqmmd_49030659-7d98-49ee-844f-41ff4d22d449/console-operator/1.log"
Apr 16 14:30:09.754685 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:09.754654 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-mqmmd_49030659-7d98-49ee-844f-41ff4d22d449/console-operator/2.log"
Apr 16 14:30:10.200829 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.200790 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cdfcfc699-dlkvn_60be6dad-0fd9-4103-b4b3-7e33196de659/console/0.log"
Apr 16 14:30:10.233682 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.233651 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-x7nxr_6811b550-4e43-4c2d-a27b-a962e50de90d/download-server/0.log"
Apr 16 14:30:10.363989 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.363951 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"]
Apr 16 14:30:10.370137 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.370107 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"
Apr 16 14:30:10.377255 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.377219 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"]
Apr 16 14:30:10.506373 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.506255 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8aea3725-7bd9-4743-8441-359187a22db1-proc\") pod \"perf-node-gather-daemonset-gcnjh\" (UID: \"8aea3725-7bd9-4743-8441-359187a22db1\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"
Apr 16 14:30:10.506373 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.506334 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8aea3725-7bd9-4743-8441-359187a22db1-podres\") pod \"perf-node-gather-daemonset-gcnjh\" (UID: \"8aea3725-7bd9-4743-8441-359187a22db1\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"
Apr 16 14:30:10.506623 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.506407 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm756\" (UniqueName: \"kubernetes.io/projected/8aea3725-7bd9-4743-8441-359187a22db1-kube-api-access-jm756\") pod \"perf-node-gather-daemonset-gcnjh\" (UID: \"8aea3725-7bd9-4743-8441-359187a22db1\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"
Apr 16 14:30:10.506623 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.506522 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8aea3725-7bd9-4743-8441-359187a22db1-sys\") pod \"perf-node-gather-daemonset-gcnjh\" (UID: \"8aea3725-7bd9-4743-8441-359187a22db1\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"
Apr 16 14:30:10.506623 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.506549 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8aea3725-7bd9-4743-8441-359187a22db1-lib-modules\") pod \"perf-node-gather-daemonset-gcnjh\" (UID: \"8aea3725-7bd9-4743-8441-359187a22db1\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"
Apr 16 14:30:10.608077 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.608037 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8aea3725-7bd9-4743-8441-359187a22db1-proc\") pod \"perf-node-gather-daemonset-gcnjh\" (UID: \"8aea3725-7bd9-4743-8441-359187a22db1\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"
Apr 16 14:30:10.608374 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.608339 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8aea3725-7bd9-4743-8441-359187a22db1-podres\") pod \"perf-node-gather-daemonset-gcnjh\" (UID: \"8aea3725-7bd9-4743-8441-359187a22db1\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"
Apr 16 14:30:10.608609 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.608593 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jm756\" (UniqueName: \"kubernetes.io/projected/8aea3725-7bd9-4743-8441-359187a22db1-kube-api-access-jm756\") pod \"perf-node-gather-daemonset-gcnjh\" (UID: \"8aea3725-7bd9-4743-8441-359187a22db1\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"
Apr 16 14:30:10.608846 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.608817 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8aea3725-7bd9-4743-8441-359187a22db1-proc\") pod \"perf-node-gather-daemonset-gcnjh\" (UID: \"8aea3725-7bd9-4743-8441-359187a22db1\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"
Apr 16 14:30:10.608946 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.608874 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8aea3725-7bd9-4743-8441-359187a22db1-podres\") pod \"perf-node-gather-daemonset-gcnjh\" (UID: \"8aea3725-7bd9-4743-8441-359187a22db1\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"
Apr 16 14:30:10.608946 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.608872 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8aea3725-7bd9-4743-8441-359187a22db1-sys\") pod \"perf-node-gather-daemonset-gcnjh\" (UID: \"8aea3725-7bd9-4743-8441-359187a22db1\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"
Apr 16 14:30:10.608946 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.608914 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8aea3725-7bd9-4743-8441-359187a22db1-sys\") pod \"perf-node-gather-daemonset-gcnjh\" (UID: \"8aea3725-7bd9-4743-8441-359187a22db1\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"
Apr 16 14:30:10.608946 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.608924 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8aea3725-7bd9-4743-8441-359187a22db1-lib-modules\") pod \"perf-node-gather-daemonset-gcnjh\" (UID: \"8aea3725-7bd9-4743-8441-359187a22db1\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"
Apr 16 14:30:10.609085 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.609013 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8aea3725-7bd9-4743-8441-359187a22db1-lib-modules\") pod \"perf-node-gather-daemonset-gcnjh\" (UID: \"8aea3725-7bd9-4743-8441-359187a22db1\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"
Apr 16 14:30:10.616848 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.616821 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm756\" (UniqueName: \"kubernetes.io/projected/8aea3725-7bd9-4743-8441-359187a22db1-kube-api-access-jm756\") pod \"perf-node-gather-daemonset-gcnjh\" (UID: \"8aea3725-7bd9-4743-8441-359187a22db1\") " pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"
Apr 16 14:30:10.683392 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.683360 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"
Apr 16 14:30:10.843804 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:10.843777 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"]
Apr 16 14:30:10.845933 ip-10-0-129-3 kubenswrapper[2580]: W0416 14:30:10.845903 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8aea3725_7bd9_4743_8441_359187a22db1.slice/crio-e6217c74136c636a28359ceff154fa88183e9aee0972644c8236864ac5fde8b9 WatchSource:0}: Error finding container e6217c74136c636a28359ceff154fa88183e9aee0972644c8236864ac5fde8b9: Status 404 returned error can't find the container with id e6217c74136c636a28359ceff154fa88183e9aee0972644c8236864ac5fde8b9
Apr 16 14:30:11.493397 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:11.493368 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-s7ltv_6da34735-0aa6-4efc-88b2-81738c442f3f/dns/0.log"
Apr 16 14:30:11.516626 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:11.516593 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-s7ltv_6da34735-0aa6-4efc-88b2-81738c442f3f/kube-rbac-proxy/0.log"
Apr 16 14:30:11.702614 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:11.702573 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh" event={"ID":"8aea3725-7bd9-4743-8441-359187a22db1","Type":"ContainerStarted","Data":"48025ffa51322b902e1d45e59166519455a99ee05790b7cbba179edf73896bb6"}
Apr 16 14:30:11.702614 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:11.702616 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh" event={"ID":"8aea3725-7bd9-4743-8441-359187a22db1","Type":"ContainerStarted","Data":"e6217c74136c636a28359ceff154fa88183e9aee0972644c8236864ac5fde8b9"}
Apr 16 14:30:11.702822 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:11.702638 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"
Apr 16 14:30:11.727176 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:11.727116 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh" podStartSLOduration=1.727098724 podStartE2EDuration="1.727098724s" podCreationTimestamp="2026-04-16 14:30:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:30:11.724425274 +0000 UTC m=+1835.160591002" watchObservedRunningTime="2026-04-16 14:30:11.727098724 +0000 UTC m=+1835.163264451"
Apr 16 14:30:11.735441 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:11.735421 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-thzkh_f2a769c2-0080-45a0-983a-5c1bcf200faf/dns-node-resolver/0.log"
Apr 16 14:30:12.180727 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:12.180695 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-54bd994b84-872wz_5ba4c449-3beb-4e89-93c7-614b31fdfa9d/registry/0.log"
Apr 16 14:30:12.188549 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:12.188523 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-54bd994b84-872wz_5ba4c449-3beb-4e89-93c7-614b31fdfa9d/registry/1.log"
Apr 16 14:30:12.261444 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:12.261418 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xwnfr_e2dce2aa-0a8d-4795-a48e-c9cb5cf26cfd/node-ca/0.log"
Apr 16 14:30:13.671260 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:13.671232 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-bnz8h_4f0fe291-cfdb-4a2a-afac-62a29d0fbfa8/serve-healthcheck-canary/0.log"
Apr 16 14:30:14.173629 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:14.173596 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-24gbs_bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38/insights-operator/0.log"
Apr 16 14:30:14.174300 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:14.174282 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-24gbs_bb1e49f0-eb8d-46d6-a5c8-7bfad3071e38/insights-operator/1.log"
Apr 16 14:30:14.361236 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:14.361207 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n7x2g_25e0ee69-7f8f-410c-a4db-ac2b4e7e494b/kube-rbac-proxy/0.log"
Apr 16 14:30:14.390567 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:14.390540 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n7x2g_25e0ee69-7f8f-410c-a4db-ac2b4e7e494b/exporter/0.log"
Apr 16 14:30:14.415796 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:14.415768 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n7x2g_25e0ee69-7f8f-410c-a4db-ac2b4e7e494b/extractor/0.log"
Apr 16 14:30:16.957219 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:16.957189 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5b8748f956-2rcns_4272c25d-1e8a-4e1c-b621-21b6d0d7222e/manager/0.log"
Apr 16 14:30:17.720190 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:17.720166 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-8g7hp/perf-node-gather-daemonset-gcnjh"
Apr 16 14:30:17.906680 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:17.906649 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-wdq45_48fcba8d-e02e-401a-a8c8-04df4abb087a/s3-init/0.log"
Apr 16 14:30:23.028888 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:23.028850 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-wv92n_93bf1779-6f22-4509-a332-64a1d071a5a0/kube-storage-version-migrator-operator/1.log"
Apr 16 14:30:23.030311 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:23.030290 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-wv92n_93bf1779-6f22-4509-a332-64a1d071a5a0/kube-storage-version-migrator-operator/0.log"
Apr 16 14:30:24.249409 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:24.249372 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-86xbr_acddcee2-ab55-4a6b-8b63-9793ffc842d3/kube-multus-additional-cni-plugins/0.log"
Apr 16 14:30:24.276197 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:24.276165 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-86xbr_acddcee2-ab55-4a6b-8b63-9793ffc842d3/egress-router-binary-copy/0.log"
Apr 16 14:30:24.300562 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:24.300538 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-86xbr_acddcee2-ab55-4a6b-8b63-9793ffc842d3/cni-plugins/0.log"
Apr 16 14:30:24.326375 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:24.326350 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-86xbr_acddcee2-ab55-4a6b-8b63-9793ffc842d3/bond-cni-plugin/0.log"
Apr 16 14:30:24.352758 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:24.352719 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-86xbr_acddcee2-ab55-4a6b-8b63-9793ffc842d3/routeoverride-cni/0.log"
Apr 16 14:30:24.380874 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:24.380839 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-86xbr_acddcee2-ab55-4a6b-8b63-9793ffc842d3/whereabouts-cni-bincopy/0.log"
Apr 16 14:30:24.409656 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:24.409627 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-86xbr_acddcee2-ab55-4a6b-8b63-9793ffc842d3/whereabouts-cni/0.log"
Apr 16 14:30:24.636171 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:24.636144 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h7xrd_5fa66a35-a4c8-4e4b-a65c-58bfea71f741/kube-multus/0.log"
Apr 16 14:30:24.753059 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:24.753028 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cptr8_aef30458-23ff-40ab-ad5a-ae58af58ca82/network-metrics-daemon/0.log"
Apr 16 14:30:24.773674 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:24.773642 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cptr8_aef30458-23ff-40ab-ad5a-ae58af58ca82/kube-rbac-proxy/0.log"
Apr 16 14:30:26.312851 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:26.312819 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgf8n_c6aa762b-ffdd-496f-8282-ff45ebe8c26c/ovn-controller/0.log"
Apr 16 14:30:26.341597 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:26.341569 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgf8n_c6aa762b-ffdd-496f-8282-ff45ebe8c26c/ovn-acl-logging/0.log"
Apr 16 14:30:26.370856 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:26.370829 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgf8n_c6aa762b-ffdd-496f-8282-ff45ebe8c26c/kube-rbac-proxy-node/0.log"
Apr 16 14:30:26.395152 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:26.395123 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgf8n_c6aa762b-ffdd-496f-8282-ff45ebe8c26c/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 14:30:26.416940 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:26.416914 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgf8n_c6aa762b-ffdd-496f-8282-ff45ebe8c26c/northd/0.log"
Apr 16 14:30:26.441670 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:26.441647 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgf8n_c6aa762b-ffdd-496f-8282-ff45ebe8c26c/nbdb/0.log"
Apr 16 14:30:26.464941 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:26.464914 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgf8n_c6aa762b-ffdd-496f-8282-ff45ebe8c26c/sbdb/0.log"
Apr 16 14:30:26.583508 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:26.583432 2580
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgf8n_c6aa762b-ffdd-496f-8282-ff45ebe8c26c/ovnkube-controller/0.log" Apr 16 14:30:27.699179 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:27.699148 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-cwrlm_63d5ca74-da50-429d-abfb-de1dfd0f7646/check-endpoints/0.log" Apr 16 14:30:27.801716 ip-10-0-129-3 kubenswrapper[2580]: I0416 14:30:27.801676 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-zg9zc_a4cc786e-e069-4dfc-9be8-98f1a73b9bcb/network-check-target-container/0.log"