Apr 17 16:28:39.410306 ip-10-0-134-142 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 16:28:39.410316 ip-10-0-134-142 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 16:28:39.410323 ip-10-0-134-142 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 16:28:39.410569 ip-10-0-134-142 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 16:28:49.484363 ip-10-0-134-142 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 16:28:49.484378 ip-10-0-134-142 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 5f0c8023448d463ab6988f3a1a7c1948 --
Apr 17 16:31:20.407846 ip-10-0-134-142 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 16:31:20.886813 ip-10-0-134-142 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:20.886813 ip-10-0-134-142 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 16:31:20.886813 ip-10-0-134-142 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:20.886813 ip-10-0-134-142 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 16:31:20.886813 ip-10-0-134-142 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:20.888658 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.888572 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 16:31:20.892367 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892353 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:20.892367 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892367 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:20.892428 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892371 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:20.892428 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892374 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:20.892428 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892377 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:20.892428 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892381 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:20.892428 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892383 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:20.892428 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892386 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:20.892428 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892389 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:20.892428 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892392 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:20.892428 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892395 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:20.892428 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892397 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:20.892428 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892401 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:20.892428 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892403 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:20.892428 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892406 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:20.892428 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892409 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:20.892428 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892411 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:20.892428 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892414 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:20.892428 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892425 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:20.892428 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892429 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:20.892428 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892433 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:20.892428 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892436 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:20.892967 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892439 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:20.892967 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892442 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:20.892967 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892444 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:20.892967 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892447 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:20.892967 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892450 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:20.892967 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892452 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:20.892967 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892455 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:20.892967 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892458 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:20.892967 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892460 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:20.892967 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892463 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:20.892967 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892465 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:20.892967 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892468 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:20.892967 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892470 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:20.892967 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892473 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:20.892967 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892475 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:20.892967 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892478 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:20.892967 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892482 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:20.892967 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892486 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:20.892967 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892490 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:20.893427 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892508 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:20.893427 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892512 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:20.893427 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892515 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:20.893427 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892518 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:20.893427 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892521 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:20.893427 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892523 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:20.893427 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892526 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:20.893427 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892529 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:20.893427 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892531 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:20.893427 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892534 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:20.893427 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892536 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:20.893427 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892539 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:20.893427 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892541 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:20.893427 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892544 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:20.893427 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892547 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:20.893427 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892551 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:20.893427 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892555 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:20.893427 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892557 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:20.893427 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892560 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:20.893427 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892562 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:20.893921 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892565 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:20.893921 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892567 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:20.893921 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892570 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:20.893921 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892572 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:20.893921 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892575 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:20.893921 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892577 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:20.893921 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892580 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:20.893921 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892582 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:20.893921 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892585 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:20.893921 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892588 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:20.893921 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892590 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:20.893921 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892594 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:20.893921 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892596 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:20.893921 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892599 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:20.893921 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892601 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:20.893921 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892604 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:20.893921 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892607 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:20.893921 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892610 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:20.893921 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892612 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:20.893921 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892616 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:20.894395 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892619 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:20.894395 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892621 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:20.894395 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892624 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:20.894395 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892626 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:20.894395 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.892629 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:20.894395 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893000 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:20.894395 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893006 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:20.894395 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893009 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:20.894395 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893012 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:20.894395 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893015 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:20.894395 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893018 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:20.894395 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893020 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:20.894395 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893023 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:20.894395 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893025 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:20.894395 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893028 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:20.894395 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893031 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:20.894395 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893033 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:20.894395 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893036 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:20.894395 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893038 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:20.894395 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893041 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:20.894882 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893044 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:20.894882 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893046 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:20.894882 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893049 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:20.894882 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893052 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:20.894882 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893055 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:20.894882 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893057 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:20.894882 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893060 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:20.894882 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893062 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:20.894882 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893065 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:20.894882 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893067 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:20.894882 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893070 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:20.894882 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893073 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:20.894882 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893075 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:20.894882 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893078 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:20.894882 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893081 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:20.894882 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893083 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:20.894882 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893086 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:20.894882 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893089 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:20.894882 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893092 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:20.894882 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893095 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:20.895415 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893097 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:20.895415 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893100 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:20.895415 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893103 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:20.895415 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893105 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:20.895415 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893108 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:20.895415 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893111 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:20.895415 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893114 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:20.895415 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893117 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:20.895415 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893119 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:20.895415 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893122 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:20.895415 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893125 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:20.895415 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893127 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:20.895415 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893130 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:20.895415 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893132 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:20.895415 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893135 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:20.895415 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893138 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:20.895415 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893141 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:20.895415 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893143 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:20.895415 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893146 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:20.895904 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893149 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:20.895904 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893151 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:20.895904 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893154 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:20.895904 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893156 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:20.895904 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893159 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:20.895904 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893161 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:20.895904 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893164 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:20.895904 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893166 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:20.895904 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893169 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:20.895904 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893171 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:20.895904 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893174 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:20.895904 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893177 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:20.895904 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893179 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:20.895904 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893182 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:20.895904 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893185 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:20.895904 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893187 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:20.895904 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893190 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:20.895904 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893192 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:20.895904 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893195 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:20.895904 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893198 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893201 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893203 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893206 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893208 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893211 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893213 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893216 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893219 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893221 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893224 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893229 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893232 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893318 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893328 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893338 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893344 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893351 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893356 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893360 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893365 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 16:31:20.896416 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893369 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893372 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893375 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893379 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893382 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893385 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893388 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893391 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893394 2576 flags.go:64] FLAG: --cloud-config=""
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893397 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893400 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893405 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893408 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893411 2576 flags.go:64] FLAG: --config-dir=""
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893414 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893418 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893422 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893425 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893430 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893434 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893438 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893441 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893444 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893448 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893451 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 16:31:20.896950 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893456 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893459 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893462 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893465 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893468 2576 flags.go:64] FLAG: --enable-server="true"
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893470 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893475 2576 flags.go:64] FLAG: --event-burst="100"
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893478 2576 flags.go:64] FLAG: --event-qps="50"
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893481 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893485 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893488 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893504 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893508 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893511 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893514 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893517 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893521 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893524 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893527 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893530 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893533 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893536 2576 flags.go:64] FLAG: --feature-gates=""
Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893540 2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 16:31:20.897574 ip-10-0-134-142
kubenswrapper[2576]: I0417 16:31:20.893543 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893546 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 16:31:20.897574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893551 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893554 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893557 2576 flags.go:64] FLAG: --help="false" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893561 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-134-142.ec2.internal" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893564 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893567 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893575 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893579 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893582 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893586 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893589 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893592 2576 flags.go:64] FLAG: 
--kernel-memcg-notification="false" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893595 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893597 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893601 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893604 2576 flags.go:64] FLAG: --kube-reserved="" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893610 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893612 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893615 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893618 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893621 2576 flags.go:64] FLAG: --lock-file="" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893624 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893627 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893630 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 16:31:20.898175 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893640 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893643 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 16:31:20.898775 ip-10-0-134-142 
kubenswrapper[2576]: I0417 16:31:20.893646 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893649 2576 flags.go:64] FLAG: --logging-format="text" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893652 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893656 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893658 2576 flags.go:64] FLAG: --manifest-url="" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893661 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893668 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893671 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893675 2576 flags.go:64] FLAG: --max-pods="110" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893679 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893682 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893684 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893688 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893691 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893694 2576 flags.go:64] 
FLAG: --node-ip="0.0.0.0" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893697 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893705 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893708 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893711 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893715 2576 flags.go:64] FLAG: --pod-cidr="" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893718 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 16:31:20.898775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893724 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893727 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893732 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893735 2576 flags.go:64] FLAG: --port="10250" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893738 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893740 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02de1409b099baa95" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893744 2576 flags.go:64] FLAG: --qos-reserved="" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893747 
2576 flags.go:64] FLAG: --read-only-port="10255" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893750 2576 flags.go:64] FLAG: --register-node="true" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893753 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893755 2576 flags.go:64] FLAG: --register-with-taints="" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893759 2576 flags.go:64] FLAG: --registry-burst="10" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893762 2576 flags.go:64] FLAG: --registry-qps="5" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893765 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893768 2576 flags.go:64] FLAG: --reserved-memory="" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893774 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893777 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893781 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893784 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893787 2576 flags.go:64] FLAG: --runonce="false" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893790 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893793 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 
16:31:20.893796 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893799 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893802 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893806 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 16:31:20.899332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893809 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893812 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893815 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893818 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893821 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893824 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893827 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893830 2576 flags.go:64] FLAG: --system-cgroups="" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893834 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893839 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893842 2576 flags.go:64] FLAG: 
--tls-cert-file="" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893844 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893849 2576 flags.go:64] FLAG: --tls-min-version="" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893851 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893854 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893857 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893860 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893864 2576 flags.go:64] FLAG: --v="2" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893868 2576 flags.go:64] FLAG: --version="false" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893873 2576 flags.go:64] FLAG: --vmodule="" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893877 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.893880 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893975 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893980 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:20.899986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893983 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:20.900589 
ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893985 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:31:20.900589 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893988 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:20.900589 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893991 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:31:20.900589 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893994 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:31:20.900589 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.893998 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:31:20.900589 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894001 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:20.900589 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894004 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:31:20.900589 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894007 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:31:20.900589 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894010 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:31:20.900589 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894012 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:20.900589 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894015 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:31:20.900589 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894018 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:20.900589 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894020 2576 
feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:20.900589 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894023 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:20.900589 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894025 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:31:20.900589 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894030 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:31:20.900589 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894032 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:31:20.900589 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894035 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:20.900589 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894038 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:20.900589 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894040 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:20.901121 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894043 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:31:20.901121 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894045 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:31:20.901121 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894048 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:31:20.901121 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894050 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:31:20.901121 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894053 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:31:20.901121 ip-10-0-134-142 
kubenswrapper[2576]: W0417 16:31:20.894056 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:31:20.901121 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894058 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:31:20.901121 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894061 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:31:20.901121 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894063 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:31:20.901121 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894066 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:31:20.901121 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894070 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:31:20.901121 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894072 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:31:20.901121 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894075 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:20.901121 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894078 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:31:20.901121 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894080 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:20.901121 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894082 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:31:20.901121 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894085 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:31:20.901121 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894087 2576 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNS Apr 17 16:31:20.901121 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894090 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:31:20.901121 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894093 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:31:20.901623 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894096 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:31:20.901623 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894098 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:31:20.901623 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894101 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:20.901623 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894104 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:31:20.901623 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894107 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:31:20.901623 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894111 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:31:20.901623 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894114 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:31:20.901623 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894118 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:31:20.901623 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894121 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:20.901623 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894123 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:20.901623 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894126 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:20.901623 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894128 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:31:20.901623 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894131 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:31:20.901623 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894135 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 16:31:20.901623 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894138 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:31:20.901623 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894141 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:20.901623 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894144 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:31:20.901623 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894147 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:20.901623 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894150 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:20.902105 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894152 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:31:20.902105 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894155 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:20.902105 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894158 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:20.902105 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894161 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:20.902105 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894164 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:20.902105 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894166 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:20.902105 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894169 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:20.902105 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894171 
2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:31:20.902105 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894174 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:20.902105 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894176 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:31:20.902105 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894179 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:31:20.902105 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894182 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:20.902105 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894184 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:31:20.902105 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894187 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:31:20.902105 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894189 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:20.902105 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894192 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:20.902105 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894195 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:31:20.902105 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894198 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:31:20.902105 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894200 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:31:20.902582 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894203 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 
16:31:20.902582 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894207 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:20.902582 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894209 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:20.902582 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894212 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:31:20.902582 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.894214 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:31:20.902582 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.894926 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:31:20.902817 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.902798 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 16:31:20.902849 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.902817 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 16:31:20.902879 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902865 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:31:20.902879 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902871 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:20.902879 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902874 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:20.902879 
ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902878 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:20.902986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902882 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:31:20.902986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902885 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:20.902986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902888 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:31:20.902986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902891 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:31:20.902986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902894 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:31:20.902986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902896 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:31:20.902986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902899 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:31:20.902986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902902 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:31:20.902986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902905 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:31:20.902986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902907 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:31:20.902986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902910 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:31:20.902986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902913 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 
16:31:20.902986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902915 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:31:20.902986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902918 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:20.902986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902921 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:31:20.902986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902924 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:31:20.902986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902927 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:20.902986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902930 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:31:20.902986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902932 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:31:20.902986 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902935 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:31:20.903471 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902937 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:31:20.903471 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902940 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:20.903471 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902943 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:20.903471 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902946 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:31:20.903471 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902948 2576 feature_gate.go:328] 
unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:31:20.903471 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902951 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:31:20.903471 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902953 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:31:20.903471 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902956 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:20.903471 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902958 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:31:20.903471 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902961 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:31:20.903471 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902964 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:20.903471 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902966 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:20.903471 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902969 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:20.903471 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902973 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:20.903471 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902976 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:20.903471 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902978 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:20.903471 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902981 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:31:20.903471 ip-10-0-134-142 kubenswrapper[2576]: W0417 
16:31:20.902984 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:31:20.903471 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902987 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:20.903471 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902990 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:20.903964 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902993 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:20.903964 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902996 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:20.903964 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.902999 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:31:20.903964 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903002 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:20.903964 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903005 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:31:20.903964 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903009 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:31:20.903964 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903013 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:31:20.903964 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903016 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:31:20.903964 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903019 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:31:20.903964 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903022 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:20.903964 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903025 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:31:20.903964 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903027 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:31:20.903964 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903030 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:31:20.903964 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903033 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:31:20.903964 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903035 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:31:20.903964 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903038 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:31:20.903964 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903042 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 16:31:20.903964 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903047 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:31:20.903964 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903050 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:20.904460 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903053 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:31:20.904460 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903056 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:31:20.904460 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903060 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:31:20.904460 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903062 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:31:20.904460 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903065 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:20.904460 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903068 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:20.904460 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903071 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:20.904460 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903073 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:31:20.904460 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903076 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:20.904460 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903079 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:31:20.904460 ip-10-0-134-142 kubenswrapper[2576]: W0417 
16:31:20.903082 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:20.904460 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903084 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:20.904460 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903087 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:31:20.904460 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903089 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:31:20.904460 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903092 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:20.904460 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903095 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:20.904460 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903097 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:31:20.904460 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903100 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:20.904460 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903102 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:31:20.904942 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903105 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:20.904942 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903107 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:31:20.904942 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903110 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:31:20.904942 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903113 2576 feature_gate.go:328] unrecognized feature gate: 
AzureClusterHostedDNSInstall Apr 17 16:31:20.904942 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.903118 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:31:20.904942 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903221 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:31:20.904942 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903225 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:20.904942 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903228 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:31:20.904942 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903230 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:31:20.904942 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903233 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:20.904942 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903236 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:20.904942 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903239 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:31:20.904942 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903241 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:31:20.904942 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903244 2576 
feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:31:20.904942 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903246 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:31:20.905312 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903249 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:20.905312 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903252 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:31:20.905312 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903255 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:31:20.905312 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903258 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:20.905312 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903261 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:20.905312 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903263 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:20.905312 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903266 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:31:20.905312 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903268 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:31:20.905312 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903271 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:20.905312 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903274 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:31:20.905312 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903277 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:20.905312 
ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903279 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:31:20.905312 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903282 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:20.905312 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903284 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:31:20.905312 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903287 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:31:20.905312 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903290 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:31:20.905312 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903292 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:20.905312 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903295 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:20.905312 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903298 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:20.905877 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903300 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:31:20.905877 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903303 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:31:20.905877 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903306 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:20.905877 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903309 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:31:20.905877 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903313 2576 
feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 16:31:20.905877 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903316 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:20.905877 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903319 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:31:20.905877 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903322 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:31:20.905877 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903324 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:20.905877 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903327 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:31:20.905877 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903331 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:20.905877 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903333 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:20.905877 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903336 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:31:20.905877 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903339 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:31:20.905877 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903342 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:31:20.905877 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903345 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:31:20.905877 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903348 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 
16:31:20.905877 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903351 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:31:20.905877 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903353 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:31:20.906363 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903356 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:20.906363 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903359 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:31:20.906363 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903361 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:20.906363 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903364 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:20.906363 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903366 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:31:20.906363 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903368 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:31:20.906363 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903371 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:31:20.906363 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903373 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:31:20.906363 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903376 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:31:20.906363 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903378 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:31:20.906363 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903381 2576 
feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:31:20.906363 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903384 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:20.906363 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903386 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:20.906363 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903389 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:20.906363 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903391 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:20.906363 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903393 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:31:20.906363 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903397 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:31:20.906363 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903400 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:31:20.906363 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903402 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:31:20.906363 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903405 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:31:20.906868 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903407 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:31:20.906868 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903410 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:31:20.906868 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903413 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:31:20.906868 
ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903416 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:20.906868 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903418 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:31:20.906868 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903421 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:20.906868 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903423 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:20.906868 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903426 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:20.906868 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903429 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:31:20.906868 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903431 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:31:20.906868 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903435 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:31:20.906868 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903438 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:20.906868 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903441 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:31:20.906868 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903443 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:31:20.906868 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903446 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:31:20.906868 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903448 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:20.906868 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903451 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:31:20.906868 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:20.903453 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:20.907320 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.903459 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:31:20.907320 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.904205 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 16:31:20.907320 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.906446 2576 bootstrap.go:101] "Use the bootstrap 
credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 16:31:20.907519 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.907507 2576 server.go:1019] "Starting client certificate rotation" Apr 17 16:31:20.907620 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.907603 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 16:31:20.908298 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.908286 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 16:31:20.936025 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.936003 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 16:31:20.944685 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.944655 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 16:31:20.963659 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.963632 2576 log.go:25] "Validated CRI v1 runtime API" Apr 17 16:31:20.964780 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.964761 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 16:31:20.969412 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.969399 2576 log.go:25] "Validated CRI v1 image API" Apr 17 16:31:20.970639 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.970623 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 16:31:20.973282 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.973259 2576 fs.go:135] Filesystem UUIDs: map[072aa211-7c5e-42ea-a168-3003de08ab2c:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 a0bae9cc-0800-456d-b699-e8f2772bd618:/dev/nvme0n1p4] 
Apr 17 16:31:20.973337 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.973283 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 16:31:20.979397 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.979261 2576 manager.go:217] Machine: {Timestamp:2026-04-17 16:31:20.977105889 +0000 UTC m=+0.442288741 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3115676 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec25b6b1fdc1d080523ffcf9602f066d SystemUUID:ec25b6b1-fdc1-d080-523f-fcf9602f066d BootID:5f0c8023-448d-463a-b698-8f3a1a7c1948 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:bf:f2:34:e0:f5 Speed:0 Mtu:9001} 
{Name:ens5 MacAddress:02:bf:f2:34:e0:f5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:d2:10:05:64:4c:43 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 16:31:20.979397 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.979394 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 17 16:31:20.979536 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.979523 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 16:31:20.981361 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.981339 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 16:31:20.981524 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.981364 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-142.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 16:31:20.981567 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.981533 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 16:31:20.981567 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.981542 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 16:31:20.981567 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.981555 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 16:31:20.982847 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.982837 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 16:31:20.984096 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.984085 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:31:20.984202 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.984193 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 16:31:20.987466 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.987452 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 17 16:31:20.987530 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.987470 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 16:31:20.987530 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.987486 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 16:31:20.987530 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.987509 2576 kubelet.go:397] "Adding apiserver pod source" Apr 17 16:31:20.987530 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.987519 2576 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 17 16:31:20.988811 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.988798 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 16:31:20.988848 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.988823 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 16:31:20.989111 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.989093 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-g8ttj" Apr 17 16:31:20.993529 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.993489 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 16:31:20.994932 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.994913 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-g8ttj" Apr 17 16:31:20.995137 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.995122 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 16:31:20.996569 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.996556 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 16:31:20.996623 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.996573 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 16:31:20.996623 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.996580 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 16:31:20.996623 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.996585 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 16:31:20.996623 
ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.996591 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 16:31:20.996623 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.996597 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 16:31:20.996623 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.996603 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 16:31:20.996623 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.996609 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 16:31:20.996623 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.996617 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 16:31:20.996623 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.996622 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 16:31:20.996886 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.996631 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 16:31:20.996886 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.996640 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 16:31:20.998357 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.998342 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 16:31:20.998357 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:20.998352 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 16:31:20.998512 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:20.998475 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 
16:31:20.998548 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:20.998475 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 16:31:21.002210 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.002197 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 16:31:21.002268 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.002245 2576 server.go:1295] "Started kubelet" Apr 17 16:31:21.002334 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.002311 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 16:31:21.002376 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.002337 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 16:31:21.002409 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.002401 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 16:31:21.003124 ip-10-0-134-142 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 16:31:21.003622 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.003572 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 16:31:21.004344 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.004332 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 17 16:31:21.009842 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.009824 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 16:31:21.010450 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.010431 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 16:31:21.011077 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.011056 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 16:31:21.011077 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.011058 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 16:31:21.011235 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.011084 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 16:31:21.011235 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.011206 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 17 16:31:21.011235 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.011214 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 17 16:31:21.011235 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.011234 2576 factory.go:55] Registering systemd factory Apr 17 16:31:21.011397 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.011249 2576 factory.go:223] Registration of the systemd container factory successfully Apr 17 16:31:21.011445 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:21.011415 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-142.ec2.internal\" not found" Apr 17 16:31:21.011603 ip-10-0-134-142 kubenswrapper[2576]: I0417 
16:31:21.011584 2576 factory.go:153] Registering CRI-O factory Apr 17 16:31:21.011603 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.011603 2576 factory.go:223] Registration of the crio container factory successfully Apr 17 16:31:21.011741 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.011652 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 16:31:21.011741 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.011674 2576 factory.go:103] Registering Raw factory Apr 17 16:31:21.011741 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.011690 2576 manager.go:1196] Started watching for new ooms in manager Apr 17 16:31:21.012270 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.012089 2576 manager.go:319] Starting recovery of all containers Apr 17 16:31:21.012664 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.012642 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:21.012911 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:21.012891 2576 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 16:31:21.019919 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:21.019889 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-134-142.ec2.internal\" not found" node="ip-10-0-134-142.ec2.internal" Apr 17 16:31:21.019997 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.019952 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-142.ec2.internal" not found Apr 17 16:31:21.026567 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.026549 2576 manager.go:324] Recovery completed Apr 17 16:31:21.031160 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.031145 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:21.033616 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.033601 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-142.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:21.033691 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.033625 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:21.033691 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.033635 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-142.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:21.034839 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.034328 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 16:31:21.034839 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.034341 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 16:31:21.034839 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.034360 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:31:21.036451 ip-10-0-134-142 
kubenswrapper[2576]: I0417 16:31:21.036433 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-142.ec2.internal" not found Apr 17 16:31:21.037016 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.037006 2576 policy_none.go:49] "None policy: Start" Apr 17 16:31:21.037052 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.037026 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 16:31:21.037052 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.037037 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 17 16:31:21.080079 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.073077 2576 manager.go:341] "Starting Device Plugin manager" Apr 17 16:31:21.080079 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:21.073111 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 16:31:21.080079 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.073123 2576 server.go:85] "Starting device plugin registration server" Apr 17 16:31:21.080079 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.073390 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 16:31:21.080079 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.073403 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 16:31:21.080079 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.073476 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 16:31:21.080079 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.073566 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 16:31:21.080079 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.073573 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 16:31:21.080079 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:21.074194 2576 
eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 16:31:21.080079 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:21.074235 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-142.ec2.internal\" not found" Apr 17 16:31:21.096603 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.096581 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-142.ec2.internal" not found Apr 17 16:31:21.136253 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.136213 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 16:31:21.137558 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.137513 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 16:31:21.137558 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.137536 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 16:31:21.137558 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.137554 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 16:31:21.137558 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.137560 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 16:31:21.137749 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:21.137599 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 16:31:21.140092 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.140073 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:21.174204 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.174164 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:21.175120 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.175104 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-142.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:21.175232 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.175140 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:21.175232 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.175155 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-142.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:21.175232 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.175188 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-142.ec2.internal" Apr 17 16:31:21.184960 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.184936 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-142.ec2.internal" Apr 17 16:31:21.185065 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:21.184963 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-142.ec2.internal\": node \"ip-10-0-134-142.ec2.internal\" not found" Apr 17 
16:31:21.202272 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:21.202250 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-142.ec2.internal\" not found" Apr 17 16:31:21.238575 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.238536 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-142.ec2.internal"] Apr 17 16:31:21.238648 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.238623 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:21.240182 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.240166 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-142.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:21.240261 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.240195 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:21.240261 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.240216 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-142.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:21.241467 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.241455 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:21.241655 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.241641 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal" Apr 17 16:31:21.241720 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.241668 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:21.242239 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.242222 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-142.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:21.242315 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.242246 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:21.242315 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.242267 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-142.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:21.242315 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.242222 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-142.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:21.242454 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.242342 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:21.242454 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.242371 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-142.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:21.243592 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.243577 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-142.ec2.internal" Apr 17 16:31:21.243680 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.243605 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:21.244271 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.244255 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-142.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:21.244371 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.244284 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:21.244371 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.244295 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-142.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:21.265059 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:21.265039 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-142.ec2.internal\" not found" node="ip-10-0-134-142.ec2.internal" Apr 17 16:31:21.268459 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:21.268441 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-142.ec2.internal\" not found" node="ip-10-0-134-142.ec2.internal" Apr 17 16:31:21.303062 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:21.303034 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-142.ec2.internal\" not found" Apr 17 16:31:21.313405 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.313382 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/56aaddce41360941fa905aefab9b7fdd-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal\" (UID: \"56aaddce41360941fa905aefab9b7fdd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal" Apr 17 16:31:21.313531 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.313408 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/56aaddce41360941fa905aefab9b7fdd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal\" (UID: \"56aaddce41360941fa905aefab9b7fdd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal" Apr 17 16:31:21.313531 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.313426 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0ee3f40a240ec9a107cf332537877284-config\") pod \"kube-apiserver-proxy-ip-10-0-134-142.ec2.internal\" (UID: \"0ee3f40a240ec9a107cf332537877284\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-142.ec2.internal" Apr 17 16:31:21.404221 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:21.404142 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-142.ec2.internal\" not found" Apr 17 16:31:21.413819 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.413794 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/56aaddce41360941fa905aefab9b7fdd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal\" (UID: \"56aaddce41360941fa905aefab9b7fdd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal" Apr 17 16:31:21.413917 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.413830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/56aaddce41360941fa905aefab9b7fdd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal\" (UID: \"56aaddce41360941fa905aefab9b7fdd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal"
Apr 17 16:31:21.413917 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.413854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0ee3f40a240ec9a107cf332537877284-config\") pod \"kube-apiserver-proxy-ip-10-0-134-142.ec2.internal\" (UID: \"0ee3f40a240ec9a107cf332537877284\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-142.ec2.internal"
Apr 17 16:31:21.413917 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.413898 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/56aaddce41360941fa905aefab9b7fdd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal\" (UID: \"56aaddce41360941fa905aefab9b7fdd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal"
Apr 17 16:31:21.414049 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.413922 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/56aaddce41360941fa905aefab9b7fdd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal\" (UID: \"56aaddce41360941fa905aefab9b7fdd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal"
Apr 17 16:31:21.414049 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.413908 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0ee3f40a240ec9a107cf332537877284-config\") pod \"kube-apiserver-proxy-ip-10-0-134-142.ec2.internal\" (UID: \"0ee3f40a240ec9a107cf332537877284\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-142.ec2.internal"
Apr 17 16:31:21.504447 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:21.504421 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-142.ec2.internal\" not found"
Apr 17 16:31:21.567909 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.567879 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal"
Apr 17 16:31:21.571571 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.571551 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-142.ec2.internal"
Apr 17 16:31:21.605487 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:21.605459 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-142.ec2.internal\" not found"
Apr 17 16:31:21.706052 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:21.705969 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-142.ec2.internal\" not found"
Apr 17 16:31:21.806474 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:21.806443 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-142.ec2.internal\" not found"
Apr 17 16:31:21.906877 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:21.906850 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-142.ec2.internal\" not found"
Apr 17 16:31:21.907954 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.907936 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 16:31:21.908111 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.908095 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 16:31:21.908111 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.908107 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 16:31:21.996965 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.996861 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 16:26:20 +0000 UTC" deadline="2027-09-29 07:25:59.980679528 +0000 UTC"
Apr 17 16:31:21.996965 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:21.996914 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12710h54m37.983768276s"
Apr 17 16:31:22.006963 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:22.006937 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-142.ec2.internal\" not found"
Apr 17 16:31:22.010086 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:22.010073 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 16:31:22.031903 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:22.031878 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 16:31:22.051623 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:22.051595 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-h5wkc"
Apr 17 16:31:22.052011 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:22.051996 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:31:22.059880 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:22.059862 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-h5wkc"
Apr 17 16:31:22.107439 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:22.107406 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-142.ec2.internal\" not found"
Apr 17 16:31:22.198994 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:22.198952 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ee3f40a240ec9a107cf332537877284.slice/crio-0c52f4b77f4014ad564156b8ff441bf007ab36f790836583da9e98c7e56d5870 WatchSource:0}: Error finding container 0c52f4b77f4014ad564156b8ff441bf007ab36f790836583da9e98c7e56d5870: Status 404 returned error can't find the container with id 0c52f4b77f4014ad564156b8ff441bf007ab36f790836583da9e98c7e56d5870
Apr 17 16:31:22.199436 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:22.199414 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56aaddce41360941fa905aefab9b7fdd.slice/crio-74d6fdb7582e1843753189a4641aa9677e9b4e3ddae61356a3b5f1385fffa795 WatchSource:0}: Error finding container 74d6fdb7582e1843753189a4641aa9677e9b4e3ddae61356a3b5f1385fffa795: Status 404 returned error can't find the container with id 74d6fdb7582e1843753189a4641aa9677e9b4e3ddae61356a3b5f1385fffa795
Apr 17 16:31:22.205335 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:22.205320 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 16:31:22.207844 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:22.207764 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-142.ec2.internal\" not found"
Apr 17 16:31:22.308356 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:22.308254 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-142.ec2.internal\" not found"
Apr 17 16:31:22.408765 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:22.408730 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-142.ec2.internal\" not found"
Apr 17 16:31:22.495278 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:22.495247 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:31:22.511173 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:22.511154 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal"
Apr 17 16:31:22.526282 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:22.526258 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 16:31:22.527282 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:22.527270 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-142.ec2.internal"
Apr 17 16:31:22.536656 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:22.536637 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 16:31:22.800890 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:22.800800 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:31:22.989170 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:22.989141 2576 apiserver.go:52] "Watching apiserver"
Apr 17 16:31:22.996621 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:22.996594 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 16:31:22.996997 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:22.996972 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2","openshift-cluster-node-tuning-operator/tuned-lnmnp","openshift-dns/node-resolver-2wn98","openshift-multus/multus-additional-cni-plugins-jkkg4","openshift-multus/network-metrics-daemon-jn9vc","openshift-network-operator/iptables-alerter-92jvc","openshift-image-registry/node-ca-8mrzt","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal","openshift-multus/multus-52zsq","openshift-network-diagnostics/network-check-target-w2dvt","openshift-ovn-kubernetes/ovnkube-node-twl5m","kube-system/konnectivity-agent-rqqd6","kube-system/kube-apiserver-proxy-ip-10-0-134-142.ec2.internal"]
Apr 17 16:31:22.999959 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:22.999937 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8mrzt"
Apr 17 16:31:23.000064 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:22.999966 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.001001 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.000984 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2wn98"
Apr 17 16:31:23.002270 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.002249 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jkkg4"
Apr 17 16:31:23.003595 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.003566 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9vc"
Apr 17 16:31:23.003682 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:23.003658 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9vc" podUID="381620fe-22c1-43ed-99de-f61c1b4886ca"
Apr 17 16:31:23.003864 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.003835 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 16:31:23.003864 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.003859 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 16:31:23.004004 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.003838 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 16:31:23.004004 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.003978 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-6pmsf\""
Apr 17 16:31:23.004084 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.003984 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-llzzq\""
Apr 17 16:31:23.004209 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.004194 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8lg5q\""
Apr 17 16:31:23.004282 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.004224 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 16:31:23.005170 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.004906 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 16:31:23.005170 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.004910 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:31:23.005170 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.005014 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 16:31:23.005170 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.005041 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 16:31:23.005170 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.005062 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zwp8b\""
Apr 17 16:31:23.005170 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.005153 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 16:31:23.005447 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.005180 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 16:31:23.005447 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.005396 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 16:31:23.005447 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.005429 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 16:31:23.006576 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.006452 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-92jvc"
Apr 17 16:31:23.007622 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.007607 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2"
Apr 17 16:31:23.008820 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.008726 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 16:31:23.008820 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.008755 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-csnw6\""
Apr 17 16:31:23.008820 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.008773 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 16:31:23.009021 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.008843 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.009021 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.008726 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:31:23.009604 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.009587 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 16:31:23.009694 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.009677 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 16:31:23.009983 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.009968 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 16:31:23.010046 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.010022 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-lv9n7\""
Apr 17 16:31:23.010141 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.010122 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2dvt"
Apr 17 16:31:23.010218 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:23.010197 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2dvt" podUID="a93503d7-3f06-47f2-95b0-7c4d48999d28"
Apr 17 16:31:23.010974 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.010420 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.011146 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.011121 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 16:31:23.012163 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.012117 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rqqd6"
Apr 17 16:31:23.013463 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.012286 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-d5k84\""
Apr 17 16:31:23.013463 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.012899 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 16:31:23.013463 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.013166 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 16:31:23.014193 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.014174 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 16:31:23.014910 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.014892 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 16:31:23.014910 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.014900 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-whnrn\""
Apr 17 16:31:23.015145 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.015115 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 16:31:23.015228 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.015174 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 16:31:23.015228 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.015188 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 16:31:23.015228 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.015176 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 16:31:23.015379 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.015300 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7kkll\""
Apr 17 16:31:23.015379 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.015333 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 16:31:23.023325 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023304 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-host-run-netns\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.023424 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023340 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.023424 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023365 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f000c2d-d989-4f8a-8182-b5104c3a3de6-serviceca\") pod \"node-ca-8mrzt\" (UID: \"4f000c2d-d989-4f8a-8182-b5104c3a3de6\") " pod="openshift-image-registry/node-ca-8mrzt"
Apr 17 16:31:23.023424 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-lib-modules\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.023424 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023411 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-var-lib-kubelet\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.023619 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023451 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-tuned\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.023619 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023526 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4pnq\" (UniqueName: \"kubernetes.io/projected/381620fe-22c1-43ed-99de-f61c1b4886ca-kube-api-access-g4pnq\") pod \"network-metrics-daemon-jn9vc\" (UID: \"381620fe-22c1-43ed-99de-f61c1b4886ca\") " pod="openshift-multus/network-metrics-daemon-jn9vc"
Apr 17 16:31:23.023619 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023553 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/72324db7-6720-4fe7-a0fc-23dee216601e-cnibin\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4"
Apr 17 16:31:23.023619 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023578 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/72324db7-6720-4fe7-a0fc-23dee216601e-cni-binary-copy\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4"
Apr 17 16:31:23.023619 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023601 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7ckd\" (UniqueName: \"kubernetes.io/projected/72324db7-6720-4fe7-a0fc-23dee216601e-kube-api-access-m7ckd\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4"
Apr 17 16:31:23.023804 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023631 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b6fd1c6d-2527-488b-ba57-022c924d53df-registration-dir\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: \"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2"
Apr 17 16:31:23.023804 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023673 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b6fd1c6d-2527-488b-ba57-022c924d53df-etc-selinux\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: \"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2"
Apr 17 16:31:23.023804 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023699 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-hostroot\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.023804 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023714 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-run-systemd\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.023804 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023745 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-etc-openvswitch\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.023804 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023770 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-run-ovn\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.023804 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023796 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-log-socket\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.024067 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023813 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-run\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.024067 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023830 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsq5w\" (UniqueName: \"kubernetes.io/projected/804a35f0-09a6-45b7-b583-0d7f038e28f8-kube-api-access-lsq5w\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.024067 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023857 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-kubelet\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.024067 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023877 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21d6e76d-ca83-45a0-b330-7eced4280d5f-ovnkube-config\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.024067 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023895 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d27c\" (UniqueName: \"kubernetes.io/projected/b6fd1c6d-2527-488b-ba57-022c924d53df-kube-api-access-8d27c\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: \"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2"
Apr 17 16:31:23.024067 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023910 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72324db7-6720-4fe7-a0fc-23dee216601e-system-cni-dir\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4"
Apr 17 16:31:23.024067 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023931 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/28b3abb3-0913-43c2-a633-54a17d3f88b8-iptables-alerter-script\") pod \"iptables-alerter-92jvc\" (UID: \"28b3abb3-0913-43c2-a633-54a17d3f88b8\") " pod="openshift-network-operator/iptables-alerter-92jvc"
Apr 17 16:31:23.024067 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023953 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-node-log\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.024067 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023970 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zcjr\" (UniqueName: \"kubernetes.io/projected/4f000c2d-d989-4f8a-8182-b5104c3a3de6-kube-api-access-2zcjr\") pod \"node-ca-8mrzt\" (UID: \"4f000c2d-d989-4f8a-8182-b5104c3a3de6\") " pod="openshift-image-registry/node-ca-8mrzt"
Apr 17 16:31:23.024067 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.023985 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-host-var-lib-cni-multus\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.024067 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/03ed97e0-9a98-4b06-9c36-752ba631bdc9-hosts-file\") pod \"node-resolver-2wn98\" (UID: \"03ed97e0-9a98-4b06-9c36-752ba631bdc9\") " pod="openshift-dns/node-resolver-2wn98"
Apr 17 16:31:23.024067 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024042 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/28b3abb3-0913-43c2-a633-54a17d3f88b8-host-slash\") pod \"iptables-alerter-92jvc\" (UID: \"28b3abb3-0913-43c2-a633-54a17d3f88b8\") " pod="openshift-network-operator/iptables-alerter-92jvc"
Apr 17 16:31:23.024067 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024070 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bd3229a9-ef02-4e0e-aa80-b65af9672fee-multus-daemon-config\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.024552 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024092 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-cni-bin\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.024552 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024106 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f000c2d-d989-4f8a-8182-b5104c3a3de6-host\") pod \"node-ca-8mrzt\" (UID: \"4f000c2d-d989-4f8a-8182-b5104c3a3de6\") " pod="openshift-image-registry/node-ca-8mrzt"
Apr 17 16:31:23.024552 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024121 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6svm9\" (UniqueName: \"kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9\") pod \"network-check-target-w2dvt\" (UID: \"a93503d7-3f06-47f2-95b0-7c4d48999d28\") " pod="openshift-network-diagnostics/network-check-target-w2dvt"
Apr 17 16:31:23.024552 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024137 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-host-var-lib-cni-bin\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.024552 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024159 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-systemd-units\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.024552 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024201 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-cni-netd\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.024552 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024237 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-sys\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.024552 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6fd1c6d-2527-488b-ba57-022c924d53df-kubelet-dir\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: \"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2"
Apr 17 16:31:23.024552 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024287 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-system-cni-dir\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.024552 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024312 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd3229a9-ef02-4e0e-aa80-b65af9672fee-cni-binary-copy\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.024552 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024341 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-host-run-k8s-cni-cncf-io\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.024552 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024377 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-var-lib-openvswitch\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.024552 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024402 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-run-openvswitch\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.024552 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024424 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName:
\"kubernetes.io/configmap/21d6e76d-ca83-45a0-b330-7eced4280d5f-ovnkube-script-lib\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.024552 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024444 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-etc-kubernetes\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq" Apr 17 16:31:23.024552 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024460 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/804a35f0-09a6-45b7-b583-0d7f038e28f8-tmp\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp" Apr 17 16:31:23.024552 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024486 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b6fd1c6d-2527-488b-ba57-022c924d53df-socket-dir\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: \"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2" Apr 17 16:31:23.025218 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024538 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/72324db7-6720-4fe7-a0fc-23dee216601e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4" Apr 17 16:31:23.025218 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024559 
2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-multus-cni-dir\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq" Apr 17 16:31:23.025218 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024574 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-multus-socket-dir-parent\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq" Apr 17 16:31:23.025218 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024588 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-host-var-lib-kubelet\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq" Apr 17 16:31:23.025218 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024610 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-multus-conf-dir\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq" Apr 17 16:31:23.025218 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024636 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmrtj\" (UniqueName: \"kubernetes.io/projected/bd3229a9-ef02-4e0e-aa80-b65af9672fee-kube-api-access-zmrtj\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq" Apr 17 16:31:23.025218 
ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024652 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-sysconfig\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp" Apr 17 16:31:23.025218 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024665 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-sysctl-d\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp" Apr 17 16:31:23.025218 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024689 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-sysctl-conf\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp" Apr 17 16:31:23.025218 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024722 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-host\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp" Apr 17 16:31:23.025218 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024752 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b6fd1c6d-2527-488b-ba57-022c924d53df-sys-fs\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: 
\"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2" Apr 17 16:31:23.025218 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024792 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/72324db7-6720-4fe7-a0fc-23dee216601e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4" Apr 17 16:31:23.025218 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024822 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21d6e76d-ca83-45a0-b330-7eced4280d5f-env-overrides\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.025218 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024849 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w9zg\" (UniqueName: \"kubernetes.io/projected/21d6e76d-ca83-45a0-b330-7eced4280d5f-kube-api-access-2w9zg\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.025218 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024877 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-cnibin\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq" Apr 17 16:31:23.025218 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21d6e76d-ca83-45a0-b330-7eced4280d5f-ovn-node-metrics-cert\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.025752 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024934 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-kubernetes\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp" Apr 17 16:31:23.025752 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024957 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p87j\" (UniqueName: \"kubernetes.io/projected/03ed97e0-9a98-4b06-9c36-752ba631bdc9-kube-api-access-2p87j\") pod \"node-resolver-2wn98\" (UID: \"03ed97e0-9a98-4b06-9c36-752ba631bdc9\") " pod="openshift-dns/node-resolver-2wn98" Apr 17 16:31:23.025752 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024975 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-host-run-multus-certs\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq" Apr 17 16:31:23.025752 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.024999 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-slash\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.025752 ip-10-0-134-142 
kubenswrapper[2576]: I0417 16:31:23.025025 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/72c7f3b9-a592-4131-9cee-a95f18f9eaa3-konnectivity-ca\") pod \"konnectivity-agent-rqqd6\" (UID: \"72c7f3b9-a592-4131-9cee-a95f18f9eaa3\") " pod="kube-system/konnectivity-agent-rqqd6" Apr 17 16:31:23.025752 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.025063 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-systemd\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp" Apr 17 16:31:23.025752 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.025089 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs\") pod \"network-metrics-daemon-jn9vc\" (UID: \"381620fe-22c1-43ed-99de-f61c1b4886ca\") " pod="openshift-multus/network-metrics-daemon-jn9vc" Apr 17 16:31:23.025752 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.025114 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/72324db7-6720-4fe7-a0fc-23dee216601e-os-release\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4" Apr 17 16:31:23.025752 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.025138 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/72324db7-6720-4fe7-a0fc-23dee216601e-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4" Apr 17 16:31:23.025752 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.025163 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-run-netns\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.025752 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.025184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/72c7f3b9-a592-4131-9cee-a95f18f9eaa3-agent-certs\") pod \"konnectivity-agent-rqqd6\" (UID: \"72c7f3b9-a592-4131-9cee-a95f18f9eaa3\") " pod="kube-system/konnectivity-agent-rqqd6" Apr 17 16:31:23.025752 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.025206 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/03ed97e0-9a98-4b06-9c36-752ba631bdc9-tmp-dir\") pod \"node-resolver-2wn98\" (UID: \"03ed97e0-9a98-4b06-9c36-752ba631bdc9\") " pod="openshift-dns/node-resolver-2wn98" Apr 17 16:31:23.025752 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.025230 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b6fd1c6d-2527-488b-ba57-022c924d53df-device-dir\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: \"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2" Apr 17 16:31:23.025752 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.025254 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-57vxj\" (UniqueName: \"kubernetes.io/projected/28b3abb3-0913-43c2-a633-54a17d3f88b8-kube-api-access-57vxj\") pod \"iptables-alerter-92jvc\" (UID: \"28b3abb3-0913-43c2-a633-54a17d3f88b8\") " pod="openshift-network-operator/iptables-alerter-92jvc" Apr 17 16:31:23.025752 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.025278 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-run-ovn-kubernetes\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.025752 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.025316 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-modprobe-d\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp" Apr 17 16:31:23.026184 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.025337 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-os-release\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq" Apr 17 16:31:23.062023 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.061995 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:26:22 +0000 UTC" deadline="2028-01-27 09:50:44.713048702 +0000 UTC" Apr 17 16:31:23.062023 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.062019 2576 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="15593h19m21.651031488s" Apr 17 16:31:23.126395 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.126362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-host-var-lib-cni-multus\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq" Apr 17 16:31:23.126395 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.126406 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/03ed97e0-9a98-4b06-9c36-752ba631bdc9-hosts-file\") pod \"node-resolver-2wn98\" (UID: \"03ed97e0-9a98-4b06-9c36-752ba631bdc9\") " pod="openshift-dns/node-resolver-2wn98" Apr 17 16:31:23.126656 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.126430 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/28b3abb3-0913-43c2-a633-54a17d3f88b8-host-slash\") pod \"iptables-alerter-92jvc\" (UID: \"28b3abb3-0913-43c2-a633-54a17d3f88b8\") " pod="openshift-network-operator/iptables-alerter-92jvc" Apr 17 16:31:23.126656 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.126472 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bd3229a9-ef02-4e0e-aa80-b65af9672fee-multus-daemon-config\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq" Apr 17 16:31:23.126656 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.126466 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-host-var-lib-cni-multus\") pod \"multus-52zsq\" (UID: 
\"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq" Apr 17 16:31:23.126656 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.126513 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-cni-bin\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.126656 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.126584 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f000c2d-d989-4f8a-8182-b5104c3a3de6-host\") pod \"node-ca-8mrzt\" (UID: \"4f000c2d-d989-4f8a-8182-b5104c3a3de6\") " pod="openshift-image-registry/node-ca-8mrzt" Apr 17 16:31:23.126656 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.126624 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6svm9\" (UniqueName: \"kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9\") pod \"network-check-target-w2dvt\" (UID: \"a93503d7-3f06-47f2-95b0-7c4d48999d28\") " pod="openshift-network-diagnostics/network-check-target-w2dvt" Apr 17 16:31:23.126656 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.126652 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-host-var-lib-cni-bin\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq" Apr 17 16:31:23.127053 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.126677 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-systemd-units\") pod \"ovnkube-node-twl5m\" (UID: 
\"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.127053 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.126680 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f000c2d-d989-4f8a-8182-b5104c3a3de6-host\") pod \"node-ca-8mrzt\" (UID: \"4f000c2d-d989-4f8a-8182-b5104c3a3de6\") " pod="openshift-image-registry/node-ca-8mrzt" Apr 17 16:31:23.127053 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.126704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-cni-netd\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.127053 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.126728 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-sys\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp" Apr 17 16:31:23.127053 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.126749 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-host-var-lib-cni-bin\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq" Apr 17 16:31:23.127053 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.126792 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-systemd-units\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.127053 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.126819 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6fd1c6d-2527-488b-ba57-022c924d53df-kubelet-dir\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: \"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2" Apr 17 16:31:23.127053 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.126877 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-cni-netd\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.127053 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.126753 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6fd1c6d-2527-488b-ba57-022c924d53df-kubelet-dir\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: \"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2" Apr 17 16:31:23.127053 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.126977 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-system-cni-dir\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq" Apr 17 16:31:23.127053 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.127004 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd3229a9-ef02-4e0e-aa80-b65af9672fee-cni-binary-copy\") pod \"multus-52zsq\" (UID: 
\"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq" Apr 17 16:31:23.127053 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.126525 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/03ed97e0-9a98-4b06-9c36-752ba631bdc9-hosts-file\") pod \"node-resolver-2wn98\" (UID: \"03ed97e0-9a98-4b06-9c36-752ba631bdc9\") " pod="openshift-dns/node-resolver-2wn98" Apr 17 16:31:23.127053 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.127030 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-host-run-k8s-cni-cncf-io\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq" Apr 17 16:31:23.127053 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.127052 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-cni-bin\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.127669 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.127059 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-var-lib-openvswitch\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.127669 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.127111 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-host-run-k8s-cni-cncf-io\") pod \"multus-52zsq\" (UID: 
\"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.127669 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.127113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-run-openvswitch\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.127669 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.127144 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-run-openvswitch\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.127669 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.127151 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-system-cni-dir\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.127669 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.127172 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-var-lib-openvswitch\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.127669 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.127175 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/28b3abb3-0913-43c2-a633-54a17d3f88b8-host-slash\") pod \"iptables-alerter-92jvc\" (UID: \"28b3abb3-0913-43c2-a633-54a17d3f88b8\") " pod="openshift-network-operator/iptables-alerter-92jvc"
Apr 17 16:31:23.127669 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.127296 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21d6e76d-ca83-45a0-b330-7eced4280d5f-ovnkube-script-lib\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.127669 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.127324 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-etc-kubernetes\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.127669 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.127348 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/804a35f0-09a6-45b7-b583-0d7f038e28f8-tmp\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.127669 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.127374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b6fd1c6d-2527-488b-ba57-022c924d53df-socket-dir\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: \"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2"
Apr 17 16:31:23.127669 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.127401 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/72324db7-6720-4fe7-a0fc-23dee216601e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4"
Apr 17 16:31:23.127669 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.127432 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-etc-kubernetes\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.127669 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.127481 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-sys\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.128225 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.128112 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b6fd1c6d-2527-488b-ba57-022c924d53df-socket-dir\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: \"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2"
Apr 17 16:31:23.128640 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.128296 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 16:31:23.128640 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.128359 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-multus-cni-dir\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.128878 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.128792 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bd3229a9-ef02-4e0e-aa80-b65af9672fee-multus-daemon-config\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.128878 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.128828 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd3229a9-ef02-4e0e-aa80-b65af9672fee-cni-binary-copy\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.129006 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.128876 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21d6e76d-ca83-45a0-b330-7eced4280d5f-ovnkube-script-lib\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.129073 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129016 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-multus-socket-dir-parent\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.129073 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129026 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-multus-cni-dir\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.129073 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129050 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-host-var-lib-kubelet\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.129214 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129083 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-multus-conf-dir\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.129214 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129106 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-multus-socket-dir-parent\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.129214 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmrtj\" (UniqueName: \"kubernetes.io/projected/bd3229a9-ef02-4e0e-aa80-b65af9672fee-kube-api-access-zmrtj\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.129214 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129149 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-sysconfig\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.129214 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129172 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-host-var-lib-kubelet\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.129214 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129207 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-sysconfig\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.129462 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129243 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-multus-conf-dir\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.129462 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129320 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-sysctl-d\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.129462 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129356 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-sysctl-conf\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.129462 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129373 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/72324db7-6720-4fe7-a0fc-23dee216601e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4"
Apr 17 16:31:23.129462 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129385 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-host\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.129462 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129413 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b6fd1c6d-2527-488b-ba57-022c924d53df-sys-fs\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: \"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2"
Apr 17 16:31:23.129462 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/72324db7-6720-4fe7-a0fc-23dee216601e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4"
Apr 17 16:31:23.129815 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129479 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21d6e76d-ca83-45a0-b330-7eced4280d5f-env-overrides\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.129815 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2w9zg\" (UniqueName: \"kubernetes.io/projected/21d6e76d-ca83-45a0-b330-7eced4280d5f-kube-api-access-2w9zg\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.129815 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129539 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-sysctl-conf\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.129815 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129557 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-cnibin\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.129815 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129541 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b6fd1c6d-2527-488b-ba57-022c924d53df-sys-fs\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: \"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2"
Apr 17 16:31:23.129815 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129583 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21d6e76d-ca83-45a0-b330-7eced4280d5f-ovn-node-metrics-cert\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.129815 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-kubernetes\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.129815 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2p87j\" (UniqueName: \"kubernetes.io/projected/03ed97e0-9a98-4b06-9c36-752ba631bdc9-kube-api-access-2p87j\") pod \"node-resolver-2wn98\" (UID: \"03ed97e0-9a98-4b06-9c36-752ba631bdc9\") " pod="openshift-dns/node-resolver-2wn98"
Apr 17 16:31:23.129815 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129672 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-host-run-multus-certs\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.129815 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129701 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-slash\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.129815 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129716 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-sysctl-d\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.129815 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/72c7f3b9-a592-4131-9cee-a95f18f9eaa3-konnectivity-ca\") pod \"konnectivity-agent-rqqd6\" (UID: \"72c7f3b9-a592-4131-9cee-a95f18f9eaa3\") " pod="kube-system/konnectivity-agent-rqqd6"
Apr 17 16:31:23.130294 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.129845 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-cnibin\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.130294 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.130239 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21d6e76d-ca83-45a0-b330-7eced4280d5f-env-overrides\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.130294 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.130272 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/72c7f3b9-a592-4131-9cee-a95f18f9eaa3-konnectivity-ca\") pod \"konnectivity-agent-rqqd6\" (UID: \"72c7f3b9-a592-4131-9cee-a95f18f9eaa3\") " pod="kube-system/konnectivity-agent-rqqd6"
Apr 17 16:31:23.130420 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.130355 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-kubernetes\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.130420 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.130409 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-slash\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.130522 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.130482 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-host-run-multus-certs\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.130577 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.130560 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-systemd\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.130624 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.130598 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs\") pod \"network-metrics-daemon-jn9vc\" (UID: \"381620fe-22c1-43ed-99de-f61c1b4886ca\") " pod="openshift-multus/network-metrics-daemon-jn9vc"
Apr 17 16:31:23.130669 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.130632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/72324db7-6720-4fe7-a0fc-23dee216601e-os-release\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4"
Apr 17 16:31:23.130669 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.130662 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/72324db7-6720-4fe7-a0fc-23dee216601e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4"
Apr 17 16:31:23.130754 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.130688 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/72324db7-6720-4fe7-a0fc-23dee216601e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4"
Apr 17 16:31:23.130754 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.130694 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-run-netns\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.130754 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.130726 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/72c7f3b9-a592-4131-9cee-a95f18f9eaa3-agent-certs\") pod \"konnectivity-agent-rqqd6\" (UID: \"72c7f3b9-a592-4131-9cee-a95f18f9eaa3\") " pod="kube-system/konnectivity-agent-rqqd6"
Apr 17 16:31:23.130876 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.130758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/03ed97e0-9a98-4b06-9c36-752ba631bdc9-tmp-dir\") pod \"node-resolver-2wn98\" (UID: \"03ed97e0-9a98-4b06-9c36-752ba631bdc9\") " pod="openshift-dns/node-resolver-2wn98"
Apr 17 16:31:23.130876 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.130788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b6fd1c6d-2527-488b-ba57-022c924d53df-device-dir\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: \"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2"
Apr 17 16:31:23.130876 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.130814 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57vxj\" (UniqueName: \"kubernetes.io/projected/28b3abb3-0913-43c2-a633-54a17d3f88b8-kube-api-access-57vxj\") pod \"iptables-alerter-92jvc\" (UID: \"28b3abb3-0913-43c2-a633-54a17d3f88b8\") " pod="openshift-network-operator/iptables-alerter-92jvc"
Apr 17 16:31:23.130876 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.130852 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-run-ovn-kubernetes\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.131044 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.130886 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-modprobe-d\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.131044 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.130918 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-os-release\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.131044 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.130951 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-host-run-netns\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.131044 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.130986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.131044 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f000c2d-d989-4f8a-8182-b5104c3a3de6-serviceca\") pod \"node-ca-8mrzt\" (UID: \"4f000c2d-d989-4f8a-8182-b5104c3a3de6\") " pod="openshift-image-registry/node-ca-8mrzt"
Apr 17 16:31:23.131258 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-lib-modules\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.131258 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131076 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-var-lib-kubelet\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.131258 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131108 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-tuned\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.131258 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131156 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4pnq\" (UniqueName: \"kubernetes.io/projected/381620fe-22c1-43ed-99de-f61c1b4886ca-kube-api-access-g4pnq\") pod \"network-metrics-daemon-jn9vc\" (UID: \"381620fe-22c1-43ed-99de-f61c1b4886ca\") " pod="openshift-multus/network-metrics-daemon-jn9vc"
Apr 17 16:31:23.131258 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131191 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/72324db7-6720-4fe7-a0fc-23dee216601e-cnibin\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4"
Apr 17 16:31:23.131258 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/72324db7-6720-4fe7-a0fc-23dee216601e-cni-binary-copy\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4"
Apr 17 16:31:23.131258 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131251 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7ckd\" (UniqueName: \"kubernetes.io/projected/72324db7-6720-4fe7-a0fc-23dee216601e-kube-api-access-m7ckd\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4"
Apr 17 16:31:23.131579 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131276 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-os-release\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.131579 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131286 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b6fd1c6d-2527-488b-ba57-022c924d53df-registration-dir\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: \"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2"
Apr 17 16:31:23.131579 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131350 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b6fd1c6d-2527-488b-ba57-022c924d53df-registration-dir\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: \"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2"
Apr 17 16:31:23.131579 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131355 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-run-ovn-kubernetes\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.131579 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-modprobe-d\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.131579 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131540 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/72324db7-6720-4fe7-a0fc-23dee216601e-os-release\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4"
Apr 17 16:31:23.131868 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131595 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-systemd\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.131868 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131644 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/03ed97e0-9a98-4b06-9c36-752ba631bdc9-tmp-dir\") pod \"node-resolver-2wn98\" (UID: \"03ed97e0-9a98-4b06-9c36-752ba631bdc9\") " pod="openshift-dns/node-resolver-2wn98"
Apr 17 16:31:23.131868 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:23.131699 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:23.131868 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131724 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b6fd1c6d-2527-488b-ba57-022c924d53df-etc-selinux\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: \"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2"
Apr 17 16:31:23.131868 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131760 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-hostroot\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.131868 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:23.131793 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs podName:381620fe-22c1-43ed-99de-f61c1b4886ca nodeName:}" failed. No retries permitted until 2026-04-17 16:31:23.631755149 +0000 UTC m=+3.096938005 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs") pod "network-metrics-daemon-jn9vc" (UID: "381620fe-22c1-43ed-99de-f61c1b4886ca") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:23.131868 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131796 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-hostroot\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.131868 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-run-systemd\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.131868 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131861 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-etc-openvswitch\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.132263 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131892 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-run-ovn\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.132263 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131924 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-log-socket\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.132263 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-run\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.132263 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.131982 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lsq5w\" (UniqueName: \"kubernetes.io/projected/804a35f0-09a6-45b7-b583-0d7f038e28f8-kube-api-access-lsq5w\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.132263 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.132012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-kubelet\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.132263 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.132043 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21d6e76d-ca83-45a0-b330-7eced4280d5f-ovnkube-config\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:23.132263 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.132078 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8d27c\" (UniqueName: \"kubernetes.io/projected/b6fd1c6d-2527-488b-ba57-022c924d53df-kube-api-access-8d27c\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: \"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2"
Apr 17 16:31:23.132263 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.132095 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/72324db7-6720-4fe7-a0fc-23dee216601e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4"
Apr 17 16:31:23.132263 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.132098 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/804a35f0-09a6-45b7-b583-0d7f038e28f8-tmp\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp"
Apr 17 16:31:23.132263 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.132111 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72324db7-6720-4fe7-a0fc-23dee216601e-system-cni-dir\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4"
Apr 17 16:31:23.132263 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.132161 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bd3229a9-ef02-4e0e-aa80-b65af9672fee-host-run-netns\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq"
Apr 17 16:31:23.132263 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.132161 2576
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-run-netns\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.132263 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.132167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/28b3abb3-0913-43c2-a633-54a17d3f88b8-iptables-alerter-script\") pod \"iptables-alerter-92jvc\" (UID: \"28b3abb3-0913-43c2-a633-54a17d3f88b8\") " pod="openshift-network-operator/iptables-alerter-92jvc" Apr 17 16:31:23.132263 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.132207 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-node-log\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.132263 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.132223 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b6fd1c6d-2527-488b-ba57-022c924d53df-etc-selinux\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: \"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2" Apr 17 16:31:23.132263 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.132242 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zcjr\" (UniqueName: \"kubernetes.io/projected/4f000c2d-d989-4f8a-8182-b5104c3a3de6-kube-api-access-2zcjr\") pod \"node-ca-8mrzt\" (UID: \"4f000c2d-d989-4f8a-8182-b5104c3a3de6\") " pod="openshift-image-registry/node-ca-8mrzt" Apr 17 16:31:23.132263 ip-10-0-134-142 
kubenswrapper[2576]: I0417 16:31:23.132270 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.132975 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.132343 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-run-ovn\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.132975 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.132444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21d6e76d-ca83-45a0-b330-7eced4280d5f-ovn-node-metrics-cert\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.132975 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.132552 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-host-kubelet\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.132975 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.132574 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f000c2d-d989-4f8a-8182-b5104c3a3de6-serviceca\") pod \"node-ca-8mrzt\" (UID: \"4f000c2d-d989-4f8a-8182-b5104c3a3de6\") " pod="openshift-image-registry/node-ca-8mrzt" Apr 17 16:31:23.132975 ip-10-0-134-142 
kubenswrapper[2576]: I0417 16:31:23.132616 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-var-lib-kubelet\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp" Apr 17 16:31:23.132975 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.132620 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-log-socket\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.132975 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.132633 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/72324db7-6720-4fe7-a0fc-23dee216601e-cnibin\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4" Apr 17 16:31:23.132975 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.132831 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/28b3abb3-0913-43c2-a633-54a17d3f88b8-iptables-alerter-script\") pod \"iptables-alerter-92jvc\" (UID: \"28b3abb3-0913-43c2-a633-54a17d3f88b8\") " pod="openshift-network-operator/iptables-alerter-92jvc" Apr 17 16:31:23.133312 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:23.133071 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:23.133312 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:23.133090 2576 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:23.133312 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:23.133109 2576 projected.go:194] Error preparing data for projected volume kube-api-access-6svm9 for pod openshift-network-diagnostics/network-check-target-w2dvt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:23.133312 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.133118 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-run-systemd\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.133312 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:23.133174 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9 podName:a93503d7-3f06-47f2-95b0-7c4d48999d28 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:23.633152062 +0000 UTC m=+3.098334914 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6svm9" (UniqueName: "kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9") pod "network-check-target-w2dvt" (UID: "a93503d7-3f06-47f2-95b0-7c4d48999d28") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:23.133312 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.133069 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/72324db7-6720-4fe7-a0fc-23dee216601e-cni-binary-copy\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4" Apr 17 16:31:23.134365 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.133379 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b6fd1c6d-2527-488b-ba57-022c924d53df-device-dir\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: \"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2" Apr 17 16:31:23.134365 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.133435 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72324db7-6720-4fe7-a0fc-23dee216601e-system-cni-dir\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4" Apr 17 16:31:23.134365 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.133534 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-run\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " 
pod="openshift-cluster-node-tuning-operator/tuned-lnmnp" Apr 17 16:31:23.134365 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.133558 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-etc-openvswitch\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.134365 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.133587 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-lib-modules\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp" Apr 17 16:31:23.134365 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.133598 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21d6e76d-ca83-45a0-b330-7eced4280d5f-ovnkube-config\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.134365 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.133665 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21d6e76d-ca83-45a0-b330-7eced4280d5f-node-log\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.134365 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.133672 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/804a35f0-09a6-45b7-b583-0d7f038e28f8-host\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp" Apr 
17 16:31:23.135253 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.135226 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/804a35f0-09a6-45b7-b583-0d7f038e28f8-etc-tuned\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp" Apr 17 16:31:23.136274 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.136250 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/72c7f3b9-a592-4131-9cee-a95f18f9eaa3-agent-certs\") pod \"konnectivity-agent-rqqd6\" (UID: \"72c7f3b9-a592-4131-9cee-a95f18f9eaa3\") " pod="kube-system/konnectivity-agent-rqqd6" Apr 17 16:31:23.137017 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.136997 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmrtj\" (UniqueName: \"kubernetes.io/projected/bd3229a9-ef02-4e0e-aa80-b65af9672fee-kube-api-access-zmrtj\") pod \"multus-52zsq\" (UID: \"bd3229a9-ef02-4e0e-aa80-b65af9672fee\") " pod="openshift-multus/multus-52zsq" Apr 17 16:31:23.137440 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.137417 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w9zg\" (UniqueName: \"kubernetes.io/projected/21d6e76d-ca83-45a0-b330-7eced4280d5f-kube-api-access-2w9zg\") pod \"ovnkube-node-twl5m\" (UID: \"21d6e76d-ca83-45a0-b330-7eced4280d5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.138091 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.138071 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p87j\" (UniqueName: \"kubernetes.io/projected/03ed97e0-9a98-4b06-9c36-752ba631bdc9-kube-api-access-2p87j\") pod \"node-resolver-2wn98\" (UID: \"03ed97e0-9a98-4b06-9c36-752ba631bdc9\") " pod="openshift-dns/node-resolver-2wn98" Apr 17 16:31:23.141036 
ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.140836 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7ckd\" (UniqueName: \"kubernetes.io/projected/72324db7-6720-4fe7-a0fc-23dee216601e-kube-api-access-m7ckd\") pod \"multus-additional-cni-plugins-jkkg4\" (UID: \"72324db7-6720-4fe7-a0fc-23dee216601e\") " pod="openshift-multus/multus-additional-cni-plugins-jkkg4" Apr 17 16:31:23.141266 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.141059 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zcjr\" (UniqueName: \"kubernetes.io/projected/4f000c2d-d989-4f8a-8182-b5104c3a3de6-kube-api-access-2zcjr\") pod \"node-ca-8mrzt\" (UID: \"4f000c2d-d989-4f8a-8182-b5104c3a3de6\") " pod="openshift-image-registry/node-ca-8mrzt" Apr 17 16:31:23.141440 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.141417 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d27c\" (UniqueName: \"kubernetes.io/projected/b6fd1c6d-2527-488b-ba57-022c924d53df-kube-api-access-8d27c\") pod \"aws-ebs-csi-driver-node-skkh2\" (UID: \"b6fd1c6d-2527-488b-ba57-022c924d53df\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2" Apr 17 16:31:23.141975 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.141929 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal" event={"ID":"56aaddce41360941fa905aefab9b7fdd","Type":"ContainerStarted","Data":"74d6fdb7582e1843753189a4641aa9677e9b4e3ddae61356a3b5f1385fffa795"} Apr 17 16:31:23.143076 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.143056 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4pnq\" (UniqueName: \"kubernetes.io/projected/381620fe-22c1-43ed-99de-f61c1b4886ca-kube-api-access-g4pnq\") pod \"network-metrics-daemon-jn9vc\" (UID: \"381620fe-22c1-43ed-99de-f61c1b4886ca\") " 
pod="openshift-multus/network-metrics-daemon-jn9vc" Apr 17 16:31:23.143162 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.143119 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57vxj\" (UniqueName: \"kubernetes.io/projected/28b3abb3-0913-43c2-a633-54a17d3f88b8-kube-api-access-57vxj\") pod \"iptables-alerter-92jvc\" (UID: \"28b3abb3-0913-43c2-a633-54a17d3f88b8\") " pod="openshift-network-operator/iptables-alerter-92jvc" Apr 17 16:31:23.143345 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.143308 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsq5w\" (UniqueName: \"kubernetes.io/projected/804a35f0-09a6-45b7-b583-0d7f038e28f8-kube-api-access-lsq5w\") pod \"tuned-lnmnp\" (UID: \"804a35f0-09a6-45b7-b583-0d7f038e28f8\") " pod="openshift-cluster-node-tuning-operator/tuned-lnmnp" Apr 17 16:31:23.143772 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.143746 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-142.ec2.internal" event={"ID":"0ee3f40a240ec9a107cf332537877284","Type":"ContainerStarted","Data":"0c52f4b77f4014ad564156b8ff441bf007ab36f790836583da9e98c7e56d5870"} Apr 17 16:31:23.311243 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.311158 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8mrzt" Apr 17 16:31:23.320041 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.320012 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lnmnp" Apr 17 16:31:23.326550 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.326530 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2wn98" Apr 17 16:31:23.331112 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.331086 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jkkg4" Apr 17 16:31:23.337789 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.337772 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-92jvc" Apr 17 16:31:23.344352 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.344336 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2" Apr 17 16:31:23.350930 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.350913 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-52zsq" Apr 17 16:31:23.356556 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.356535 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:31:23.362116 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.362091 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-rqqd6" Apr 17 16:31:23.465751 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.465724 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:23.635761 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.635681 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs\") pod \"network-metrics-daemon-jn9vc\" (UID: \"381620fe-22c1-43ed-99de-f61c1b4886ca\") " pod="openshift-multus/network-metrics-daemon-jn9vc" Apr 17 16:31:23.635761 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:23.635740 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6svm9\" (UniqueName: \"kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9\") pod \"network-check-target-w2dvt\" (UID: \"a93503d7-3f06-47f2-95b0-7c4d48999d28\") " pod="openshift-network-diagnostics/network-check-target-w2dvt" Apr 17 16:31:23.635972 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:23.635855 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:23.635972 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:23.635890 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:23.635972 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:23.635909 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:23.635972 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:23.635920 2576 projected.go:194] Error preparing data for projected 
volume kube-api-access-6svm9 for pod openshift-network-diagnostics/network-check-target-w2dvt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:23.635972 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:23.635939 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs podName:381620fe-22c1-43ed-99de-f61c1b4886ca nodeName:}" failed. No retries permitted until 2026-04-17 16:31:24.635914635 +0000 UTC m=+4.101097480 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs") pod "network-metrics-daemon-jn9vc" (UID: "381620fe-22c1-43ed-99de-f61c1b4886ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:23.635972 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:23.635969 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9 podName:a93503d7-3f06-47f2-95b0-7c4d48999d28 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:24.635952831 +0000 UTC m=+4.101135686 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6svm9" (UniqueName: "kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9") pod "network-check-target-w2dvt" (UID: "a93503d7-3f06-47f2-95b0-7c4d48999d28") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:23.838473 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:23.838442 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod804a35f0_09a6_45b7_b583_0d7f038e28f8.slice/crio-d6a048fdbfaaeba56c27531143c749221784677a0fe6daf4fe597a43dce44e70 WatchSource:0}: Error finding container d6a048fdbfaaeba56c27531143c749221784677a0fe6daf4fe597a43dce44e70: Status 404 returned error can't find the container with id d6a048fdbfaaeba56c27531143c749221784677a0fe6daf4fe597a43dce44e70 Apr 17 16:31:23.840520 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:23.840261 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f000c2d_d989_4f8a_8182_b5104c3a3de6.slice/crio-a3c252d2f6301bea41256361cee30a3d0445c30d7bf484d577241e00b907d1eb WatchSource:0}: Error finding container a3c252d2f6301bea41256361cee30a3d0445c30d7bf484d577241e00b907d1eb: Status 404 returned error can't find the container with id a3c252d2f6301bea41256361cee30a3d0445c30d7bf484d577241e00b907d1eb Apr 17 16:31:23.849239 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:23.849213 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6fd1c6d_2527_488b_ba57_022c924d53df.slice/crio-fcffdfbc9efbcf47fe1a76afecd22225bdb6152a19b95d211eefe4c55a6cd3c9 WatchSource:0}: Error finding container fcffdfbc9efbcf47fe1a76afecd22225bdb6152a19b95d211eefe4c55a6cd3c9: Status 404 returned error can't find the 
container with id fcffdfbc9efbcf47fe1a76afecd22225bdb6152a19b95d211eefe4c55a6cd3c9 Apr 17 16:31:23.849681 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:23.849619 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21d6e76d_ca83_45a0_b330_7eced4280d5f.slice/crio-2486ca3930734906caef67308a671c8d7e2f24ef79474b361a1996c569323963 WatchSource:0}: Error finding container 2486ca3930734906caef67308a671c8d7e2f24ef79474b361a1996c569323963: Status 404 returned error can't find the container with id 2486ca3930734906caef67308a671c8d7e2f24ef79474b361a1996c569323963 Apr 17 16:31:23.850558 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:23.850294 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72324db7_6720_4fe7_a0fc_23dee216601e.slice/crio-dad9aaa795dbf4656ea7ca62c564999751f837559073bab243308b07dc331a1d WatchSource:0}: Error finding container dad9aaa795dbf4656ea7ca62c564999751f837559073bab243308b07dc331a1d: Status 404 returned error can't find the container with id dad9aaa795dbf4656ea7ca62c564999751f837559073bab243308b07dc331a1d Apr 17 16:31:23.851075 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:31:23.851050 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03ed97e0_9a98_4b06_9c36_752ba631bdc9.slice/crio-2c4e2abbad2634852c661d412e26ed20f8808463e590c5ee092939ff62aa6741 WatchSource:0}: Error finding container 2c4e2abbad2634852c661d412e26ed20f8808463e590c5ee092939ff62aa6741: Status 404 returned error can't find the container with id 2c4e2abbad2634852c661d412e26ed20f8808463e590c5ee092939ff62aa6741 Apr 17 16:31:24.062887 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:24.062849 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:26:22 +0000 UTC" 
deadline="2027-10-20 13:10:36.634239939 +0000 UTC" Apr 17 16:31:24.062887 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:24.062881 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13220h39m12.571361476s" Apr 17 16:31:24.138360 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:24.138330 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9vc" Apr 17 16:31:24.138645 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:24.138468 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9vc" podUID="381620fe-22c1-43ed-99de-f61c1b4886ca" Apr 17 16:31:24.146273 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:24.146201 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rqqd6" event={"ID":"72c7f3b9-a592-4131-9cee-a95f18f9eaa3","Type":"ContainerStarted","Data":"c3d39aa6a1806c2f3ad4942bf7eb9b5ec428c41e869cdfd422e31b94749a42be"} Apr 17 16:31:24.147546 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:24.147523 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-52zsq" event={"ID":"bd3229a9-ef02-4e0e-aa80-b65af9672fee","Type":"ContainerStarted","Data":"6676199db74631a91689c496bcd9a9d8f076daf8d30e1f2b31b2c349eab45562"} Apr 17 16:31:24.148652 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:24.148619 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8mrzt" event={"ID":"4f000c2d-d989-4f8a-8182-b5104c3a3de6","Type":"ContainerStarted","Data":"a3c252d2f6301bea41256361cee30a3d0445c30d7bf484d577241e00b907d1eb"} Apr 17 16:31:24.149766 ip-10-0-134-142 
kubenswrapper[2576]: I0417 16:31:24.149744 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lnmnp" event={"ID":"804a35f0-09a6-45b7-b583-0d7f038e28f8","Type":"ContainerStarted","Data":"d6a048fdbfaaeba56c27531143c749221784677a0fe6daf4fe597a43dce44e70"} Apr 17 16:31:24.150831 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:24.150804 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2wn98" event={"ID":"03ed97e0-9a98-4b06-9c36-752ba631bdc9","Type":"ContainerStarted","Data":"2c4e2abbad2634852c661d412e26ed20f8808463e590c5ee092939ff62aa6741"} Apr 17 16:31:24.151644 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:24.151623 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" event={"ID":"21d6e76d-ca83-45a0-b330-7eced4280d5f","Type":"ContainerStarted","Data":"2486ca3930734906caef67308a671c8d7e2f24ef79474b361a1996c569323963"} Apr 17 16:31:24.152461 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:24.152443 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-92jvc" event={"ID":"28b3abb3-0913-43c2-a633-54a17d3f88b8","Type":"ContainerStarted","Data":"ad8ec18aeace6640248e8097fab56c9b4645f79279764a923fcdf9c72964a050"} Apr 17 16:31:24.153790 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:24.153771 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-142.ec2.internal" event={"ID":"0ee3f40a240ec9a107cf332537877284","Type":"ContainerStarted","Data":"e672c6c206f191f0780cdc548b26f2866ee9cda8498b344373a73ee3f9a4664f"} Apr 17 16:31:24.154721 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:24.154703 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jkkg4" 
event={"ID":"72324db7-6720-4fe7-a0fc-23dee216601e","Type":"ContainerStarted","Data":"dad9aaa795dbf4656ea7ca62c564999751f837559073bab243308b07dc331a1d"} Apr 17 16:31:24.156253 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:24.156230 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2" event={"ID":"b6fd1c6d-2527-488b-ba57-022c924d53df","Type":"ContainerStarted","Data":"fcffdfbc9efbcf47fe1a76afecd22225bdb6152a19b95d211eefe4c55a6cd3c9"} Apr 17 16:31:24.167469 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:24.167430 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-142.ec2.internal" podStartSLOduration=2.1674199 podStartE2EDuration="2.1674199s" podCreationTimestamp="2026-04-17 16:31:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:24.167295957 +0000 UTC m=+3.632478819" watchObservedRunningTime="2026-04-17 16:31:24.1674199 +0000 UTC m=+3.632602760" Apr 17 16:31:24.646914 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:24.646048 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs\") pod \"network-metrics-daemon-jn9vc\" (UID: \"381620fe-22c1-43ed-99de-f61c1b4886ca\") " pod="openshift-multus/network-metrics-daemon-jn9vc" Apr 17 16:31:24.646914 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:24.646124 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6svm9\" (UniqueName: \"kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9\") pod \"network-check-target-w2dvt\" (UID: \"a93503d7-3f06-47f2-95b0-7c4d48999d28\") " pod="openshift-network-diagnostics/network-check-target-w2dvt" Apr 17 16:31:24.646914 ip-10-0-134-142 
kubenswrapper[2576]: E0417 16:31:24.646268 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:24.646914 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:24.646285 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:24.646914 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:24.646298 2576 projected.go:194] Error preparing data for projected volume kube-api-access-6svm9 for pod openshift-network-diagnostics/network-check-target-w2dvt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:24.646914 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:24.646355 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9 podName:a93503d7-3f06-47f2-95b0-7c4d48999d28 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:26.646337623 +0000 UTC m=+6.111520485 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6svm9" (UniqueName: "kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9") pod "network-check-target-w2dvt" (UID: "a93503d7-3f06-47f2-95b0-7c4d48999d28") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:24.646914 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:24.646781 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:24.646914 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:24.646830 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs podName:381620fe-22c1-43ed-99de-f61c1b4886ca nodeName:}" failed. No retries permitted until 2026-04-17 16:31:26.646814854 +0000 UTC m=+6.111997695 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs") pod "network-metrics-daemon-jn9vc" (UID: "381620fe-22c1-43ed-99de-f61c1b4886ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:25.140790 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:25.140712 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2dvt" Apr 17 16:31:25.141282 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:25.140830 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w2dvt" podUID="a93503d7-3f06-47f2-95b0-7c4d48999d28" Apr 17 16:31:25.177155 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:25.177110 2576 generic.go:358] "Generic (PLEG): container finished" podID="56aaddce41360941fa905aefab9b7fdd" containerID="5c9d8d7c9f13dca751a2f38f67d65b225df2e39d4e8eb13c593680528cf14b1e" exitCode=0 Apr 17 16:31:25.177762 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:25.177670 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal" event={"ID":"56aaddce41360941fa905aefab9b7fdd","Type":"ContainerDied","Data":"5c9d8d7c9f13dca751a2f38f67d65b225df2e39d4e8eb13c593680528cf14b1e"} Apr 17 16:31:26.138073 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:26.138038 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9vc" Apr 17 16:31:26.138247 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:26.138190 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jn9vc" podUID="381620fe-22c1-43ed-99de-f61c1b4886ca" Apr 17 16:31:26.206856 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:26.206818 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal" event={"ID":"56aaddce41360941fa905aefab9b7fdd","Type":"ContainerStarted","Data":"c114c9bad2f7437e5c5220797eec04c60074c55fa608a88278953fc4e7e5ec94"} Apr 17 16:31:26.664520 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:26.664462 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs\") pod \"network-metrics-daemon-jn9vc\" (UID: \"381620fe-22c1-43ed-99de-f61c1b4886ca\") " pod="openshift-multus/network-metrics-daemon-jn9vc" Apr 17 16:31:26.664700 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:26.664559 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6svm9\" (UniqueName: \"kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9\") pod \"network-check-target-w2dvt\" (UID: \"a93503d7-3f06-47f2-95b0-7c4d48999d28\") " pod="openshift-network-diagnostics/network-check-target-w2dvt" Apr 17 16:31:26.664700 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:26.664696 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:26.664804 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:26.664713 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:26.664804 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:26.664727 2576 projected.go:194] Error preparing data for projected volume 
kube-api-access-6svm9 for pod openshift-network-diagnostics/network-check-target-w2dvt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:26.664804 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:26.664785 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9 podName:a93503d7-3f06-47f2-95b0-7c4d48999d28 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:30.664767437 +0000 UTC m=+10.129950278 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-6svm9" (UniqueName: "kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9") pod "network-check-target-w2dvt" (UID: "a93503d7-3f06-47f2-95b0-7c4d48999d28") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:26.664961 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:26.664820 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:26.664961 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:26.664885 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs podName:381620fe-22c1-43ed-99de-f61c1b4886ca nodeName:}" failed. No retries permitted until 2026-04-17 16:31:30.664868062 +0000 UTC m=+10.130050916 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs") pod "network-metrics-daemon-jn9vc" (UID: "381620fe-22c1-43ed-99de-f61c1b4886ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:27.140575 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:27.140326 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2dvt" Apr 17 16:31:27.140575 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:27.140466 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2dvt" podUID="a93503d7-3f06-47f2-95b0-7c4d48999d28" Apr 17 16:31:28.138728 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:28.138685 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9vc" Apr 17 16:31:28.139202 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:28.138842 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9vc" podUID="381620fe-22c1-43ed-99de-f61c1b4886ca" Apr 17 16:31:29.138664 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:29.138625 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2dvt" Apr 17 16:31:29.138858 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:29.138758 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2dvt" podUID="a93503d7-3f06-47f2-95b0-7c4d48999d28" Apr 17 16:31:30.138512 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:30.138462 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9vc" Apr 17 16:31:30.138698 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:30.138623 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jn9vc" podUID="381620fe-22c1-43ed-99de-f61c1b4886ca" Apr 17 16:31:30.698805 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:30.698376 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6svm9\" (UniqueName: \"kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9\") pod \"network-check-target-w2dvt\" (UID: \"a93503d7-3f06-47f2-95b0-7c4d48999d28\") " pod="openshift-network-diagnostics/network-check-target-w2dvt" Apr 17 16:31:30.698805 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:30.698429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs\") pod \"network-metrics-daemon-jn9vc\" (UID: \"381620fe-22c1-43ed-99de-f61c1b4886ca\") " pod="openshift-multus/network-metrics-daemon-jn9vc" Apr 17 16:31:30.698805 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:30.698565 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:30.698805 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:30.698593 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:30.698805 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:30.698619 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:30.698805 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:30.698633 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs podName:381620fe-22c1-43ed-99de-f61c1b4886ca nodeName:}" failed. 
No retries permitted until 2026-04-17 16:31:38.698613775 +0000 UTC m=+18.163796638 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs") pod "network-metrics-daemon-jn9vc" (UID: "381620fe-22c1-43ed-99de-f61c1b4886ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:30.698805 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:30.698636 2576 projected.go:194] Error preparing data for projected volume kube-api-access-6svm9 for pod openshift-network-diagnostics/network-check-target-w2dvt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:30.698805 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:30.698692 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9 podName:a93503d7-3f06-47f2-95b0-7c4d48999d28 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:38.698675228 +0000 UTC m=+18.163858074 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-6svm9" (UniqueName: "kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9") pod "network-check-target-w2dvt" (UID: "a93503d7-3f06-47f2-95b0-7c4d48999d28") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:31.140379 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:31.140291 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2dvt" Apr 17 16:31:31.140573 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:31.140429 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2dvt" podUID="a93503d7-3f06-47f2-95b0-7c4d48999d28" Apr 17 16:31:32.137806 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:32.137772 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9vc" Apr 17 16:31:32.138242 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:32.137904 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9vc" podUID="381620fe-22c1-43ed-99de-f61c1b4886ca" Apr 17 16:31:33.138219 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:33.138181 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2dvt" Apr 17 16:31:33.138709 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:33.138307 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w2dvt" podUID="a93503d7-3f06-47f2-95b0-7c4d48999d28" Apr 17 16:31:34.138569 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:34.138534 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9vc" Apr 17 16:31:34.139043 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:34.138668 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9vc" podUID="381620fe-22c1-43ed-99de-f61c1b4886ca" Apr 17 16:31:35.138429 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:35.138389 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2dvt" Apr 17 16:31:35.138627 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:35.138529 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2dvt" podUID="a93503d7-3f06-47f2-95b0-7c4d48999d28" Apr 17 16:31:36.138177 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:36.138137 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9vc" Apr 17 16:31:36.138337 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:36.138265 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9vc" podUID="381620fe-22c1-43ed-99de-f61c1b4886ca" Apr 17 16:31:37.138563 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:37.138490 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2dvt" Apr 17 16:31:37.139112 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:37.138631 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2dvt" podUID="a93503d7-3f06-47f2-95b0-7c4d48999d28" Apr 17 16:31:38.138449 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:38.138405 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9vc" Apr 17 16:31:38.138628 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:38.138570 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jn9vc" podUID="381620fe-22c1-43ed-99de-f61c1b4886ca" Apr 17 16:31:38.756738 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:38.756698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6svm9\" (UniqueName: \"kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9\") pod \"network-check-target-w2dvt\" (UID: \"a93503d7-3f06-47f2-95b0-7c4d48999d28\") " pod="openshift-network-diagnostics/network-check-target-w2dvt" Apr 17 16:31:38.756927 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:38.756757 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs\") pod \"network-metrics-daemon-jn9vc\" (UID: \"381620fe-22c1-43ed-99de-f61c1b4886ca\") " pod="openshift-multus/network-metrics-daemon-jn9vc" Apr 17 16:31:38.756927 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:38.756859 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:38.756927 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:38.756881 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:38.756927 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:38.756908 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:38.756927 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:38.756923 2576 projected.go:194] Error preparing data for projected volume kube-api-access-6svm9 for pod openshift-network-diagnostics/network-check-target-w2dvt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:38.757137 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:38.756925 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs podName:381620fe-22c1-43ed-99de-f61c1b4886ca nodeName:}" failed. No retries permitted until 2026-04-17 16:31:54.756907685 +0000 UTC m=+34.222090524 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs") pod "network-metrics-daemon-jn9vc" (UID: "381620fe-22c1-43ed-99de-f61c1b4886ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:38.757137 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:38.756981 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9 podName:a93503d7-3f06-47f2-95b0-7c4d48999d28 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:54.756964664 +0000 UTC m=+34.222147509 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-6svm9" (UniqueName: "kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9") pod "network-check-target-w2dvt" (UID: "a93503d7-3f06-47f2-95b0-7c4d48999d28") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:39.138171 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:39.138099 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2dvt" Apr 17 16:31:39.138293 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:39.138202 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2dvt" podUID="a93503d7-3f06-47f2-95b0-7c4d48999d28" Apr 17 16:31:40.138407 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:40.138370 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9vc" Apr 17 16:31:40.138889 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:40.138519 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9vc" podUID="381620fe-22c1-43ed-99de-f61c1b4886ca" Apr 17 16:31:41.141234 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:41.140636 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2dvt" Apr 17 16:31:41.141234 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:41.140752 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w2dvt" podUID="a93503d7-3f06-47f2-95b0-7c4d48999d28" Apr 17 16:31:41.235235 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:41.235185 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2wn98" event={"ID":"03ed97e0-9a98-4b06-9c36-752ba631bdc9","Type":"ContainerStarted","Data":"c3f65054753b6bb38bc978804ce843cd281c42c7da8c7522d34387c7baf8378e"} Apr 17 16:31:41.236826 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:41.236786 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" event={"ID":"21d6e76d-ca83-45a0-b330-7eced4280d5f","Type":"ContainerStarted","Data":"6c3b4964e5be14e52a01242f2f37a97a4dd70c084203da16e4aaf3604e12ab28"} Apr 17 16:31:41.237969 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:41.237946 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jkkg4" event={"ID":"72324db7-6720-4fe7-a0fc-23dee216601e","Type":"ContainerStarted","Data":"b7f9adc435aadb2af631ca2a80a82a201af2163c332c18b1cde38c41ee08ce60"} Apr 17 16:31:41.240024 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:41.239416 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2" event={"ID":"b6fd1c6d-2527-488b-ba57-022c924d53df","Type":"ContainerStarted","Data":"2d09a57741a2bc141899090ba45e7b16d174adb7ba7a680ee0853a77eb523cf2"} Apr 17 16:31:41.240564 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:41.240543 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-52zsq" event={"ID":"bd3229a9-ef02-4e0e-aa80-b65af9672fee","Type":"ContainerStarted","Data":"26a507eff05007fab70d0acf0843f6576fb2cb30d186c59565fcfe888c89d64d"} Apr 17 16:31:41.245138 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:41.242845 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8mrzt" 
event={"ID":"4f000c2d-d989-4f8a-8182-b5104c3a3de6","Type":"ContainerStarted","Data":"a43831b7efa9c1f1bf7d68d69b2ae3013d0ca91faba0748afa21b38960b82144"}
Apr 17 16:31:41.246398 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:41.246356 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lnmnp" event={"ID":"804a35f0-09a6-45b7-b583-0d7f038e28f8","Type":"ContainerStarted","Data":"a6bd8081e0fdb72ab9f13467a369a620ab70b7ce8d04bbe85a1c354b6909c716"}
Apr 17 16:31:41.252251 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:41.252214 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2wn98" podStartSLOduration=3.17148912 podStartE2EDuration="20.252203779s" podCreationTimestamp="2026-04-17 16:31:21 +0000 UTC" firstStartedPulling="2026-04-17 16:31:23.853037451 +0000 UTC m=+3.318220291" lastFinishedPulling="2026-04-17 16:31:40.933752106 +0000 UTC m=+20.398934950" observedRunningTime="2026-04-17 16:31:41.251942728 +0000 UTC m=+20.717125589" watchObservedRunningTime="2026-04-17 16:31:41.252203779 +0000 UTC m=+20.717386640"
Apr 17 16:31:41.252453 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:41.252434 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-142.ec2.internal" podStartSLOduration=19.252428833 podStartE2EDuration="19.252428833s" podCreationTimestamp="2026-04-17 16:31:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:26.224182561 +0000 UTC m=+5.689365423" watchObservedRunningTime="2026-04-17 16:31:41.252428833 +0000 UTC m=+20.717611696"
Apr 17 16:31:41.265384 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:41.265342 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8mrzt" podStartSLOduration=3.4962503910000002 podStartE2EDuration="20.265328873s" podCreationTimestamp="2026-04-17 16:31:21 +0000 UTC" firstStartedPulling="2026-04-17 16:31:23.842491772 +0000 UTC m=+3.307674611" lastFinishedPulling="2026-04-17 16:31:40.611570234 +0000 UTC m=+20.076753093" observedRunningTime="2026-04-17 16:31:41.264977265 +0000 UTC m=+20.730160136" watchObservedRunningTime="2026-04-17 16:31:41.265328873 +0000 UTC m=+20.730511734"
Apr 17 16:31:41.283328 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:41.283276 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-lnmnp" podStartSLOduration=3.189724289 podStartE2EDuration="20.283262977s" podCreationTimestamp="2026-04-17 16:31:21 +0000 UTC" firstStartedPulling="2026-04-17 16:31:23.840265156 +0000 UTC m=+3.305447996" lastFinishedPulling="2026-04-17 16:31:40.933803842 +0000 UTC m=+20.398986684" observedRunningTime="2026-04-17 16:31:41.28300076 +0000 UTC m=+20.748183621" watchObservedRunningTime="2026-04-17 16:31:41.283262977 +0000 UTC m=+20.748445838"
Apr 17 16:31:41.301869 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:41.301816 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-52zsq" podStartSLOduration=3.176131484 podStartE2EDuration="20.301801267s" podCreationTimestamp="2026-04-17 16:31:21 +0000 UTC" firstStartedPulling="2026-04-17 16:31:23.847454422 +0000 UTC m=+3.312637265" lastFinishedPulling="2026-04-17 16:31:40.973124196 +0000 UTC m=+20.438307048" observedRunningTime="2026-04-17 16:31:41.301341413 +0000 UTC m=+20.766524274" watchObservedRunningTime="2026-04-17 16:31:41.301801267 +0000 UTC m=+20.766984127"
Apr 17 16:31:42.138442 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:42.138420 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9vc"
Apr 17 16:31:42.138568 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:42.138550 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9vc" podUID="381620fe-22c1-43ed-99de-f61c1b4886ca"
Apr 17 16:31:42.251331 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:42.251301 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/ovn-acl-logging/0.log"
Apr 17 16:31:42.252109 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:42.251647 2576 generic.go:358] "Generic (PLEG): container finished" podID="21d6e76d-ca83-45a0-b330-7eced4280d5f" containerID="14ae07d64b65ab7f0b0da66184ea9fd6f37f8e1303cb89d21781db687501e2a4" exitCode=1
Apr 17 16:31:42.252109 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:42.251718 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" event={"ID":"21d6e76d-ca83-45a0-b330-7eced4280d5f","Type":"ContainerStarted","Data":"0b8079b4faa7283bfe7802d92937611355ac88bf0595d20250b8388f64b50bd1"}
Apr 17 16:31:42.252109 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:42.251752 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" event={"ID":"21d6e76d-ca83-45a0-b330-7eced4280d5f","Type":"ContainerStarted","Data":"0049856733a54f9863bc0fda1f9f2c833e2fcca701bcd022ee0dfb7591b93c3b"}
Apr 17 16:31:42.252109 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:42.251766 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" event={"ID":"21d6e76d-ca83-45a0-b330-7eced4280d5f","Type":"ContainerStarted","Data":"ff157cd26e39b43f60e00cd1fb484b4efcbd3377719f9274678ce8aab90101ee"}
Apr 17 16:31:42.252109 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:42.251779 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" event={"ID":"21d6e76d-ca83-45a0-b330-7eced4280d5f","Type":"ContainerStarted","Data":"518baf8d241b4224c80360cc636574af21d9806490b4ad50b975b91c63acd9c3"}
Apr 17 16:31:42.252109 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:42.251792 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" event={"ID":"21d6e76d-ca83-45a0-b330-7eced4280d5f","Type":"ContainerDied","Data":"14ae07d64b65ab7f0b0da66184ea9fd6f37f8e1303cb89d21781db687501e2a4"}
Apr 17 16:31:42.253125 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:42.253101 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-92jvc" event={"ID":"28b3abb3-0913-43c2-a633-54a17d3f88b8","Type":"ContainerStarted","Data":"c9103a4390332bde8d937b65c5c93d63e6c89abee1df42de54bdfa9dfdc20769"}
Apr 17 16:31:42.254534 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:42.254490 2576 generic.go:358] "Generic (PLEG): container finished" podID="72324db7-6720-4fe7-a0fc-23dee216601e" containerID="b7f9adc435aadb2af631ca2a80a82a201af2163c332c18b1cde38c41ee08ce60" exitCode=0
Apr 17 16:31:42.254685 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:42.254582 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jkkg4" event={"ID":"72324db7-6720-4fe7-a0fc-23dee216601e","Type":"ContainerDied","Data":"b7f9adc435aadb2af631ca2a80a82a201af2163c332c18b1cde38c41ee08ce60"}
Apr 17 16:31:42.255981 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:42.255955 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rqqd6" event={"ID":"72c7f3b9-a592-4131-9cee-a95f18f9eaa3","Type":"ContainerStarted","Data":"ac5baa94732f0990e97cb442caeb2228b0bf0ede9f67b7207514ed0c016a74c2"}
Apr 17 16:31:42.297969 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:42.297911 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-rqqd6" podStartSLOduration=4.226047245 podStartE2EDuration="21.297893384s" podCreationTimestamp="2026-04-17 16:31:21 +0000 UTC" firstStartedPulling="2026-04-17 16:31:23.849422393 +0000 UTC m=+3.314605232" lastFinishedPulling="2026-04-17 16:31:40.921268525 +0000 UTC m=+20.386451371" observedRunningTime="2026-04-17 16:31:42.297763705 +0000 UTC m=+21.762946569" watchObservedRunningTime="2026-04-17 16:31:42.297893384 +0000 UTC m=+21.763076246"
Apr 17 16:31:42.298365 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:42.298336 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-92jvc" podStartSLOduration=4.212024861 podStartE2EDuration="21.298326478s" podCreationTimestamp="2026-04-17 16:31:21 +0000 UTC" firstStartedPulling="2026-04-17 16:31:23.847448137 +0000 UTC m=+3.312630979" lastFinishedPulling="2026-04-17 16:31:40.933749742 +0000 UTC m=+20.398932596" observedRunningTime="2026-04-17 16:31:42.284379441 +0000 UTC m=+21.749562304" watchObservedRunningTime="2026-04-17 16:31:42.298326478 +0000 UTC m=+21.763509341"
Apr 17 16:31:42.374106 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:42.374080 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 16:31:43.087411 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:43.087285 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T16:31:42.374101231Z","UUID":"e6b46c20-b210-4836-888f-d9bbed46431c","Handler":null,"Name":"","Endpoint":""}
Apr 17 16:31:43.090441 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:43.090412 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 16:31:43.090441 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:43.090450 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 16:31:43.138887 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:43.138627 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2dvt"
Apr 17 16:31:43.139073 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:43.138988 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2dvt" podUID="a93503d7-3f06-47f2-95b0-7c4d48999d28"
Apr 17 16:31:43.259984 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:43.259946 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2" event={"ID":"b6fd1c6d-2527-488b-ba57-022c924d53df","Type":"ContainerStarted","Data":"2644abbdcdc56aa0a2cd1d5054b0eb8fb28d8a2d2b9603ae6718943d59dbd09d"}
Apr 17 16:31:44.138246 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:44.138205 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9vc"
Apr 17 16:31:44.138410 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:44.138347 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9vc" podUID="381620fe-22c1-43ed-99de-f61c1b4886ca"
Apr 17 16:31:44.264119 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:44.264035 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2" event={"ID":"b6fd1c6d-2527-488b-ba57-022c924d53df","Type":"ContainerStarted","Data":"accec994f34f7f3a742e4babc555eb2a60592905123879b934f5041c27c9eadf"}
Apr 17 16:31:44.267024 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:44.267002 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/ovn-acl-logging/0.log"
Apr 17 16:31:44.267438 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:44.267397 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" event={"ID":"21d6e76d-ca83-45a0-b330-7eced4280d5f","Type":"ContainerStarted","Data":"d741941a7c2dddc15cc4fa2fde026e137eb5b161d899ed63f33ada5221cab62b"}
Apr 17 16:31:44.282556 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:44.282485 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-skkh2" podStartSLOduration=3.521323606 podStartE2EDuration="23.282471372s" podCreationTimestamp="2026-04-17 16:31:21 +0000 UTC" firstStartedPulling="2026-04-17 16:31:23.850451458 +0000 UTC m=+3.315634297" lastFinishedPulling="2026-04-17 16:31:43.611599212 +0000 UTC m=+23.076782063" observedRunningTime="2026-04-17 16:31:44.281571191 +0000 UTC m=+23.746754052" watchObservedRunningTime="2026-04-17 16:31:44.282471372 +0000 UTC m=+23.747654233"
Apr 17 16:31:45.138670 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:45.138635 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2dvt"
Apr 17 16:31:45.138837 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:45.138756 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2dvt" podUID="a93503d7-3f06-47f2-95b0-7c4d48999d28"
Apr 17 16:31:45.236050 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:45.236008 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-rqqd6"
Apr 17 16:31:45.236799 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:45.236774 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-rqqd6"
Apr 17 16:31:45.270273 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:45.270242 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-rqqd6"
Apr 17 16:31:45.270967 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:45.270588 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-rqqd6"
Apr 17 16:31:46.137836 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:46.137798 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9vc"
Apr 17 16:31:46.138000 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:46.137932 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9vc" podUID="381620fe-22c1-43ed-99de-f61c1b4886ca"
Apr 17 16:31:46.275184 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:46.274887 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/ovn-acl-logging/0.log"
Apr 17 16:31:46.275606 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:46.275410 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" event={"ID":"21d6e76d-ca83-45a0-b330-7eced4280d5f","Type":"ContainerStarted","Data":"f8ec14f2b3f6d6adcbd3a8d2d1e80a8585465e0e907bc03df4c852e3b284af70"}
Apr 17 16:31:46.276129 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:46.275780 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:46.276129 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:46.275806 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:46.276129 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:46.275820 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:46.276129 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:46.275890 2576 scope.go:117] "RemoveContainer" containerID="14ae07d64b65ab7f0b0da66184ea9fd6f37f8e1303cb89d21781db687501e2a4"
Apr 17 16:31:46.296582 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:46.296350 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:46.297468 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:46.296927 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-twl5m"
Apr 17 16:31:47.138815 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:47.138777 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2dvt"
Apr 17 16:31:47.139014 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:47.138904 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2dvt" podUID="a93503d7-3f06-47f2-95b0-7c4d48999d28"
Apr 17 16:31:47.278728 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:47.278689 2576 generic.go:358] "Generic (PLEG): container finished" podID="72324db7-6720-4fe7-a0fc-23dee216601e" containerID="505e98182e72eef13e23f6d5d33717aafcbd7a1e55a85b22163ccde3bf2dad0d" exitCode=0
Apr 17 16:31:47.279119 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:47.278739 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jkkg4" event={"ID":"72324db7-6720-4fe7-a0fc-23dee216601e","Type":"ContainerDied","Data":"505e98182e72eef13e23f6d5d33717aafcbd7a1e55a85b22163ccde3bf2dad0d"}
Apr 17 16:31:47.282240 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:47.282223 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/ovn-acl-logging/0.log"
Apr 17 16:31:47.282611 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:47.282591 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" event={"ID":"21d6e76d-ca83-45a0-b330-7eced4280d5f","Type":"ContainerStarted","Data":"0dea545c92996960425270c281287174409e67e237027c11c92b055f9bdad33d"}
Apr 17 16:31:47.333290 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:47.333235 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" podStartSLOduration=9.186010555 podStartE2EDuration="26.333219334s" podCreationTimestamp="2026-04-17 16:31:21 +0000 UTC" firstStartedPulling="2026-04-17 16:31:23.851686394 +0000 UTC m=+3.316869248" lastFinishedPulling="2026-04-17 16:31:40.998895175 +0000 UTC m=+20.464078027" observedRunningTime="2026-04-17 16:31:47.332765108 +0000 UTC m=+26.797947972" watchObservedRunningTime="2026-04-17 16:31:47.333219334 +0000 UTC m=+26.798402196"
Apr 17 16:31:48.138462 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:48.138271 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9vc"
Apr 17 16:31:48.138623 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:48.138565 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9vc" podUID="381620fe-22c1-43ed-99de-f61c1b4886ca"
Apr 17 16:31:48.154567 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:48.154537 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jn9vc"]
Apr 17 16:31:48.157247 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:48.157224 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-w2dvt"]
Apr 17 16:31:48.157382 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:48.157352 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2dvt"
Apr 17 16:31:48.157471 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:48.157453 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2dvt" podUID="a93503d7-3f06-47f2-95b0-7c4d48999d28"
Apr 17 16:31:48.286565 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:48.286456 2576 generic.go:358] "Generic (PLEG): container finished" podID="72324db7-6720-4fe7-a0fc-23dee216601e" containerID="5560abc8884931373ad9e18bd450fd6b5482501da88781d0531a6991f3e1c693" exitCode=0
Apr 17 16:31:48.286565 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:48.286534 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jkkg4" event={"ID":"72324db7-6720-4fe7-a0fc-23dee216601e","Type":"ContainerDied","Data":"5560abc8884931373ad9e18bd450fd6b5482501da88781d0531a6991f3e1c693"}
Apr 17 16:31:48.286993 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:48.286632 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9vc"
Apr 17 16:31:48.286993 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:48.286726 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9vc" podUID="381620fe-22c1-43ed-99de-f61c1b4886ca"
Apr 17 16:31:49.290603 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:49.290566 2576 generic.go:358] "Generic (PLEG): container finished" podID="72324db7-6720-4fe7-a0fc-23dee216601e" containerID="8b3da762fc4769fcf296419ee5108d20056d71a63f41d7d7573debf88617571b" exitCode=0
Apr 17 16:31:49.290989 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:49.290628 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jkkg4" event={"ID":"72324db7-6720-4fe7-a0fc-23dee216601e","Type":"ContainerDied","Data":"8b3da762fc4769fcf296419ee5108d20056d71a63f41d7d7573debf88617571b"}
Apr 17 16:31:50.138508 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:50.138464 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9vc"
Apr 17 16:31:50.138724 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:50.138464 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2dvt"
Apr 17 16:31:50.138724 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:50.138605 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9vc" podUID="381620fe-22c1-43ed-99de-f61c1b4886ca"
Apr 17 16:31:50.138724 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:50.138682 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2dvt" podUID="a93503d7-3f06-47f2-95b0-7c4d48999d28"
Apr 17 16:31:52.138416 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:52.138377 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2dvt"
Apr 17 16:31:52.139141 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:52.138526 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2dvt" podUID="a93503d7-3f06-47f2-95b0-7c4d48999d28"
Apr 17 16:31:52.139141 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:52.138570 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9vc"
Apr 17 16:31:52.139141 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:52.138722 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9vc" podUID="381620fe-22c1-43ed-99de-f61c1b4886ca"
Apr 17 16:31:54.138724 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.138642 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2dvt"
Apr 17 16:31:54.139337 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.138659 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9vc"
Apr 17 16:31:54.139337 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:54.138772 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2dvt" podUID="a93503d7-3f06-47f2-95b0-7c4d48999d28"
Apr 17 16:31:54.139337 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:54.138865 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jn9vc" podUID="381620fe-22c1-43ed-99de-f61c1b4886ca"
Apr 17 16:31:54.335620 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.335589 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-142.ec2.internal" event="NodeReady"
Apr 17 16:31:54.335792 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.335754 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 16:31:54.393134 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.393057 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-k5nlz"]
Apr 17 16:31:54.428196 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.428168 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9vg62"]
Apr 17 16:31:54.428736 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.428713 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k5nlz"
Apr 17 16:31:54.431791 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.431766 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 16:31:54.432044 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.432015 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qsxk4\""
Apr 17 16:31:54.432150 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.432085 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 16:31:54.433193 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.433175 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 16:31:54.445444 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.445422 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k5nlz"]
Apr 17 16:31:54.445565 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.445449 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9vg62"]
Apr 17 16:31:54.445621 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.445583 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9vg62"
Apr 17 16:31:54.452647 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.452624 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 16:31:54.452740 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.452636 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 16:31:54.452804 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.452786 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-78gjs\""
Apr 17 16:31:54.572798 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.572760 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/13a6aa7e-beeb-4695-9196-0c33a34de454-tmp-dir\") pod \"dns-default-9vg62\" (UID: \"13a6aa7e-beeb-4695-9196-0c33a34de454\") " pod="openshift-dns/dns-default-9vg62"
Apr 17 16:31:54.572989 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.572825 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert\") pod \"ingress-canary-k5nlz\" (UID: \"deee814d-1106-4662-bd60-3fc176424f4c\") " pod="openshift-ingress-canary/ingress-canary-k5nlz"
Apr 17 16:31:54.572989 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.572912 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13a6aa7e-beeb-4695-9196-0c33a34de454-config-volume\") pod \"dns-default-9vg62\" (UID: \"13a6aa7e-beeb-4695-9196-0c33a34de454\") " pod="openshift-dns/dns-default-9vg62"
Apr 17 16:31:54.572989 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.572954 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls\") pod \"dns-default-9vg62\" (UID: \"13a6aa7e-beeb-4695-9196-0c33a34de454\") " pod="openshift-dns/dns-default-9vg62"
Apr 17 16:31:54.573132 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.573067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfhmk\" (UniqueName: \"kubernetes.io/projected/deee814d-1106-4662-bd60-3fc176424f4c-kube-api-access-xfhmk\") pod \"ingress-canary-k5nlz\" (UID: \"deee814d-1106-4662-bd60-3fc176424f4c\") " pod="openshift-ingress-canary/ingress-canary-k5nlz"
Apr 17 16:31:54.573192 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.573167 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxt4n\" (UniqueName: \"kubernetes.io/projected/13a6aa7e-beeb-4695-9196-0c33a34de454-kube-api-access-fxt4n\") pod \"dns-default-9vg62\" (UID: \"13a6aa7e-beeb-4695-9196-0c33a34de454\") " pod="openshift-dns/dns-default-9vg62"
Apr 17 16:31:54.674030 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.673990 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert\") pod \"ingress-canary-k5nlz\" (UID: \"deee814d-1106-4662-bd60-3fc176424f4c\") " pod="openshift-ingress-canary/ingress-canary-k5nlz"
Apr 17 16:31:54.674030 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.674034 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13a6aa7e-beeb-4695-9196-0c33a34de454-config-volume\") pod \"dns-default-9vg62\" (UID: \"13a6aa7e-beeb-4695-9196-0c33a34de454\") " pod="openshift-dns/dns-default-9vg62"
Apr 17 16:31:54.674256 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.674052 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls\") pod \"dns-default-9vg62\" (UID: \"13a6aa7e-beeb-4695-9196-0c33a34de454\") " pod="openshift-dns/dns-default-9vg62"
Apr 17 16:31:54.674256 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.674090 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfhmk\" (UniqueName: \"kubernetes.io/projected/deee814d-1106-4662-bd60-3fc176424f4c-kube-api-access-xfhmk\") pod \"ingress-canary-k5nlz\" (UID: \"deee814d-1106-4662-bd60-3fc176424f4c\") " pod="openshift-ingress-canary/ingress-canary-k5nlz"
Apr 17 16:31:54.674256 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:54.674136 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:31:54.674256 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.674148 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxt4n\" (UniqueName: \"kubernetes.io/projected/13a6aa7e-beeb-4695-9196-0c33a34de454-kube-api-access-fxt4n\") pod \"dns-default-9vg62\" (UID: \"13a6aa7e-beeb-4695-9196-0c33a34de454\") " pod="openshift-dns/dns-default-9vg62"
Apr 17 16:31:54.674256 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:54.674155 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:31:54.674256 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.674176 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/13a6aa7e-beeb-4695-9196-0c33a34de454-tmp-dir\") pod \"dns-default-9vg62\" (UID: \"13a6aa7e-beeb-4695-9196-0c33a34de454\") " pod="openshift-dns/dns-default-9vg62"
Apr 17 16:31:54.674256 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:54.674216 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert podName:deee814d-1106-4662-bd60-3fc176424f4c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:55.174194822 +0000 UTC m=+34.639377662 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert") pod "ingress-canary-k5nlz" (UID: "deee814d-1106-4662-bd60-3fc176424f4c") : secret "canary-serving-cert" not found
Apr 17 16:31:54.674580 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:54.674272 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls podName:13a6aa7e-beeb-4695-9196-0c33a34de454 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:55.174261001 +0000 UTC m=+34.639443840 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls") pod "dns-default-9vg62" (UID: "13a6aa7e-beeb-4695-9196-0c33a34de454") : secret "dns-default-metrics-tls" not found
Apr 17 16:31:54.674580 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.674462 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/13a6aa7e-beeb-4695-9196-0c33a34de454-tmp-dir\") pod \"dns-default-9vg62\" (UID: \"13a6aa7e-beeb-4695-9196-0c33a34de454\") " pod="openshift-dns/dns-default-9vg62"
Apr 17 16:31:54.674711 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.674693 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13a6aa7e-beeb-4695-9196-0c33a34de454-config-volume\") pod \"dns-default-9vg62\" (UID: \"13a6aa7e-beeb-4695-9196-0c33a34de454\") " pod="openshift-dns/dns-default-9vg62"
Apr 17 16:31:54.684800 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.684769 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxt4n\" (UniqueName: \"kubernetes.io/projected/13a6aa7e-beeb-4695-9196-0c33a34de454-kube-api-access-fxt4n\") pod \"dns-default-9vg62\" (UID: \"13a6aa7e-beeb-4695-9196-0c33a34de454\") " pod="openshift-dns/dns-default-9vg62"
Apr 17 16:31:54.684947 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.684807 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfhmk\" (UniqueName: \"kubernetes.io/projected/deee814d-1106-4662-bd60-3fc176424f4c-kube-api-access-xfhmk\") pod \"ingress-canary-k5nlz\" (UID: \"deee814d-1106-4662-bd60-3fc176424f4c\") " pod="openshift-ingress-canary/ingress-canary-k5nlz"
Apr 17 16:31:54.775018 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.774982 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\"
(UniqueName: \"kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs\") pod \"network-metrics-daemon-jn9vc\" (UID: \"381620fe-22c1-43ed-99de-f61c1b4886ca\") " pod="openshift-multus/network-metrics-daemon-jn9vc" Apr 17 16:31:54.775181 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:54.775042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6svm9\" (UniqueName: \"kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9\") pod \"network-check-target-w2dvt\" (UID: \"a93503d7-3f06-47f2-95b0-7c4d48999d28\") " pod="openshift-network-diagnostics/network-check-target-w2dvt" Apr 17 16:31:54.775181 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:54.775134 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:54.775181 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:54.775171 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:54.775302 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:54.775185 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:54.775302 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:54.775198 2576 projected.go:194] Error preparing data for projected volume kube-api-access-6svm9 for pod openshift-network-diagnostics/network-check-target-w2dvt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:54.775302 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:54.775233 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs podName:381620fe-22c1-43ed-99de-f61c1b4886ca nodeName:}" failed. No retries permitted until 2026-04-17 16:32:26.77518416 +0000 UTC m=+66.240367013 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs") pod "network-metrics-daemon-jn9vc" (UID: "381620fe-22c1-43ed-99de-f61c1b4886ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:54.775302 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:54.775252 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9 podName:a93503d7-3f06-47f2-95b0-7c4d48999d28 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:26.775242519 +0000 UTC m=+66.240425359 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-6svm9" (UniqueName: "kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9") pod "network-check-target-w2dvt" (UID: "a93503d7-3f06-47f2-95b0-7c4d48999d28") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:55.178084 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:55.177925 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert\") pod \"ingress-canary-k5nlz\" (UID: \"deee814d-1106-4662-bd60-3fc176424f4c\") " pod="openshift-ingress-canary/ingress-canary-k5nlz" Apr 17 16:31:55.178423 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:55.178091 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:55.178423 ip-10-0-134-142 
kubenswrapper[2576]: E0417 16:31:55.178162 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert podName:deee814d-1106-4662-bd60-3fc176424f4c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:56.178140495 +0000 UTC m=+35.643323335 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert") pod "ingress-canary-k5nlz" (UID: "deee814d-1106-4662-bd60-3fc176424f4c") : secret "canary-serving-cert" not found Apr 17 16:31:55.178423 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:55.178099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls\") pod \"dns-default-9vg62\" (UID: \"13a6aa7e-beeb-4695-9196-0c33a34de454\") " pod="openshift-dns/dns-default-9vg62" Apr 17 16:31:55.178423 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:55.178163 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:55.178423 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:55.178282 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls podName:13a6aa7e-beeb-4695-9196-0c33a34de454 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:56.178259758 +0000 UTC m=+35.643442604 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls") pod "dns-default-9vg62" (UID: "13a6aa7e-beeb-4695-9196-0c33a34de454") : secret "dns-default-metrics-tls" not found Apr 17 16:31:55.305456 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:55.305421 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jkkg4" event={"ID":"72324db7-6720-4fe7-a0fc-23dee216601e","Type":"ContainerStarted","Data":"6fe17918c26d33b8e78c9c078fab00e7dcec4a1d51da3b5a943c966602a78342"} Apr 17 16:31:56.138073 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:56.138037 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9vc" Apr 17 16:31:56.138276 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:56.138037 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2dvt" Apr 17 16:31:56.142384 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:56.142362 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 16:31:56.142548 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:56.142382 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 16:31:56.142548 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:56.142365 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tvdp2\"" Apr 17 16:31:56.142548 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:56.142413 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qkztw\"" Apr 17 16:31:56.142724 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:56.142550 
2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 16:31:56.185778 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:56.185745 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert\") pod \"ingress-canary-k5nlz\" (UID: \"deee814d-1106-4662-bd60-3fc176424f4c\") " pod="openshift-ingress-canary/ingress-canary-k5nlz" Apr 17 16:31:56.185778 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:56.185783 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls\") pod \"dns-default-9vg62\" (UID: \"13a6aa7e-beeb-4695-9196-0c33a34de454\") " pod="openshift-dns/dns-default-9vg62" Apr 17 16:31:56.186161 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:56.185876 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:56.186161 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:56.185882 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:56.186161 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:56.185934 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls podName:13a6aa7e-beeb-4695-9196-0c33a34de454 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:58.18591943 +0000 UTC m=+37.651102269 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls") pod "dns-default-9vg62" (UID: "13a6aa7e-beeb-4695-9196-0c33a34de454") : secret "dns-default-metrics-tls" not found Apr 17 16:31:56.186161 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:56.185947 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert podName:deee814d-1106-4662-bd60-3fc176424f4c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:58.185940324 +0000 UTC m=+37.651123162 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert") pod "ingress-canary-k5nlz" (UID: "deee814d-1106-4662-bd60-3fc176424f4c") : secret "canary-serving-cert" not found Apr 17 16:31:56.309653 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:56.309625 2576 generic.go:358] "Generic (PLEG): container finished" podID="72324db7-6720-4fe7-a0fc-23dee216601e" containerID="6fe17918c26d33b8e78c9c078fab00e7dcec4a1d51da3b5a943c966602a78342" exitCode=0 Apr 17 16:31:56.309800 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:56.309668 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jkkg4" event={"ID":"72324db7-6720-4fe7-a0fc-23dee216601e","Type":"ContainerDied","Data":"6fe17918c26d33b8e78c9c078fab00e7dcec4a1d51da3b5a943c966602a78342"} Apr 17 16:31:57.314031 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:57.313997 2576 generic.go:358] "Generic (PLEG): container finished" podID="72324db7-6720-4fe7-a0fc-23dee216601e" containerID="903c585fbad0b20617543eda5f7f57f4ab302ddfe4ea960622b91b6a94c66449" exitCode=0 Apr 17 16:31:57.314552 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:57.314061 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jkkg4" 
event={"ID":"72324db7-6720-4fe7-a0fc-23dee216601e","Type":"ContainerDied","Data":"903c585fbad0b20617543eda5f7f57f4ab302ddfe4ea960622b91b6a94c66449"} Apr 17 16:31:58.199313 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:58.199278 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert\") pod \"ingress-canary-k5nlz\" (UID: \"deee814d-1106-4662-bd60-3fc176424f4c\") " pod="openshift-ingress-canary/ingress-canary-k5nlz" Apr 17 16:31:58.199313 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:58.199317 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls\") pod \"dns-default-9vg62\" (UID: \"13a6aa7e-beeb-4695-9196-0c33a34de454\") " pod="openshift-dns/dns-default-9vg62" Apr 17 16:31:58.199555 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:58.199419 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:58.199555 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:58.199483 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert podName:deee814d-1106-4662-bd60-3fc176424f4c nodeName:}" failed. No retries permitted until 2026-04-17 16:32:02.19946813 +0000 UTC m=+41.664650974 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert") pod "ingress-canary-k5nlz" (UID: "deee814d-1106-4662-bd60-3fc176424f4c") : secret "canary-serving-cert" not found Apr 17 16:31:58.199555 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:58.199419 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:58.199555 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:31:58.199544 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls podName:13a6aa7e-beeb-4695-9196-0c33a34de454 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:02.199533702 +0000 UTC m=+41.664716541 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls") pod "dns-default-9vg62" (UID: "13a6aa7e-beeb-4695-9196-0c33a34de454") : secret "dns-default-metrics-tls" not found Apr 17 16:31:58.318707 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:58.318669 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jkkg4" event={"ID":"72324db7-6720-4fe7-a0fc-23dee216601e","Type":"ContainerStarted","Data":"24f32bc7a895a7c69fd439d9e339d4897b2a390efcb0bddef3191ad701d43a99"} Apr 17 16:31:58.342617 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:31:58.342566 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jkkg4" podStartSLOduration=6.179682627 podStartE2EDuration="37.342550469s" podCreationTimestamp="2026-04-17 16:31:21 +0000 UTC" firstStartedPulling="2026-04-17 16:31:23.852751335 +0000 UTC m=+3.317934184" lastFinishedPulling="2026-04-17 16:31:55.015619174 +0000 UTC m=+34.480802026" observedRunningTime="2026-04-17 16:31:58.341989498 +0000 UTC 
m=+37.807172363" watchObservedRunningTime="2026-04-17 16:31:58.342550469 +0000 UTC m=+37.807733327" Apr 17 16:32:02.227510 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:02.227452 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert\") pod \"ingress-canary-k5nlz\" (UID: \"deee814d-1106-4662-bd60-3fc176424f4c\") " pod="openshift-ingress-canary/ingress-canary-k5nlz" Apr 17 16:32:02.227510 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:02.227516 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls\") pod \"dns-default-9vg62\" (UID: \"13a6aa7e-beeb-4695-9196-0c33a34de454\") " pod="openshift-dns/dns-default-9vg62" Apr 17 16:32:02.227957 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:32:02.227604 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:02.227957 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:32:02.227672 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert podName:deee814d-1106-4662-bd60-3fc176424f4c nodeName:}" failed. No retries permitted until 2026-04-17 16:32:10.227654664 +0000 UTC m=+49.692837503 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert") pod "ingress-canary-k5nlz" (UID: "deee814d-1106-4662-bd60-3fc176424f4c") : secret "canary-serving-cert" not found Apr 17 16:32:02.227957 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:32:02.227604 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:02.227957 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:32:02.227707 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls podName:13a6aa7e-beeb-4695-9196-0c33a34de454 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:10.227699281 +0000 UTC m=+49.692882120 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls") pod "dns-default-9vg62" (UID: "13a6aa7e-beeb-4695-9196-0c33a34de454") : secret "dns-default-metrics-tls" not found Apr 17 16:32:10.285313 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:10.285270 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert\") pod \"ingress-canary-k5nlz\" (UID: \"deee814d-1106-4662-bd60-3fc176424f4c\") " pod="openshift-ingress-canary/ingress-canary-k5nlz" Apr 17 16:32:10.285313 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:10.285313 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls\") pod \"dns-default-9vg62\" (UID: \"13a6aa7e-beeb-4695-9196-0c33a34de454\") " pod="openshift-dns/dns-default-9vg62" Apr 17 16:32:10.285939 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:32:10.285436 2576 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:10.285939 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:32:10.285441 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:10.285939 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:32:10.285523 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls podName:13a6aa7e-beeb-4695-9196-0c33a34de454 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:26.285487178 +0000 UTC m=+65.750670019 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls") pod "dns-default-9vg62" (UID: "13a6aa7e-beeb-4695-9196-0c33a34de454") : secret "dns-default-metrics-tls" not found Apr 17 16:32:10.285939 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:32:10.285538 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert podName:deee814d-1106-4662-bd60-3fc176424f4c nodeName:}" failed. No retries permitted until 2026-04-17 16:32:26.285530966 +0000 UTC m=+65.750713805 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert") pod "ingress-canary-k5nlz" (UID: "deee814d-1106-4662-bd60-3fc176424f4c") : secret "canary-serving-cert" not found Apr 17 16:32:18.298087 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:18.298059 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-twl5m" Apr 17 16:32:26.298355 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:26.298322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls\") pod \"dns-default-9vg62\" (UID: \"13a6aa7e-beeb-4695-9196-0c33a34de454\") " pod="openshift-dns/dns-default-9vg62" Apr 17 16:32:26.298711 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:26.298424 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert\") pod \"ingress-canary-k5nlz\" (UID: \"deee814d-1106-4662-bd60-3fc176424f4c\") " pod="openshift-ingress-canary/ingress-canary-k5nlz" Apr 17 16:32:26.298711 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:32:26.298472 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:26.298711 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:32:26.298510 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:26.298711 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:32:26.298558 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls podName:13a6aa7e-beeb-4695-9196-0c33a34de454 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:32:58.298538101 +0000 UTC m=+97.763720955 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls") pod "dns-default-9vg62" (UID: "13a6aa7e-beeb-4695-9196-0c33a34de454") : secret "dns-default-metrics-tls" not found Apr 17 16:32:26.298711 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:32:26.298572 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert podName:deee814d-1106-4662-bd60-3fc176424f4c nodeName:}" failed. No retries permitted until 2026-04-17 16:32:58.298566406 +0000 UTC m=+97.763749245 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert") pod "ingress-canary-k5nlz" (UID: "deee814d-1106-4662-bd60-3fc176424f4c") : secret "canary-serving-cert" not found Apr 17 16:32:26.802408 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:26.802364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs\") pod \"network-metrics-daemon-jn9vc\" (UID: \"381620fe-22c1-43ed-99de-f61c1b4886ca\") " pod="openshift-multus/network-metrics-daemon-jn9vc" Apr 17 16:32:26.802606 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:26.802425 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6svm9\" (UniqueName: \"kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9\") pod \"network-check-target-w2dvt\" (UID: \"a93503d7-3f06-47f2-95b0-7c4d48999d28\") " pod="openshift-network-diagnostics/network-check-target-w2dvt" Apr 17 16:32:26.805730 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:26.805708 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 16:32:26.805835 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:26.805753 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 16:32:26.812927 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:32:26.812910 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 16:32:26.812989 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:32:26.812979 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs podName:381620fe-22c1-43ed-99de-f61c1b4886ca nodeName:}" failed. No retries permitted until 2026-04-17 16:33:30.81295994 +0000 UTC m=+130.278142781 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs") pod "network-metrics-daemon-jn9vc" (UID: "381620fe-22c1-43ed-99de-f61c1b4886ca") : secret "metrics-daemon-secret" not found Apr 17 16:32:26.815171 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:26.815158 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 16:32:26.826351 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:26.826323 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6svm9\" (UniqueName: \"kubernetes.io/projected/a93503d7-3f06-47f2-95b0-7c4d48999d28-kube-api-access-6svm9\") pod \"network-check-target-w2dvt\" (UID: \"a93503d7-3f06-47f2-95b0-7c4d48999d28\") " pod="openshift-network-diagnostics/network-check-target-w2dvt" Apr 17 16:32:27.056244 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:27.056169 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tvdp2\""
Apr 17 16:32:27.064177 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:27.064159 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2dvt"
Apr 17 16:32:27.185653 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:27.185624 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-w2dvt"]
Apr 17 16:32:27.189316 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:32:27.189288 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda93503d7_3f06_47f2_95b0_7c4d48999d28.slice/crio-c206e7a0d0ba16ac9b56bc29ccacfd3d8cbf9eb71a04bd81117bf088e4ba124b WatchSource:0}: Error finding container c206e7a0d0ba16ac9b56bc29ccacfd3d8cbf9eb71a04bd81117bf088e4ba124b: Status 404 returned error can't find the container with id c206e7a0d0ba16ac9b56bc29ccacfd3d8cbf9eb71a04bd81117bf088e4ba124b
Apr 17 16:32:27.375290 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:27.375208 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-w2dvt" event={"ID":"a93503d7-3f06-47f2-95b0-7c4d48999d28","Type":"ContainerStarted","Data":"c206e7a0d0ba16ac9b56bc29ccacfd3d8cbf9eb71a04bd81117bf088e4ba124b"}
Apr 17 16:32:30.382982 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:30.382938 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-w2dvt" event={"ID":"a93503d7-3f06-47f2-95b0-7c4d48999d28","Type":"ContainerStarted","Data":"3487ae985d23e2fab8154e68ad0fab8f1919c8e55d8d074bd816634ae30607f2"}
Apr 17 16:32:30.383363 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:30.383068 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-w2dvt"
Apr 17 16:32:30.400336 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:30.400280 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-w2dvt" podStartSLOduration=66.80772631 podStartE2EDuration="1m9.400265793s" podCreationTimestamp="2026-04-17 16:31:21 +0000 UTC" firstStartedPulling="2026-04-17 16:32:27.191038052 +0000 UTC m=+66.656220892" lastFinishedPulling="2026-04-17 16:32:29.783577534 +0000 UTC m=+69.248760375" observedRunningTime="2026-04-17 16:32:30.399554302 +0000 UTC m=+69.864737165" watchObservedRunningTime="2026-04-17 16:32:30.400265793 +0000 UTC m=+69.865448679"
Apr 17 16:32:58.321791 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:58.321728 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert\") pod \"ingress-canary-k5nlz\" (UID: \"deee814d-1106-4662-bd60-3fc176424f4c\") " pod="openshift-ingress-canary/ingress-canary-k5nlz"
Apr 17 16:32:58.321791 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:32:58.321796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls\") pod \"dns-default-9vg62\" (UID: \"13a6aa7e-beeb-4695-9196-0c33a34de454\") " pod="openshift-dns/dns-default-9vg62"
Apr 17 16:32:58.322316 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:32:58.321882 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:32:58.322316 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:32:58.321907 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:32:58.322316 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:32:58.321962 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert podName:deee814d-1106-4662-bd60-3fc176424f4c nodeName:}" failed. No retries permitted until 2026-04-17 16:34:02.32194579 +0000 UTC m=+161.787128629 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert") pod "ingress-canary-k5nlz" (UID: "deee814d-1106-4662-bd60-3fc176424f4c") : secret "canary-serving-cert" not found
Apr 17 16:32:58.322316 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:32:58.321977 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls podName:13a6aa7e-beeb-4695-9196-0c33a34de454 nodeName:}" failed. No retries permitted until 2026-04-17 16:34:02.32197067 +0000 UTC m=+161.787153508 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls") pod "dns-default-9vg62" (UID: "13a6aa7e-beeb-4695-9196-0c33a34de454") : secret "dns-default-metrics-tls" not found
Apr 17 16:33:00.338041 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.338010 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vngwf"]
Apr 17 16:33:00.342207 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.342190 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vngwf"
Apr 17 16:33:00.344707 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.344686 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 17 16:33:00.344899 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.344884 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 17 16:33:00.344956 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.344885 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:33:00.344956 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.344885 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-rcwtr\""
Apr 17 16:33:00.347546 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.347523 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vngwf"]
Apr 17 16:33:00.435262 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.435227 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c5c7595-87f8-485a-9f46-0e2372f5ceed-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vngwf\" (UID: \"8c5c7595-87f8-485a-9f46-0e2372f5ceed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vngwf"
Apr 17 16:33:00.435439 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.435271 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn7fw\" (UniqueName: \"kubernetes.io/projected/8c5c7595-87f8-485a-9f46-0e2372f5ceed-kube-api-access-bn7fw\") pod \"cluster-samples-operator-6dc5bdb6b4-vngwf\" (UID: \"8c5c7595-87f8-485a-9f46-0e2372f5ceed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vngwf"
Apr 17 16:33:00.451809 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.451713 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7bf9cdf774-5ntqx"]
Apr 17 16:33:00.454487 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.454468 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7bf9cdf774-5ntqx"
Apr 17 16:33:00.457173 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.457154 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 17 16:33:00.457285 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.457220 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 17 16:33:00.457430 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.457405 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 17 16:33:00.457563 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.457448 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-2m6bp\""
Apr 17 16:33:00.457563 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.457472 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 16:33:00.457563 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.457481 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 16:33:00.457563 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.457520 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 17 16:33:00.468202 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.468184 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7bf9cdf774-5ntqx"]
Apr 17 16:33:00.535918 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.535890 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e0df564-3d93-4f67-a3bd-4157590025f5-service-ca-bundle\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx"
Apr 17 16:33:00.535918 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.535922 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-metrics-certs\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx"
Apr 17 16:33:00.536110 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.535950 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c5c7595-87f8-485a-9f46-0e2372f5ceed-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vngwf\" (UID: \"8c5c7595-87f8-485a-9f46-0e2372f5ceed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vngwf"
Apr 17 16:33:00.536110 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:00.536026 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 16:33:00.536110 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:00.536083 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c5c7595-87f8-485a-9f46-0e2372f5ceed-samples-operator-tls podName:8c5c7595-87f8-485a-9f46-0e2372f5ceed nodeName:}" failed. No retries permitted until 2026-04-17 16:33:01.036068469 +0000 UTC m=+100.501251308 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8c5c7595-87f8-485a-9f46-0e2372f5ceed-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vngwf" (UID: "8c5c7595-87f8-485a-9f46-0e2372f5ceed") : secret "samples-operator-tls" not found
Apr 17 16:33:00.536110 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.536096 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rvkh\" (UniqueName: \"kubernetes.io/projected/0e0df564-3d93-4f67-a3bd-4157590025f5-kube-api-access-9rvkh\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx"
Apr 17 16:33:00.536237 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.536115 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bn7fw\" (UniqueName: \"kubernetes.io/projected/8c5c7595-87f8-485a-9f46-0e2372f5ceed-kube-api-access-bn7fw\") pod \"cluster-samples-operator-6dc5bdb6b4-vngwf\" (UID: \"8c5c7595-87f8-485a-9f46-0e2372f5ceed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vngwf"
Apr 17 16:33:00.536237 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.536133 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-default-certificate\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx"
Apr 17 16:33:00.536237 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.536189 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-stats-auth\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx"
Apr 17 16:33:00.544420 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.544396 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-98qjc"]
Apr 17 16:33:00.547120 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.547105 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-98qjc"
Apr 17 16:33:00.552454 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.551919 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-xszbn\""
Apr 17 16:33:00.556720 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.556695 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ctq4m"]
Apr 17 16:33:00.559214 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.559196 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-s2pgl"]
Apr 17 16:33:00.559375 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.559357 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ctq4m"
Apr 17 16:33:00.561726 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.561709 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5d9bf8cf54-j9p76"]
Apr 17 16:33:00.561858 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.561842 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2pgl"
Apr 17 16:33:00.564200 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.564177 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 17 16:33:00.564491 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.564471 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-98qjc"]
Apr 17 16:33:00.564620 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.564605 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76"
Apr 17 16:33:00.564915 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.564894 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 17 16:33:00.565042 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.565028 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-zvc5w\""
Apr 17 16:33:00.565130 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.565111 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 17 16:33:00.565671 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.565652 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn7fw\" (UniqueName: \"kubernetes.io/projected/8c5c7595-87f8-485a-9f46-0e2372f5ceed-kube-api-access-bn7fw\") pod \"cluster-samples-operator-6dc5bdb6b4-vngwf\" (UID: \"8c5c7595-87f8-485a-9f46-0e2372f5ceed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vngwf"
Apr 17 16:33:00.567631 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.567614 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:33:00.570279 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.570256 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 16:33:00.570387 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.570265 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 17 16:33:00.572656 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.572638 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 16:33:00.572991 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.572969 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 17 16:33:00.573063 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.573022 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 16:33:00.573928 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.573909 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-w6hp7\""
Apr 17 16:33:00.575545 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.575525 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-n2g2q\""
Apr 17 16:33:00.578786 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.578765 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-s2pgl"]
Apr 17 16:33:00.588826 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.588778 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ctq4m"]
Apr 17 16:33:00.596939 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.596921 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 16:33:00.624764 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.624736 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d9bf8cf54-j9p76"]
Apr 17 16:33:00.637205 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.637176 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jpsl\" (UniqueName: \"kubernetes.io/projected/e3a64c88-6bbb-4c15-a090-bf22f9988ce1-kube-api-access-5jpsl\") pod \"network-check-source-8894fc9bd-98qjc\" (UID: \"e3a64c88-6bbb-4c15-a090-bf22f9988ce1\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-98qjc"
Apr 17 16:33:00.637205 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.637207 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90c4edf8-ec15-4ea5-b1c6-ede355520321-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-ctq4m\" (UID: \"90c4edf8-ec15-4ea5-b1c6-ede355520321\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ctq4m"
Apr 17 16:33:00.637399 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.637225 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bdecf6bd-ce4a-4be2-8117-e763391f52a2-ca-trust-extracted\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76"
Apr 17 16:33:00.637399 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.637248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rvkh\" (UniqueName: \"kubernetes.io/projected/0e0df564-3d93-4f67-a3bd-4157590025f5-kube-api-access-9rvkh\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx"
Apr 17 16:33:00.637399 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.637265 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bdecf6bd-ce4a-4be2-8117-e763391f52a2-trusted-ca\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76"
Apr 17 16:33:00.637399 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.637291 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-default-certificate\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx"
Apr 17 16:33:00.637620 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.637441 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-stats-auth\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx"
Apr 17 16:33:00.637620 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.637479 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90c4edf8-ec15-4ea5-b1c6-ede355520321-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-ctq4m\" (UID: \"90c4edf8-ec15-4ea5-b1c6-ede355520321\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ctq4m"
Apr 17 16:33:00.637620 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.637556 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xchrd\" (UniqueName: \"kubernetes.io/projected/90c4edf8-ec15-4ea5-b1c6-ede355520321-kube-api-access-xchrd\") pod \"kube-storage-version-migrator-operator-6769c5d45-ctq4m\" (UID: \"90c4edf8-ec15-4ea5-b1c6-ede355520321\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ctq4m"
Apr 17 16:33:00.637620 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.637586 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a1196930-a0cd-4cbf-9129-12a1bd8f7941-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-s2pgl\" (UID: \"a1196930-a0cd-4cbf-9129-12a1bd8f7941\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2pgl"
Apr 17 16:33:00.637620 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.637614 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bdecf6bd-ce4a-4be2-8117-e763391f52a2-image-registry-private-configuration\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76"
Apr 17 16:33:00.637893 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.637639 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-bound-sa-token\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76"
Apr 17 16:33:00.637893 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.637670 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s2pgl\" (UID: \"a1196930-a0cd-4cbf-9129-12a1bd8f7941\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2pgl"
Apr 17 16:33:00.637893 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.637701 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e0df564-3d93-4f67-a3bd-4157590025f5-service-ca-bundle\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx"
Apr 17 16:33:00.637893 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.637718 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mklph\" (UniqueName: \"kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-kube-api-access-mklph\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76"
Apr 17 16:33:00.637893 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.637741 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-metrics-certs\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx"
Apr 17 16:33:00.637893 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.637767 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76"
Apr 17 16:33:00.637893 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:00.637826 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0e0df564-3d93-4f67-a3bd-4157590025f5-service-ca-bundle podName:0e0df564-3d93-4f67-a3bd-4157590025f5 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:01.137805431 +0000 UTC m=+100.602988294 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0e0df564-3d93-4f67-a3bd-4157590025f5-service-ca-bundle") pod "router-default-7bf9cdf774-5ntqx" (UID: "0e0df564-3d93-4f67-a3bd-4157590025f5") : configmap references non-existent config key: service-ca.crt
Apr 17 16:33:00.637893 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:00.637878 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 16:33:00.638182 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.637900 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-certificates\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76"
Apr 17 16:33:00.638182 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:00.637935 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-metrics-certs podName:0e0df564-3d93-4f67-a3bd-4157590025f5 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:01.137917021 +0000 UTC m=+100.603099870 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-metrics-certs") pod "router-default-7bf9cdf774-5ntqx" (UID: "0e0df564-3d93-4f67-a3bd-4157590025f5") : secret "router-metrics-certs-default" not found
Apr 17 16:33:00.638182 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.637978 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bdecf6bd-ce4a-4be2-8117-e763391f52a2-installation-pull-secrets\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76"
Apr 17 16:33:00.639917 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.639897 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-default-certificate\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx"
Apr 17 16:33:00.640051 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.639918 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-stats-auth\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx"
Apr 17 16:33:00.647121 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.647098 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rvkh\" (UniqueName: \"kubernetes.io/projected/0e0df564-3d93-4f67-a3bd-4157590025f5-kube-api-access-9rvkh\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx"
Apr 17 16:33:00.738624 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.738587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90c4edf8-ec15-4ea5-b1c6-ede355520321-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-ctq4m\" (UID: \"90c4edf8-ec15-4ea5-b1c6-ede355520321\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ctq4m"
Apr 17 16:33:00.738624 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.738634 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xchrd\" (UniqueName: \"kubernetes.io/projected/90c4edf8-ec15-4ea5-b1c6-ede355520321-kube-api-access-xchrd\") pod \"kube-storage-version-migrator-operator-6769c5d45-ctq4m\" (UID: \"90c4edf8-ec15-4ea5-b1c6-ede355520321\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ctq4m"
Apr 17 16:33:00.738852 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.738656 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a1196930-a0cd-4cbf-9129-12a1bd8f7941-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-s2pgl\" (UID: \"a1196930-a0cd-4cbf-9129-12a1bd8f7941\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2pgl"
Apr 17 16:33:00.738852 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.738682 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bdecf6bd-ce4a-4be2-8117-e763391f52a2-image-registry-private-configuration\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76"
Apr 17 16:33:00.738852 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.738707 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-bound-sa-token\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76"
Apr 17 16:33:00.738959 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.738926 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s2pgl\" (UID: \"a1196930-a0cd-4cbf-9129-12a1bd8f7941\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2pgl"
Apr 17 16:33:00.739036 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.739006 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mklph\" (UniqueName: \"kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-kube-api-access-mklph\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76"
Apr 17 16:33:00.739177 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:00.739060 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 16:33:00.739177 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:00.739120 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert podName:a1196930-a0cd-4cbf-9129-12a1bd8f7941 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:01.239105341 +0000 UTC m=+100.704288186 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s2pgl" (UID: "a1196930-a0cd-4cbf-9129-12a1bd8f7941") : secret "networking-console-plugin-cert" not found
Apr 17 16:33:00.739177 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:00.739131 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 16:33:00.739177 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:00.739146 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d9bf8cf54-j9p76: secret "image-registry-tls" not found
Apr 17 16:33:00.739177 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.739062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76"
Apr 17 16:33:00.739445 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:00.739200 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls podName:bdecf6bd-ce4a-4be2-8117-e763391f52a2 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:01.239184319 +0000 UTC m=+100.704367157 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls") pod "image-registry-5d9bf8cf54-j9p76" (UID: "bdecf6bd-ce4a-4be2-8117-e763391f52a2") : secret "image-registry-tls" not found
Apr 17 16:33:00.739445 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.739223 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-certificates\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76"
Apr 17 16:33:00.739445 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.739251 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90c4edf8-ec15-4ea5-b1c6-ede355520321-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-ctq4m\" (UID: \"90c4edf8-ec15-4ea5-b1c6-ede355520321\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ctq4m"
Apr 17 16:33:00.739445 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.739254 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bdecf6bd-ce4a-4be2-8117-e763391f52a2-installation-pull-secrets\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76"
Apr 17 16:33:00.739445 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.739328 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jpsl\" (UniqueName: \"kubernetes.io/projected/e3a64c88-6bbb-4c15-a090-bf22f9988ce1-kube-api-access-5jpsl\") pod \"network-check-source-8894fc9bd-98qjc\" (UID: \"e3a64c88-6bbb-4c15-a090-bf22f9988ce1\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-98qjc"
Apr 17 16:33:00.739445 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.739355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90c4edf8-ec15-4ea5-b1c6-ede355520321-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-ctq4m\" (UID: \"90c4edf8-ec15-4ea5-b1c6-ede355520321\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ctq4m"
Apr 17 16:33:00.739445 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.739371 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bdecf6bd-ce4a-4be2-8117-e763391f52a2-ca-trust-extracted\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76"
Apr 17 16:33:00.739445 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.739391 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bdecf6bd-ce4a-4be2-8117-e763391f52a2-trusted-ca\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76"
Apr 17 16:33:00.739823 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.739476 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a1196930-a0cd-4cbf-9129-12a1bd8f7941-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-s2pgl\" (UID: \"a1196930-a0cd-4cbf-9129-12a1bd8f7941\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2pgl"
Apr 17 16:33:00.739876 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.739842 2576
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bdecf6bd-ce4a-4be2-8117-e763391f52a2-ca-trust-extracted\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" Apr 17 16:33:00.740035 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.740013 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-certificates\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" Apr 17 16:33:00.740591 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.740561 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bdecf6bd-ce4a-4be2-8117-e763391f52a2-trusted-ca\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" Apr 17 16:33:00.741718 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.741692 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90c4edf8-ec15-4ea5-b1c6-ede355520321-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-ctq4m\" (UID: \"90c4edf8-ec15-4ea5-b1c6-ede355520321\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ctq4m" Apr 17 16:33:00.741862 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.741842 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bdecf6bd-ce4a-4be2-8117-e763391f52a2-installation-pull-secrets\") pod 
\"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" Apr 17 16:33:00.741907 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.741881 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bdecf6bd-ce4a-4be2-8117-e763391f52a2-image-registry-private-configuration\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" Apr 17 16:33:00.748842 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.748822 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-bound-sa-token\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" Apr 17 16:33:00.748959 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.748941 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jpsl\" (UniqueName: \"kubernetes.io/projected/e3a64c88-6bbb-4c15-a090-bf22f9988ce1-kube-api-access-5jpsl\") pod \"network-check-source-8894fc9bd-98qjc\" (UID: \"e3a64c88-6bbb-4c15-a090-bf22f9988ce1\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-98qjc" Apr 17 16:33:00.750088 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.750060 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xchrd\" (UniqueName: \"kubernetes.io/projected/90c4edf8-ec15-4ea5-b1c6-ede355520321-kube-api-access-xchrd\") pod \"kube-storage-version-migrator-operator-6769c5d45-ctq4m\" (UID: \"90c4edf8-ec15-4ea5-b1c6-ede355520321\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ctq4m" Apr 17 
16:33:00.750183 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.750165 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mklph\" (UniqueName: \"kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-kube-api-access-mklph\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" Apr 17 16:33:00.858430 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.858329 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-98qjc" Apr 17 16:33:00.874133 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:00.874106 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ctq4m" Apr 17 16:33:01.004887 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:01.004857 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-98qjc"] Apr 17 16:33:01.008272 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:33:01.008242 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3a64c88_6bbb_4c15_a090_bf22f9988ce1.slice/crio-68f8424fc4c253843318efa4200f97ffecd81d361fcceb05fd512de5843a0d46 WatchSource:0}: Error finding container 68f8424fc4c253843318efa4200f97ffecd81d361fcceb05fd512de5843a0d46: Status 404 returned error can't find the container with id 68f8424fc4c253843318efa4200f97ffecd81d361fcceb05fd512de5843a0d46 Apr 17 16:33:01.018890 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:01.018870 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ctq4m"] Apr 17 16:33:01.022052 ip-10-0-134-142 
kubenswrapper[2576]: W0417 16:33:01.022029 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90c4edf8_ec15_4ea5_b1c6_ede355520321.slice/crio-f6c3e2ece56dba146666ab10f05989abd0e42a25823df4b46c1223f0bc5db1a9 WatchSource:0}: Error finding container f6c3e2ece56dba146666ab10f05989abd0e42a25823df4b46c1223f0bc5db1a9: Status 404 returned error can't find the container with id f6c3e2ece56dba146666ab10f05989abd0e42a25823df4b46c1223f0bc5db1a9 Apr 17 16:33:01.042855 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:01.042674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c5c7595-87f8-485a-9f46-0e2372f5ceed-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vngwf\" (UID: \"8c5c7595-87f8-485a-9f46-0e2372f5ceed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vngwf" Apr 17 16:33:01.042855 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:01.042773 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:33:01.042855 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:01.042847 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c5c7595-87f8-485a-9f46-0e2372f5ceed-samples-operator-tls podName:8c5c7595-87f8-485a-9f46-0e2372f5ceed nodeName:}" failed. No retries permitted until 2026-04-17 16:33:02.042827652 +0000 UTC m=+101.508010509 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8c5c7595-87f8-485a-9f46-0e2372f5ceed-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vngwf" (UID: "8c5c7595-87f8-485a-9f46-0e2372f5ceed") : secret "samples-operator-tls" not found Apr 17 16:33:01.143926 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:01.143842 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e0df564-3d93-4f67-a3bd-4157590025f5-service-ca-bundle\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx" Apr 17 16:33:01.143926 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:01.143877 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-metrics-certs\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx" Apr 17 16:33:01.144114 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:01.144012 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 16:33:01.144114 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:01.144025 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0e0df564-3d93-4f67-a3bd-4157590025f5-service-ca-bundle podName:0e0df564-3d93-4f67-a3bd-4157590025f5 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:02.14400743 +0000 UTC m=+101.609190285 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0e0df564-3d93-4f67-a3bd-4157590025f5-service-ca-bundle") pod "router-default-7bf9cdf774-5ntqx" (UID: "0e0df564-3d93-4f67-a3bd-4157590025f5") : configmap references non-existent config key: service-ca.crt Apr 17 16:33:01.144114 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:01.144048 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-metrics-certs podName:0e0df564-3d93-4f67-a3bd-4157590025f5 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:02.144037558 +0000 UTC m=+101.609220405 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-metrics-certs") pod "router-default-7bf9cdf774-5ntqx" (UID: "0e0df564-3d93-4f67-a3bd-4157590025f5") : secret "router-metrics-certs-default" not found Apr 17 16:33:01.244785 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:01.244749 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s2pgl\" (UID: \"a1196930-a0cd-4cbf-9129-12a1bd8f7941\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2pgl" Apr 17 16:33:01.244996 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:01.244841 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" Apr 17 16:33:01.244996 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:01.244936 2576 secret.go:189] Couldn't get 
secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 16:33:01.244996 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:01.244974 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:33:01.244996 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:01.244990 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d9bf8cf54-j9p76: secret "image-registry-tls" not found Apr 17 16:33:01.245189 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:01.245019 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert podName:a1196930-a0cd-4cbf-9129-12a1bd8f7941 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:02.244998413 +0000 UTC m=+101.710181255 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s2pgl" (UID: "a1196930-a0cd-4cbf-9129-12a1bd8f7941") : secret "networking-console-plugin-cert" not found Apr 17 16:33:01.245189 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:01.245039 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls podName:bdecf6bd-ce4a-4be2-8117-e763391f52a2 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:02.245029636 +0000 UTC m=+101.710212480 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls") pod "image-registry-5d9bf8cf54-j9p76" (UID: "bdecf6bd-ce4a-4be2-8117-e763391f52a2") : secret "image-registry-tls" not found Apr 17 16:33:01.388028 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:01.387996 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-w2dvt" Apr 17 16:33:01.444076 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:01.444031 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ctq4m" event={"ID":"90c4edf8-ec15-4ea5-b1c6-ede355520321","Type":"ContainerStarted","Data":"f6c3e2ece56dba146666ab10f05989abd0e42a25823df4b46c1223f0bc5db1a9"} Apr 17 16:33:01.445378 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:01.445351 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-98qjc" event={"ID":"e3a64c88-6bbb-4c15-a090-bf22f9988ce1","Type":"ContainerStarted","Data":"710a16c31b0b874a0c68703f43337b2d7ff4d136c1542b8ee38d7b224cad362d"} Apr 17 16:33:01.445549 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:01.445382 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-98qjc" event={"ID":"e3a64c88-6bbb-4c15-a090-bf22f9988ce1","Type":"ContainerStarted","Data":"68f8424fc4c253843318efa4200f97ffecd81d361fcceb05fd512de5843a0d46"} Apr 17 16:33:01.460791 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:01.460745 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-98qjc" podStartSLOduration=1.460729221 podStartE2EDuration="1.460729221s" podCreationTimestamp="2026-04-17 16:33:00 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:33:01.46003373 +0000 UTC m=+100.925216595" watchObservedRunningTime="2026-04-17 16:33:01.460729221 +0000 UTC m=+100.925912099" Apr 17 16:33:02.052161 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:02.052123 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c5c7595-87f8-485a-9f46-0e2372f5ceed-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vngwf\" (UID: \"8c5c7595-87f8-485a-9f46-0e2372f5ceed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vngwf" Apr 17 16:33:02.052355 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:02.052272 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:33:02.052355 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:02.052340 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c5c7595-87f8-485a-9f46-0e2372f5ceed-samples-operator-tls podName:8c5c7595-87f8-485a-9f46-0e2372f5ceed nodeName:}" failed. No retries permitted until 2026-04-17 16:33:04.052324348 +0000 UTC m=+103.517507211 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8c5c7595-87f8-485a-9f46-0e2372f5ceed-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vngwf" (UID: "8c5c7595-87f8-485a-9f46-0e2372f5ceed") : secret "samples-operator-tls" not found Apr 17 16:33:02.153372 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:02.153314 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-metrics-certs\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx" Apr 17 16:33:02.153584 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:02.153479 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e0df564-3d93-4f67-a3bd-4157590025f5-service-ca-bundle\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx" Apr 17 16:33:02.153584 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:02.153483 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 16:33:02.153714 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:02.153596 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-metrics-certs podName:0e0df564-3d93-4f67-a3bd-4157590025f5 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:04.153575674 +0000 UTC m=+103.618758516 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-metrics-certs") pod "router-default-7bf9cdf774-5ntqx" (UID: "0e0df564-3d93-4f67-a3bd-4157590025f5") : secret "router-metrics-certs-default" not found Apr 17 16:33:02.153714 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:02.153617 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0e0df564-3d93-4f67-a3bd-4157590025f5-service-ca-bundle podName:0e0df564-3d93-4f67-a3bd-4157590025f5 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:04.153607387 +0000 UTC m=+103.618790233 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0e0df564-3d93-4f67-a3bd-4157590025f5-service-ca-bundle") pod "router-default-7bf9cdf774-5ntqx" (UID: "0e0df564-3d93-4f67-a3bd-4157590025f5") : configmap references non-existent config key: service-ca.crt Apr 17 16:33:02.254387 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:02.254331 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" Apr 17 16:33:02.254627 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:02.254484 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:33:02.254627 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:02.254526 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d9bf8cf54-j9p76: secret "image-registry-tls" not found Apr 17 16:33:02.254627 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:02.254587 2576 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls podName:bdecf6bd-ce4a-4be2-8117-e763391f52a2 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:04.254567345 +0000 UTC m=+103.719750187 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls") pod "image-registry-5d9bf8cf54-j9p76" (UID: "bdecf6bd-ce4a-4be2-8117-e763391f52a2") : secret "image-registry-tls" not found Apr 17 16:33:02.254786 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:02.254669 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s2pgl\" (UID: \"a1196930-a0cd-4cbf-9129-12a1bd8f7941\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2pgl" Apr 17 16:33:02.254821 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:02.254805 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 16:33:02.254887 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:02.254873 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert podName:a1196930-a0cd-4cbf-9129-12a1bd8f7941 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:04.254856786 +0000 UTC m=+103.720039627 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s2pgl" (UID: "a1196930-a0cd-4cbf-9129-12a1bd8f7941") : secret "networking-console-plugin-cert" not found Apr 17 16:33:03.451258 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:03.451224 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ctq4m" event={"ID":"90c4edf8-ec15-4ea5-b1c6-ede355520321","Type":"ContainerStarted","Data":"90e84b2413b48eb00e5b6c852781097bf65a40ca238bce273a64f73209606055"} Apr 17 16:33:03.472841 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:03.470554 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ctq4m" podStartSLOduration=1.217334806 podStartE2EDuration="3.470535657s" podCreationTimestamp="2026-04-17 16:33:00 +0000 UTC" firstStartedPulling="2026-04-17 16:33:01.024172602 +0000 UTC m=+100.489355446" lastFinishedPulling="2026-04-17 16:33:03.277373458 +0000 UTC m=+102.742556297" observedRunningTime="2026-04-17 16:33:03.468545256 +0000 UTC m=+102.933728120" watchObservedRunningTime="2026-04-17 16:33:03.470535657 +0000 UTC m=+102.935718520" Apr 17 16:33:04.070732 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:04.070692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c5c7595-87f8-485a-9f46-0e2372f5ceed-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vngwf\" (UID: \"8c5c7595-87f8-485a-9f46-0e2372f5ceed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vngwf" Apr 17 16:33:04.070887 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:04.070824 2576 
secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:33:04.070887 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:04.070886 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c5c7595-87f8-485a-9f46-0e2372f5ceed-samples-operator-tls podName:8c5c7595-87f8-485a-9f46-0e2372f5ceed nodeName:}" failed. No retries permitted until 2026-04-17 16:33:08.070870429 +0000 UTC m=+107.536053268 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8c5c7595-87f8-485a-9f46-0e2372f5ceed-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vngwf" (UID: "8c5c7595-87f8-485a-9f46-0e2372f5ceed") : secret "samples-operator-tls" not found Apr 17 16:33:04.171576 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:04.171541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e0df564-3d93-4f67-a3bd-4157590025f5-service-ca-bundle\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx" Apr 17 16:33:04.171576 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:04.171581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-metrics-certs\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx" Apr 17 16:33:04.171755 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:04.171706 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0e0df564-3d93-4f67-a3bd-4157590025f5-service-ca-bundle podName:0e0df564-3d93-4f67-a3bd-4157590025f5 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:33:08.171686414 +0000 UTC m=+107.636869257 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0e0df564-3d93-4f67-a3bd-4157590025f5-service-ca-bundle") pod "router-default-7bf9cdf774-5ntqx" (UID: "0e0df564-3d93-4f67-a3bd-4157590025f5") : configmap references non-existent config key: service-ca.crt Apr 17 16:33:04.171798 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:04.171779 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 16:33:04.171865 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:04.171854 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-metrics-certs podName:0e0df564-3d93-4f67-a3bd-4157590025f5 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:08.171834783 +0000 UTC m=+107.637017624 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-metrics-certs") pod "router-default-7bf9cdf774-5ntqx" (UID: "0e0df564-3d93-4f67-a3bd-4157590025f5") : secret "router-metrics-certs-default" not found Apr 17 16:33:04.272059 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:04.272018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s2pgl\" (UID: \"a1196930-a0cd-4cbf-9129-12a1bd8f7941\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2pgl" Apr 17 16:33:04.272268 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:04.272084 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" Apr 17 16:33:04.272268 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:04.272173 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 16:33:04.272268 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:04.272193 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:33:04.272268 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:04.272215 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d9bf8cf54-j9p76: secret "image-registry-tls" not found Apr 17 16:33:04.272268 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:04.272253 2576 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert podName:a1196930-a0cd-4cbf-9129-12a1bd8f7941 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:08.272230376 +0000 UTC m=+107.737413229 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s2pgl" (UID: "a1196930-a0cd-4cbf-9129-12a1bd8f7941") : secret "networking-console-plugin-cert" not found Apr 17 16:33:04.272268 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:04.272271 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls podName:bdecf6bd-ce4a-4be2-8117-e763391f52a2 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:08.272261789 +0000 UTC m=+107.737444628 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls") pod "image-registry-5d9bf8cf54-j9p76" (UID: "bdecf6bd-ce4a-4be2-8117-e763391f52a2") : secret "image-registry-tls" not found Apr 17 16:33:07.891986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:07.891958 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2wn98_03ed97e0-9a98-4b06-9c36-752ba631bdc9/dns-node-resolver/0.log" Apr 17 16:33:08.101433 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:08.101393 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c5c7595-87f8-485a-9f46-0e2372f5ceed-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vngwf\" (UID: \"8c5c7595-87f8-485a-9f46-0e2372f5ceed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vngwf" Apr 17 16:33:08.101610 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:08.101552 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:33:08.101656 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:08.101618 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c5c7595-87f8-485a-9f46-0e2372f5ceed-samples-operator-tls podName:8c5c7595-87f8-485a-9f46-0e2372f5ceed nodeName:}" failed. No retries permitted until 2026-04-17 16:33:16.101601344 +0000 UTC m=+115.566784187 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8c5c7595-87f8-485a-9f46-0e2372f5ceed-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vngwf" (UID: "8c5c7595-87f8-485a-9f46-0e2372f5ceed") : secret "samples-operator-tls" not found Apr 17 16:33:08.202662 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:08.202599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e0df564-3d93-4f67-a3bd-4157590025f5-service-ca-bundle\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx" Apr 17 16:33:08.202853 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:08.202752 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-metrics-certs\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx" Apr 17 16:33:08.202853 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:08.202802 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0e0df564-3d93-4f67-a3bd-4157590025f5-service-ca-bundle podName:0e0df564-3d93-4f67-a3bd-4157590025f5 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:16.202772022 +0000 UTC m=+115.667954875 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0e0df564-3d93-4f67-a3bd-4157590025f5-service-ca-bundle") pod "router-default-7bf9cdf774-5ntqx" (UID: "0e0df564-3d93-4f67-a3bd-4157590025f5") : configmap references non-existent config key: service-ca.crt Apr 17 16:33:08.202853 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:08.202834 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 16:33:08.203029 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:08.202871 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-metrics-certs podName:0e0df564-3d93-4f67-a3bd-4157590025f5 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:16.202860358 +0000 UTC m=+115.668043197 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-metrics-certs") pod "router-default-7bf9cdf774-5ntqx" (UID: "0e0df564-3d93-4f67-a3bd-4157590025f5") : secret "router-metrics-certs-default" not found Apr 17 16:33:08.304152 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:08.304109 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s2pgl\" (UID: \"a1196930-a0cd-4cbf-9129-12a1bd8f7941\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2pgl" Apr 17 16:33:08.304291 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:08.304167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: 
\"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" Apr 17 16:33:08.304291 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:08.304250 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 16:33:08.304291 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:08.304279 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:33:08.304291 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:08.304290 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d9bf8cf54-j9p76: secret "image-registry-tls" not found Apr 17 16:33:08.304441 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:08.304317 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert podName:a1196930-a0cd-4cbf-9129-12a1bd8f7941 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:16.304300707 +0000 UTC m=+115.769483551 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s2pgl" (UID: "a1196930-a0cd-4cbf-9129-12a1bd8f7941") : secret "networking-console-plugin-cert" not found Apr 17 16:33:08.304441 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:08.304333 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls podName:bdecf6bd-ce4a-4be2-8117-e763391f52a2 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:16.304325533 +0000 UTC m=+115.769508372 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls") pod "image-registry-5d9bf8cf54-j9p76" (UID: "bdecf6bd-ce4a-4be2-8117-e763391f52a2") : secret "image-registry-tls" not found Apr 17 16:33:08.894341 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:08.894311 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8mrzt_4f000c2d-d989-4f8a-8182-b5104c3a3de6/node-ca/0.log" Apr 17 16:33:10.692931 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:10.692899 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-ctq4m_90c4edf8-ec15-4ea5-b1c6-ede355520321/kube-storage-version-migrator-operator/0.log" Apr 17 16:33:16.169179 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:16.169138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c5c7595-87f8-485a-9f46-0e2372f5ceed-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vngwf\" (UID: \"8c5c7595-87f8-485a-9f46-0e2372f5ceed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vngwf" Apr 17 16:33:16.171629 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:16.171600 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c5c7595-87f8-485a-9f46-0e2372f5ceed-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vngwf\" (UID: \"8c5c7595-87f8-485a-9f46-0e2372f5ceed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vngwf" Apr 17 16:33:16.251462 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:16.251410 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vngwf" Apr 17 16:33:16.269967 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:16.269941 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e0df564-3d93-4f67-a3bd-4157590025f5-service-ca-bundle\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx" Apr 17 16:33:16.270100 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:16.269973 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-metrics-certs\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx" Apr 17 16:33:16.270763 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:16.270735 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e0df564-3d93-4f67-a3bd-4157590025f5-service-ca-bundle\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx" Apr 17 16:33:16.272390 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:16.272366 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e0df564-3d93-4f67-a3bd-4157590025f5-metrics-certs\") pod \"router-default-7bf9cdf774-5ntqx\" (UID: \"0e0df564-3d93-4f67-a3bd-4157590025f5\") " pod="openshift-ingress/router-default-7bf9cdf774-5ntqx" Apr 17 16:33:16.363122 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:16.363092 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7bf9cdf774-5ntqx" Apr 17 16:33:16.368982 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:16.368953 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vngwf"] Apr 17 16:33:16.370974 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:16.370951 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s2pgl\" (UID: \"a1196930-a0cd-4cbf-9129-12a1bd8f7941\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2pgl" Apr 17 16:33:16.371061 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:16.370997 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" Apr 17 16:33:16.371151 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:16.371130 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 16:33:16.371244 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:16.371230 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert podName:a1196930-a0cd-4cbf-9129-12a1bd8f7941 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:32.371207999 +0000 UTC m=+131.836390843 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s2pgl" (UID: "a1196930-a0cd-4cbf-9129-12a1bd8f7941") : secret "networking-console-plugin-cert" not found Apr 17 16:33:16.373691 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:16.373664 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls\") pod \"image-registry-5d9bf8cf54-j9p76\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") " pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" Apr 17 16:33:16.476360 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:16.476318 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vngwf" event={"ID":"8c5c7595-87f8-485a-9f46-0e2372f5ceed","Type":"ContainerStarted","Data":"cd3369159f371fe99e7b8b978558d7af484c7ed4b229e747a2864d5d8d071e84"} Apr 17 16:33:16.484988 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:16.484951 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7bf9cdf774-5ntqx"] Apr 17 16:33:16.485174 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:16.485050 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" Apr 17 16:33:16.489965 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:33:16.489938 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e0df564_3d93_4f67_a3bd_4157590025f5.slice/crio-221f307c7ab18839e2f4c2ee326823db240d7c3c05dd3485f6cd2dd30ddff765 WatchSource:0}: Error finding container 221f307c7ab18839e2f4c2ee326823db240d7c3c05dd3485f6cd2dd30ddff765: Status 404 returned error can't find the container with id 221f307c7ab18839e2f4c2ee326823db240d7c3c05dd3485f6cd2dd30ddff765 Apr 17 16:33:16.612773 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:16.612741 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d9bf8cf54-j9p76"] Apr 17 16:33:16.615670 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:33:16.615646 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdecf6bd_ce4a_4be2_8117_e763391f52a2.slice/crio-1af4f3e0f4d57f988d0278dc8fbe638b965456e51cd1f6735c7d30bcd0cd6829 WatchSource:0}: Error finding container 1af4f3e0f4d57f988d0278dc8fbe638b965456e51cd1f6735c7d30bcd0cd6829: Status 404 returned error can't find the container with id 1af4f3e0f4d57f988d0278dc8fbe638b965456e51cd1f6735c7d30bcd0cd6829 Apr 17 16:33:17.480559 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:17.480521 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" event={"ID":"bdecf6bd-ce4a-4be2-8117-e763391f52a2","Type":"ContainerStarted","Data":"1ea63228e25c3c4f1219a5a85582638a6d1a3213f813f00eba93e89c566e7c50"} Apr 17 16:33:17.480559 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:17.480560 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" 
event={"ID":"bdecf6bd-ce4a-4be2-8117-e763391f52a2","Type":"ContainerStarted","Data":"1af4f3e0f4d57f988d0278dc8fbe638b965456e51cd1f6735c7d30bcd0cd6829"} Apr 17 16:33:17.481053 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:17.480633 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" Apr 17 16:33:17.482037 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:17.482007 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7bf9cdf774-5ntqx" event={"ID":"0e0df564-3d93-4f67-a3bd-4157590025f5","Type":"ContainerStarted","Data":"04c46ab36f6548e96eb3583ae21435697a550639ece461310263076fe1862bb6"} Apr 17 16:33:17.482037 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:17.482039 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7bf9cdf774-5ntqx" event={"ID":"0e0df564-3d93-4f67-a3bd-4157590025f5","Type":"ContainerStarted","Data":"221f307c7ab18839e2f4c2ee326823db240d7c3c05dd3485f6cd2dd30ddff765"} Apr 17 16:33:17.500288 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:17.500240 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" podStartSLOduration=17.500226422 podStartE2EDuration="17.500226422s" podCreationTimestamp="2026-04-17 16:33:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:33:17.499618303 +0000 UTC m=+116.964801211" watchObservedRunningTime="2026-04-17 16:33:17.500226422 +0000 UTC m=+116.965409280" Apr 17 16:33:17.517533 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:17.517453 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7bf9cdf774-5ntqx" podStartSLOduration=17.517434265 podStartE2EDuration="17.517434265s" podCreationTimestamp="2026-04-17 16:33:00 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:33:17.516870885 +0000 UTC m=+116.982053747" watchObservedRunningTime="2026-04-17 16:33:17.517434265 +0000 UTC m=+116.982617137" Apr 17 16:33:18.364265 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:18.364181 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7bf9cdf774-5ntqx" Apr 17 16:33:18.366840 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:18.366818 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7bf9cdf774-5ntqx" Apr 17 16:33:18.486464 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:18.486422 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vngwf" event={"ID":"8c5c7595-87f8-485a-9f46-0e2372f5ceed","Type":"ContainerStarted","Data":"1d5c451c2f6227a7b50935ab281b606069645b5a164a906f854dbad7eba8d5d9"} Apr 17 16:33:18.486893 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:18.486472 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vngwf" event={"ID":"8c5c7595-87f8-485a-9f46-0e2372f5ceed","Type":"ContainerStarted","Data":"d9d458ca25284288e565995f2878e633c3f21b87b2807750eda3a540694f53d1"} Apr 17 16:33:18.486893 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:18.486848 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7bf9cdf774-5ntqx" Apr 17 16:33:18.487932 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:18.487906 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7bf9cdf774-5ntqx" Apr 17 16:33:18.504959 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:18.504915 2576 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vngwf" podStartSLOduration=16.825218895 podStartE2EDuration="18.504902089s" podCreationTimestamp="2026-04-17 16:33:00 +0000 UTC" firstStartedPulling="2026-04-17 16:33:16.404586943 +0000 UTC m=+115.869769782" lastFinishedPulling="2026-04-17 16:33:18.084270137 +0000 UTC m=+117.549452976" observedRunningTime="2026-04-17 16:33:18.504842952 +0000 UTC m=+117.970025814" watchObservedRunningTime="2026-04-17 16:33:18.504902089 +0000 UTC m=+117.970084954" Apr 17 16:33:26.280007 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.279975 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5d9bf8cf54-j9p76"] Apr 17 16:33:26.293640 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.293610 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-qqp2w"] Apr 17 16:33:26.298555 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.298535 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qqp2w" Apr 17 16:33:26.301411 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.301380 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 16:33:26.301577 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.301529 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 16:33:26.301577 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.301570 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 16:33:26.301777 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.301760 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 16:33:26.301820 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.301784 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-shqjl\"" Apr 17 16:33:26.320053 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.320034 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qqp2w"] Apr 17 16:33:26.370625 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.370604 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-d78769859-djn2h"] Apr 17 16:33:26.373485 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.373468 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-d78769859-djn2h" Apr 17 16:33:26.404866 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.404845 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-d78769859-djn2h"] Apr 17 16:33:26.443773 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.443714 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3890b08f-1a41-43ca-8a22-786c4ea642f2-crio-socket\") pod \"insights-runtime-extractor-qqp2w\" (UID: \"3890b08f-1a41-43ca-8a22-786c4ea642f2\") " pod="openshift-insights/insights-runtime-extractor-qqp2w" Apr 17 16:33:26.443773 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.443742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr4s5\" (UniqueName: \"kubernetes.io/projected/3890b08f-1a41-43ca-8a22-786c4ea642f2-kube-api-access-nr4s5\") pod \"insights-runtime-extractor-qqp2w\" (UID: \"3890b08f-1a41-43ca-8a22-786c4ea642f2\") " pod="openshift-insights/insights-runtime-extractor-qqp2w" Apr 17 16:33:26.443773 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.443763 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3890b08f-1a41-43ca-8a22-786c4ea642f2-data-volume\") pod \"insights-runtime-extractor-qqp2w\" (UID: \"3890b08f-1a41-43ca-8a22-786c4ea642f2\") " pod="openshift-insights/insights-runtime-extractor-qqp2w" Apr 17 16:33:26.443920 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.443857 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3890b08f-1a41-43ca-8a22-786c4ea642f2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qqp2w\" (UID: 
\"3890b08f-1a41-43ca-8a22-786c4ea642f2\") " pod="openshift-insights/insights-runtime-extractor-qqp2w" Apr 17 16:33:26.443920 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.443882 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3890b08f-1a41-43ca-8a22-786c4ea642f2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qqp2w\" (UID: \"3890b08f-1a41-43ca-8a22-786c4ea642f2\") " pod="openshift-insights/insights-runtime-extractor-qqp2w" Apr 17 16:33:26.544759 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.544723 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1f4421a-5f1c-468c-9392-a80b17d865af-registry-certificates\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h" Apr 17 16:33:26.544905 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.544768 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3890b08f-1a41-43ca-8a22-786c4ea642f2-data-volume\") pod \"insights-runtime-extractor-qqp2w\" (UID: \"3890b08f-1a41-43ca-8a22-786c4ea642f2\") " pod="openshift-insights/insights-runtime-extractor-qqp2w" Apr 17 16:33:26.544905 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.544825 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3890b08f-1a41-43ca-8a22-786c4ea642f2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qqp2w\" (UID: \"3890b08f-1a41-43ca-8a22-786c4ea642f2\") " pod="openshift-insights/insights-runtime-extractor-qqp2w" Apr 17 16:33:26.544905 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.544848 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3890b08f-1a41-43ca-8a22-786c4ea642f2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qqp2w\" (UID: \"3890b08f-1a41-43ca-8a22-786c4ea642f2\") " pod="openshift-insights/insights-runtime-extractor-qqp2w" Apr 17 16:33:26.544905 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.544900 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1f4421a-5f1c-468c-9392-a80b17d865af-ca-trust-extracted\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h" Apr 17 16:33:26.545074 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.544927 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1f4421a-5f1c-468c-9392-a80b17d865af-registry-tls\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h" Apr 17 16:33:26.545074 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.544957 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1f4421a-5f1c-468c-9392-a80b17d865af-trusted-ca\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h" Apr 17 16:33:26.545074 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.545038 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/d1f4421a-5f1c-468c-9392-a80b17d865af-image-registry-private-configuration\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.545198 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.545095 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x4nv\" (UniqueName: \"kubernetes.io/projected/d1f4421a-5f1c-468c-9392-a80b17d865af-kube-api-access-8x4nv\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.545198 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.545118 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3890b08f-1a41-43ca-8a22-786c4ea642f2-crio-socket\") pod \"insights-runtime-extractor-qqp2w\" (UID: \"3890b08f-1a41-43ca-8a22-786c4ea642f2\") " pod="openshift-insights/insights-runtime-extractor-qqp2w"
Apr 17 16:33:26.545198 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.545135 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nr4s5\" (UniqueName: \"kubernetes.io/projected/3890b08f-1a41-43ca-8a22-786c4ea642f2-kube-api-access-nr4s5\") pod \"insights-runtime-extractor-qqp2w\" (UID: \"3890b08f-1a41-43ca-8a22-786c4ea642f2\") " pod="openshift-insights/insights-runtime-extractor-qqp2w"
Apr 17 16:33:26.545198 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.545145 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3890b08f-1a41-43ca-8a22-786c4ea642f2-data-volume\") pod \"insights-runtime-extractor-qqp2w\" (UID: \"3890b08f-1a41-43ca-8a22-786c4ea642f2\") " pod="openshift-insights/insights-runtime-extractor-qqp2w"
Apr 17 16:33:26.545198 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.545162 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1f4421a-5f1c-468c-9392-a80b17d865af-installation-pull-secrets\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.545413 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.545212 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3890b08f-1a41-43ca-8a22-786c4ea642f2-crio-socket\") pod \"insights-runtime-extractor-qqp2w\" (UID: \"3890b08f-1a41-43ca-8a22-786c4ea642f2\") " pod="openshift-insights/insights-runtime-extractor-qqp2w"
Apr 17 16:33:26.545413 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.545213 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1f4421a-5f1c-468c-9392-a80b17d865af-bound-sa-token\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.545413 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.545387 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3890b08f-1a41-43ca-8a22-786c4ea642f2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qqp2w\" (UID: \"3890b08f-1a41-43ca-8a22-786c4ea642f2\") " pod="openshift-insights/insights-runtime-extractor-qqp2w"
Apr 17 16:33:26.547163 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.547140 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3890b08f-1a41-43ca-8a22-786c4ea642f2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qqp2w\" (UID: \"3890b08f-1a41-43ca-8a22-786c4ea642f2\") " pod="openshift-insights/insights-runtime-extractor-qqp2w"
Apr 17 16:33:26.558564 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.558546 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr4s5\" (UniqueName: \"kubernetes.io/projected/3890b08f-1a41-43ca-8a22-786c4ea642f2-kube-api-access-nr4s5\") pod \"insights-runtime-extractor-qqp2w\" (UID: \"3890b08f-1a41-43ca-8a22-786c4ea642f2\") " pod="openshift-insights/insights-runtime-extractor-qqp2w"
Apr 17 16:33:26.606826 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.606795 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qqp2w"
Apr 17 16:33:26.646381 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.646349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1f4421a-5f1c-468c-9392-a80b17d865af-bound-sa-token\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.646381 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.646381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1f4421a-5f1c-468c-9392-a80b17d865af-registry-certificates\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.646677 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.646441 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1f4421a-5f1c-468c-9392-a80b17d865af-ca-trust-extracted\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.646761 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.646723 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1f4421a-5f1c-468c-9392-a80b17d865af-ca-trust-extracted\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.646828 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.646766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1f4421a-5f1c-468c-9392-a80b17d865af-registry-tls\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.646828 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.646809 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1f4421a-5f1c-468c-9392-a80b17d865af-trusted-ca\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.646932 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.646859 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1f4421a-5f1c-468c-9392-a80b17d865af-image-registry-private-configuration\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.646932 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.646913 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8x4nv\" (UniqueName: \"kubernetes.io/projected/d1f4421a-5f1c-468c-9392-a80b17d865af-kube-api-access-8x4nv\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.647030 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.646952 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1f4421a-5f1c-468c-9392-a80b17d865af-installation-pull-secrets\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.647364 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.647241 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1f4421a-5f1c-468c-9392-a80b17d865af-registry-certificates\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.648170 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.648139 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1f4421a-5f1c-468c-9392-a80b17d865af-trusted-ca\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.649507 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.649305 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1f4421a-5f1c-468c-9392-a80b17d865af-image-registry-private-configuration\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.649594 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.649561 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1f4421a-5f1c-468c-9392-a80b17d865af-registry-tls\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.649741 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.649722 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1f4421a-5f1c-468c-9392-a80b17d865af-installation-pull-secrets\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.658026 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.657999 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x4nv\" (UniqueName: \"kubernetes.io/projected/d1f4421a-5f1c-468c-9392-a80b17d865af-kube-api-access-8x4nv\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.658129 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.658120 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1f4421a-5f1c-468c-9392-a80b17d865af-bound-sa-token\") pod \"image-registry-d78769859-djn2h\" (UID: \"d1f4421a-5f1c-468c-9392-a80b17d865af\") " pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.680981 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.680884 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:26.730941 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.730894 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qqp2w"]
Apr 17 16:33:26.806959 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:26.806935 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-d78769859-djn2h"]
Apr 17 16:33:26.811251 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:33:26.811223 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1f4421a_5f1c_468c_9392_a80b17d865af.slice/crio-9dc8484b55e4af85c6f054e8238069acb8d42342a58e9dac8a742ac7280dbe85 WatchSource:0}: Error finding container 9dc8484b55e4af85c6f054e8238069acb8d42342a58e9dac8a742ac7280dbe85: Status 404 returned error can't find the container with id 9dc8484b55e4af85c6f054e8238069acb8d42342a58e9dac8a742ac7280dbe85
Apr 17 16:33:27.510865 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:27.510831 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d78769859-djn2h" event={"ID":"d1f4421a-5f1c-468c-9392-a80b17d865af","Type":"ContainerStarted","Data":"1d37facb2a4e7bccc10c16bd9d8546c7eba1045f36a09e1dfeb4b5b5fcd5ef20"}
Apr 17 16:33:27.510865 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:27.510869 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d78769859-djn2h" event={"ID":"d1f4421a-5f1c-468c-9392-a80b17d865af","Type":"ContainerStarted","Data":"9dc8484b55e4af85c6f054e8238069acb8d42342a58e9dac8a742ac7280dbe85"}
Apr 17 16:33:27.511389 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:27.510913 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-d78769859-djn2h"
Apr 17 16:33:27.512328 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:27.512307 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qqp2w" event={"ID":"3890b08f-1a41-43ca-8a22-786c4ea642f2","Type":"ContainerStarted","Data":"c8d2b1c4f265828d8ee694ecb07f14a12d448f1ce8598b9467f067d5e3bb2f0a"}
Apr 17 16:33:27.512403 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:27.512332 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qqp2w" event={"ID":"3890b08f-1a41-43ca-8a22-786c4ea642f2","Type":"ContainerStarted","Data":"30c728172a623a856ad3151d70fe80440db4036fe5ade48c804e090c0ef5de41"}
Apr 17 16:33:27.512403 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:27.512342 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qqp2w" event={"ID":"3890b08f-1a41-43ca-8a22-786c4ea642f2","Type":"ContainerStarted","Data":"51daaf1111f60383e9a3e3540cb40cbfc0b17b9fe9fd88f68b738b6ad89711d2"}
Apr 17 16:33:27.532417 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:27.532374 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-d78769859-djn2h" podStartSLOduration=1.532363051 podStartE2EDuration="1.532363051s" podCreationTimestamp="2026-04-17 16:33:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:33:27.531263904 +0000 UTC m=+126.996446776" watchObservedRunningTime="2026-04-17 16:33:27.532363051 +0000 UTC m=+126.997545903"
Apr 17 16:33:29.518867 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:29.518831 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qqp2w" event={"ID":"3890b08f-1a41-43ca-8a22-786c4ea642f2","Type":"ContainerStarted","Data":"fa9dc5e0f70b6eba4434f98201b8e651118c827346fbf71c3c7fca50a4ecc8d6"}
Apr 17 16:33:29.545410 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:29.545361 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-qqp2w" podStartSLOduration=1.472808191 podStartE2EDuration="3.545327448s" podCreationTimestamp="2026-04-17 16:33:26 +0000 UTC" firstStartedPulling="2026-04-17 16:33:26.798547216 +0000 UTC m=+126.263730054" lastFinishedPulling="2026-04-17 16:33:28.871066456 +0000 UTC m=+128.336249311" observedRunningTime="2026-04-17 16:33:29.543613066 +0000 UTC m=+129.008795949" watchObservedRunningTime="2026-04-17 16:33:29.545327448 +0000 UTC m=+129.010510308"
Apr 17 16:33:30.883228 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:30.883188 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs\") pod \"network-metrics-daemon-jn9vc\" (UID: \"381620fe-22c1-43ed-99de-f61c1b4886ca\") " pod="openshift-multus/network-metrics-daemon-jn9vc"
Apr 17 16:33:30.885611 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:30.885583 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/381620fe-22c1-43ed-99de-f61c1b4886ca-metrics-certs\") pod \"network-metrics-daemon-jn9vc\" (UID: \"381620fe-22c1-43ed-99de-f61c1b4886ca\") " pod="openshift-multus/network-metrics-daemon-jn9vc"
Apr 17 16:33:30.950671 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:30.950636 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qkztw\""
Apr 17 16:33:30.958967 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:30.958936 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jn9vc"
Apr 17 16:33:31.078300 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:31.078264 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jn9vc"]
Apr 17 16:33:31.082131 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:33:31.082100 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod381620fe_22c1_43ed_99de_f61c1b4886ca.slice/crio-45409edbd0ea03ef9b03c2c37e2a523e24a018c5de4186d81298eb50d436e4e6 WatchSource:0}: Error finding container 45409edbd0ea03ef9b03c2c37e2a523e24a018c5de4186d81298eb50d436e4e6: Status 404 returned error can't find the container with id 45409edbd0ea03ef9b03c2c37e2a523e24a018c5de4186d81298eb50d436e4e6
Apr 17 16:33:31.525384 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:31.525345 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jn9vc" event={"ID":"381620fe-22c1-43ed-99de-f61c1b4886ca","Type":"ContainerStarted","Data":"45409edbd0ea03ef9b03c2c37e2a523e24a018c5de4186d81298eb50d436e4e6"}
Apr 17 16:33:32.394049 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:32.393955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s2pgl\" (UID: \"a1196930-a0cd-4cbf-9129-12a1bd8f7941\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2pgl"
Apr 17 16:33:32.396598 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:32.396564 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a1196930-a0cd-4cbf-9129-12a1bd8f7941-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s2pgl\" (UID: \"a1196930-a0cd-4cbf-9129-12a1bd8f7941\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2pgl"
Apr 17 16:33:32.529101 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:32.529064 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jn9vc" event={"ID":"381620fe-22c1-43ed-99de-f61c1b4886ca","Type":"ContainerStarted","Data":"e67bbe3b0307b1278a0dde1d192483768cc21e20c636b59df72b98973f225eae"}
Apr 17 16:33:32.529101 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:32.529105 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jn9vc" event={"ID":"381620fe-22c1-43ed-99de-f61c1b4886ca","Type":"ContainerStarted","Data":"300bead0ed05caa0e0cce15fc3048ad3148d8dad86eb0e6f1b851d7d01f94be9"}
Apr 17 16:33:32.547356 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:32.547300 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jn9vc" podStartSLOduration=130.672232683 podStartE2EDuration="2m11.547281093s" podCreationTimestamp="2026-04-17 16:31:21 +0000 UTC" firstStartedPulling="2026-04-17 16:33:31.084005854 +0000 UTC m=+130.549188693" lastFinishedPulling="2026-04-17 16:33:31.959054261 +0000 UTC m=+131.424237103" observedRunningTime="2026-04-17 16:33:32.545900651 +0000 UTC m=+132.011083512" watchObservedRunningTime="2026-04-17 16:33:32.547281093 +0000 UTC m=+132.012463955"
Apr 17 16:33:32.684357 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:32.684328 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-zvc5w\""
Apr 17 16:33:32.691778 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:32.691761 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2pgl"
Apr 17 16:33:32.833243 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:32.833208 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-s2pgl"]
Apr 17 16:33:32.837485 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:33:32.837452 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1196930_a0cd_4cbf_9129_12a1bd8f7941.slice/crio-358a68c225511cd0424c2128e166710a2d5cf1fa20b0ac1091a80c6d70af92b0 WatchSource:0}: Error finding container 358a68c225511cd0424c2128e166710a2d5cf1fa20b0ac1091a80c6d70af92b0: Status 404 returned error can't find the container with id 358a68c225511cd0424c2128e166710a2d5cf1fa20b0ac1091a80c6d70af92b0
Apr 17 16:33:33.532193 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:33.532162 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2pgl" event={"ID":"a1196930-a0cd-4cbf-9129-12a1bd8f7941","Type":"ContainerStarted","Data":"358a68c225511cd0424c2128e166710a2d5cf1fa20b0ac1091a80c6d70af92b0"}
Apr 17 16:33:34.538892 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:34.538853 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2pgl" event={"ID":"a1196930-a0cd-4cbf-9129-12a1bd8f7941","Type":"ContainerStarted","Data":"d1e1e2cef02f6307e75b84c3b59f4b069a892a54b3ad5ea407ffa653594c1541"}
Apr 17 16:33:34.554834 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:34.554783 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-s2pgl" podStartSLOduration=33.603412756 podStartE2EDuration="34.55477159s" podCreationTimestamp="2026-04-17 16:33:00 +0000 UTC" firstStartedPulling="2026-04-17 16:33:32.839314193 +0000 UTC m=+132.304497032" lastFinishedPulling="2026-04-17 16:33:33.790673028 +0000 UTC m=+133.255855866" observedRunningTime="2026-04-17 16:33:34.553532402 +0000 UTC m=+134.018715264" watchObservedRunningTime="2026-04-17 16:33:34.55477159 +0000 UTC m=+134.019954478"
Apr 17 16:33:35.064366 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:35.064333 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-84kts"]
Apr 17 16:33:35.066416 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:35.066399 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-84kts"
Apr 17 16:33:35.068971 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:35.068951 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-cspv9\""
Apr 17 16:33:35.068971 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:35.068964 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 17 16:33:35.079540 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:35.079518 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-84kts"]
Apr 17 16:33:35.218560 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:35.218530 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7087f126-5e4b-4d2e-846b-cdbfab7f37b6-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-84kts\" (UID: \"7087f126-5e4b-4d2e-846b-cdbfab7f37b6\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-84kts"
Apr 17 16:33:35.319420 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:35.319330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7087f126-5e4b-4d2e-846b-cdbfab7f37b6-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-84kts\" (UID: \"7087f126-5e4b-4d2e-846b-cdbfab7f37b6\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-84kts"
Apr 17 16:33:35.319589 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:35.319451 2576 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 17 16:33:35.319589 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:35.319540 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7087f126-5e4b-4d2e-846b-cdbfab7f37b6-tls-certificates podName:7087f126-5e4b-4d2e-846b-cdbfab7f37b6 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:35.819523608 +0000 UTC m=+135.284706452 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/7087f126-5e4b-4d2e-846b-cdbfab7f37b6-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-84kts" (UID: "7087f126-5e4b-4d2e-846b-cdbfab7f37b6") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 17 16:33:35.821802 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:35.821764 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7087f126-5e4b-4d2e-846b-cdbfab7f37b6-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-84kts\" (UID: \"7087f126-5e4b-4d2e-846b-cdbfab7f37b6\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-84kts"
Apr 17 16:33:35.824082 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:35.824063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7087f126-5e4b-4d2e-846b-cdbfab7f37b6-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-84kts\" (UID: \"7087f126-5e4b-4d2e-846b-cdbfab7f37b6\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-84kts"
Apr 17 16:33:35.975054 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:35.974998 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-84kts"
Apr 17 16:33:36.287236 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:36.287193 2576 patch_prober.go:28] interesting pod/image-registry-5d9bf8cf54-j9p76 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 16:33:36.287403 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:36.287271 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" podUID="bdecf6bd-ce4a-4be2-8117-e763391f52a2" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 16:33:36.304139 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:36.304080 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-84kts"]
Apr 17 16:33:36.307978 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:33:36.307955 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7087f126_5e4b_4d2e_846b_cdbfab7f37b6.slice/crio-50d3f84958b5b87e18d052d433025985cc37891b6c9a79443c8e81a863c15de7 WatchSource:0}: Error finding container 50d3f84958b5b87e18d052d433025985cc37891b6c9a79443c8e81a863c15de7: Status 404 returned error can't find the container with id 50d3f84958b5b87e18d052d433025985cc37891b6c9a79443c8e81a863c15de7
Apr 17 16:33:36.544621 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:36.544530 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-84kts" event={"ID":"7087f126-5e4b-4d2e-846b-cdbfab7f37b6","Type":"ContainerStarted","Data":"50d3f84958b5b87e18d052d433025985cc37891b6c9a79443c8e81a863c15de7"}
Apr 17 16:33:37.548514 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:37.548465 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-84kts" event={"ID":"7087f126-5e4b-4d2e-846b-cdbfab7f37b6","Type":"ContainerStarted","Data":"24d54fda16e467ec61ce183b8a652df851e3d464f87699c88e3fac21719e245b"}
Apr 17 16:33:37.548958 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:37.548597 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-84kts"
Apr 17 16:33:37.553219 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:37.553195 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-84kts"
Apr 17 16:33:37.565655 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:37.565618 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-84kts" podStartSLOduration=1.492370137 podStartE2EDuration="2.565605363s" podCreationTimestamp="2026-04-17 16:33:35 +0000 UTC" firstStartedPulling="2026-04-17 16:33:36.309808112 +0000 UTC m=+135.774990950" lastFinishedPulling="2026-04-17 16:33:37.383043318 +0000 UTC m=+136.848226176" observedRunningTime="2026-04-17 16:33:37.564574498 +0000 UTC m=+137.029757370" watchObservedRunningTime="2026-04-17 16:33:37.565605363 +0000 UTC m=+137.030788221"
Apr 17 16:33:46.289857 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:46.289705 2576 patch_prober.go:28] interesting pod/image-registry-5d9bf8cf54-j9p76 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 16:33:46.289857 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:46.289791 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" podUID="bdecf6bd-ce4a-4be2-8117-e763391f52a2" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 16:33:46.685141 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:46.685112 2576 patch_prober.go:28] interesting pod/image-registry-d78769859-djn2h container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 16:33:46.685284 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:46.685159 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-d78769859-djn2h" podUID="d1f4421a-5f1c-468c-9392-a80b17d865af" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 16:33:48.518964 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.518928 2576 patch_prober.go:28] interesting pod/image-registry-d78769859-djn2h container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 16:33:48.519316 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.518977 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-d78769859-djn2h" podUID="d1f4421a-5f1c-468c-9392-a80b17d865af" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 16:33:48.600791 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.600758 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg"]
Apr 17 16:33:48.604103 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.604081 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg"
Apr 17 16:33:48.606612 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.606590 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 17 16:33:48.608024 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.608002 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 16:33:48.608132 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.608005 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 17 16:33:48.608132 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.608084 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 16:33:48.608254 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.608163 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 16:33:48.608254 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.608163 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-svmkt\""
Apr 17 16:33:48.616006 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.615989 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1c0c6bd4-7556-4400-b3cb-97a101645604-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-ztfmg\" (UID: \"1c0c6bd4-7556-4400-b3cb-97a101645604\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg"
Apr 17 16:33:48.616093 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.616039 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c0c6bd4-7556-4400-b3cb-97a101645604-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-ztfmg\" (UID: \"1c0c6bd4-7556-4400-b3cb-97a101645604\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg"
Apr 17 16:33:48.616093 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.616076 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww4b8\" (UniqueName: \"kubernetes.io/projected/1c0c6bd4-7556-4400-b3cb-97a101645604-kube-api-access-ww4b8\") pod \"openshift-state-metrics-9d44df66c-ztfmg\" (UID: \"1c0c6bd4-7556-4400-b3cb-97a101645604\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg"
Apr 17 16:33:48.616176 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.616098 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1c0c6bd4-7556-4400-b3cb-97a101645604-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-ztfmg\" (UID: \"1c0c6bd4-7556-4400-b3cb-97a101645604\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg"
Apr 17 16:33:48.631397 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.631373 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg"]
Apr 17 16:33:48.664802 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.664652 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-tkskl"]
Apr 17 16:33:48.667825 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.667809 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl"
Apr 17 16:33:48.671208 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.671190 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 17 16:33:48.671208 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.671196 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-ms76m\""
Apr 17 16:33:48.671685 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.671667 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 17 16:33:48.684602 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.684582 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 17 16:33:48.705827 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.705794 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-tkskl"]
Apr 17 16:33:48.717202 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.717180 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c0c6bd4-7556-4400-b3cb-97a101645604-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-ztfmg\" (UID: \"1c0c6bd4-7556-4400-b3cb-97a101645604\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg"
Apr 17 16:33:48.717308 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.717229 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ww4b8\" (UniqueName: \"kubernetes.io/projected/1c0c6bd4-7556-4400-b3cb-97a101645604-kube-api-access-ww4b8\") pod \"openshift-state-metrics-9d44df66c-ztfmg\" (UID: \"1c0c6bd4-7556-4400-b3cb-97a101645604\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg"
Apr 17 16:33:48.717308 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.717261 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-tkskl\" (UID: \"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl"
Apr 17 16:33:48.717308 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.717283 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1c0c6bd4-7556-4400-b3cb-97a101645604-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-ztfmg\" (UID: \"1c0c6bd4-7556-4400-b3cb-97a101645604\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg"
Apr 17 16:33:48.717427 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.717312 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdnfn\" (UniqueName: \"kubernetes.io/projected/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-kube-api-access-jdnfn\") pod \"kube-state-metrics-69db897b98-tkskl\" (UID: \"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl"
Apr 17 16:33:48.717427 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:48.717324 2576 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret 
"openshift-state-metrics-tls" not found Apr 17 16:33:48.717427 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.717343 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1c0c6bd4-7556-4400-b3cb-97a101645604-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-ztfmg\" (UID: \"1c0c6bd4-7556-4400-b3cb-97a101645604\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg" Apr 17 16:33:48.717427 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:48.717388 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c0c6bd4-7556-4400-b3cb-97a101645604-openshift-state-metrics-tls podName:1c0c6bd4-7556-4400-b3cb-97a101645604 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:49.21737347 +0000 UTC m=+148.682556314 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/1c0c6bd4-7556-4400-b3cb-97a101645604-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-ztfmg" (UID: "1c0c6bd4-7556-4400-b3cb-97a101645604") : secret "openshift-state-metrics-tls" not found Apr 17 16:33:48.717608 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.717571 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-tkskl\" (UID: \"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl" Apr 17 16:33:48.717608 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.717603 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-tkskl\" (UID: \"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl" Apr 17 16:33:48.717686 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.717628 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-tkskl\" (UID: \"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl" Apr 17 16:33:48.717686 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.717672 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-tkskl\" (UID: \"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl" Apr 17 16:33:48.718000 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.717984 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1c0c6bd4-7556-4400-b3cb-97a101645604-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-ztfmg\" (UID: \"1c0c6bd4-7556-4400-b3cb-97a101645604\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg" Apr 17 16:33:48.719788 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.719772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/1c0c6bd4-7556-4400-b3cb-97a101645604-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-ztfmg\" (UID: \"1c0c6bd4-7556-4400-b3cb-97a101645604\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg" Apr 17 16:33:48.723800 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.723779 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-t7l69"] Apr 17 16:33:48.727316 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.727299 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.730652 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.730634 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 16:33:48.730992 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.730975 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 16:33:48.731104 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.731088 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 16:33:48.731360 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.731347 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-bxbxp\"" Apr 17 16:33:48.750425 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.750390 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww4b8\" (UniqueName: \"kubernetes.io/projected/1c0c6bd4-7556-4400-b3cb-97a101645604-kube-api-access-ww4b8\") pod \"openshift-state-metrics-9d44df66c-ztfmg\" (UID: \"1c0c6bd4-7556-4400-b3cb-97a101645604\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg" Apr 17 16:33:48.818413 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.818312 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3021cc06-14c0-4d00-b02c-90d9eaecb08f-node-exporter-textfile\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.818413 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.818349 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3021cc06-14c0-4d00-b02c-90d9eaecb08f-node-exporter-accelerators-collector-config\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.818413 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.818370 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3021cc06-14c0-4d00-b02c-90d9eaecb08f-root\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.818413 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.818414 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3021cc06-14c0-4d00-b02c-90d9eaecb08f-metrics-client-ca\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.818749 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.818436 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3021cc06-14c0-4d00-b02c-90d9eaecb08f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.818749 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.818462 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-tkskl\" (UID: \"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl" Apr 17 16:33:48.818749 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.818556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdnfn\" (UniqueName: \"kubernetes.io/projected/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-kube-api-access-jdnfn\") pod \"kube-state-metrics-69db897b98-tkskl\" (UID: \"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl" Apr 17 16:33:48.818749 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:48.818596 2576 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 17 16:33:48.818749 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.818606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-tkskl\" (UID: \"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl" Apr 17 16:33:48.818749 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.818635 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-tkskl\" (UID: \"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl" Apr 17 16:33:48.818749 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:48.818665 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-kube-state-metrics-tls podName:bb8e36fe-6c83-40c6-8e4f-7a439d8fee12 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:49.318644271 +0000 UTC m=+148.783827127 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-tkskl" (UID: "bb8e36fe-6c83-40c6-8e4f-7a439d8fee12") : secret "kube-state-metrics-tls" not found Apr 17 16:33:48.818749 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.818695 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-tkskl\" (UID: \"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl" Apr 17 16:33:48.818749 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.818726 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d2fd\" (UniqueName: \"kubernetes.io/projected/3021cc06-14c0-4d00-b02c-90d9eaecb08f-kube-api-access-4d2fd\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.819188 ip-10-0-134-142 
kubenswrapper[2576]: I0417 16:33:48.818773 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-tkskl\" (UID: \"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl" Apr 17 16:33:48.819188 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.818846 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3021cc06-14c0-4d00-b02c-90d9eaecb08f-sys\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.819188 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.818888 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3021cc06-14c0-4d00-b02c-90d9eaecb08f-node-exporter-tls\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.819188 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.818918 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3021cc06-14c0-4d00-b02c-90d9eaecb08f-node-exporter-wtmp\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.819188 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.819127 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-tkskl\" (UID: 
\"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl" Apr 17 16:33:48.819441 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.819344 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-tkskl\" (UID: \"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl" Apr 17 16:33:48.819441 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.819389 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-tkskl\" (UID: \"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl" Apr 17 16:33:48.821015 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.820997 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-tkskl\" (UID: \"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl" Apr 17 16:33:48.847022 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.846987 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdnfn\" (UniqueName: \"kubernetes.io/projected/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-kube-api-access-jdnfn\") pod \"kube-state-metrics-69db897b98-tkskl\" (UID: \"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl" Apr 17 16:33:48.920285 ip-10-0-134-142 
kubenswrapper[2576]: I0417 16:33:48.920243 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4d2fd\" (UniqueName: \"kubernetes.io/projected/3021cc06-14c0-4d00-b02c-90d9eaecb08f-kube-api-access-4d2fd\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.920463 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.920306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3021cc06-14c0-4d00-b02c-90d9eaecb08f-sys\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.920463 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.920333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3021cc06-14c0-4d00-b02c-90d9eaecb08f-node-exporter-tls\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.920463 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.920353 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3021cc06-14c0-4d00-b02c-90d9eaecb08f-node-exporter-wtmp\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.920463 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.920378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3021cc06-14c0-4d00-b02c-90d9eaecb08f-node-exporter-textfile\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 
16:33:48.920463 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.920396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3021cc06-14c0-4d00-b02c-90d9eaecb08f-node-exporter-accelerators-collector-config\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.920463 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.920421 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3021cc06-14c0-4d00-b02c-90d9eaecb08f-root\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.920783 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.920475 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3021cc06-14c0-4d00-b02c-90d9eaecb08f-sys\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.920783 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.920574 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3021cc06-14c0-4d00-b02c-90d9eaecb08f-metrics-client-ca\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.920783 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.920583 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3021cc06-14c0-4d00-b02c-90d9eaecb08f-node-exporter-wtmp\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " 
pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.920783 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.920603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3021cc06-14c0-4d00-b02c-90d9eaecb08f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.920783 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.920637 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3021cc06-14c0-4d00-b02c-90d9eaecb08f-root\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.920980 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.920858 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3021cc06-14c0-4d00-b02c-90d9eaecb08f-node-exporter-textfile\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.921083 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.921063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3021cc06-14c0-4d00-b02c-90d9eaecb08f-node-exporter-accelerators-collector-config\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.921142 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.921132 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3021cc06-14c0-4d00-b02c-90d9eaecb08f-metrics-client-ca\") pod 
\"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.922773 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.922751 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3021cc06-14c0-4d00-b02c-90d9eaecb08f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.922912 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.922894 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3021cc06-14c0-4d00-b02c-90d9eaecb08f-node-exporter-tls\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:48.930754 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:48.930732 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d2fd\" (UniqueName: \"kubernetes.io/projected/3021cc06-14c0-4d00-b02c-90d9eaecb08f-kube-api-access-4d2fd\") pod \"node-exporter-t7l69\" (UID: \"3021cc06-14c0-4d00-b02c-90d9eaecb08f\") " pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:49.036169 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:49.036128 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-t7l69" Apr 17 16:33:49.044366 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:33:49.044339 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3021cc06_14c0_4d00_b02c_90d9eaecb08f.slice/crio-16e8517441f3a2af26f3b2dc13f73f13ee1559c47d7208449ac4c7c2795fb919 WatchSource:0}: Error finding container 16e8517441f3a2af26f3b2dc13f73f13ee1559c47d7208449ac4c7c2795fb919: Status 404 returned error can't find the container with id 16e8517441f3a2af26f3b2dc13f73f13ee1559c47d7208449ac4c7c2795fb919 Apr 17 16:33:49.223008 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:49.222961 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c0c6bd4-7556-4400-b3cb-97a101645604-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-ztfmg\" (UID: \"1c0c6bd4-7556-4400-b3cb-97a101645604\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg" Apr 17 16:33:49.223196 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:49.223111 2576 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 17 16:33:49.223196 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:49.223176 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c0c6bd4-7556-4400-b3cb-97a101645604-openshift-state-metrics-tls podName:1c0c6bd4-7556-4400-b3cb-97a101645604 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:50.223158145 +0000 UTC m=+149.688340984 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/1c0c6bd4-7556-4400-b3cb-97a101645604-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-ztfmg" (UID: "1c0c6bd4-7556-4400-b3cb-97a101645604") : secret "openshift-state-metrics-tls" not found
Apr 17 16:33:49.323718 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:49.323680 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-tkskl\" (UID: \"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl"
Apr 17 16:33:49.326043 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:49.326012 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb8e36fe-6c83-40c6-8e4f-7a439d8fee12-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-tkskl\" (UID: \"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl"
Apr 17 16:33:49.576094 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:49.575999 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl"
Apr 17 16:33:49.579197 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:49.579163 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-t7l69" event={"ID":"3021cc06-14c0-4d00-b02c-90d9eaecb08f","Type":"ContainerStarted","Data":"16e8517441f3a2af26f3b2dc13f73f13ee1559c47d7208449ac4c7c2795fb919"}
Apr 17 16:33:49.730026 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:49.729918 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-tkskl"]
Apr 17 16:33:49.863150 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:33:49.863073 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb8e36fe_6c83_40c6_8e4f_7a439d8fee12.slice/crio-630826d0eab2cd02067b5ced85c9f5d704fefb26e4be4e132051a08a677bb214 WatchSource:0}: Error finding container 630826d0eab2cd02067b5ced85c9f5d704fefb26e4be4e132051a08a677bb214: Status 404 returned error can't find the container with id 630826d0eab2cd02067b5ced85c9f5d704fefb26e4be4e132051a08a677bb214
Apr 17 16:33:50.231106 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.231067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c0c6bd4-7556-4400-b3cb-97a101645604-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-ztfmg\" (UID: \"1c0c6bd4-7556-4400-b3cb-97a101645604\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg"
Apr 17 16:33:50.233523 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.233467 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c0c6bd4-7556-4400-b3cb-97a101645604-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-ztfmg\" (UID: \"1c0c6bd4-7556-4400-b3cb-97a101645604\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg"
Apr 17 16:33:50.412803 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.412768 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg"
Apr 17 16:33:50.529474 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.529440 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg"]
Apr 17 16:33:50.532765 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:33:50.532735 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c0c6bd4_7556_4400_b3cb_97a101645604.slice/crio-f12c96bf031273e7805ce6fb948546f1d8cecf219f909ed286b24a99ffb18725 WatchSource:0}: Error finding container f12c96bf031273e7805ce6fb948546f1d8cecf219f909ed286b24a99ffb18725: Status 404 returned error can't find the container with id f12c96bf031273e7805ce6fb948546f1d8cecf219f909ed286b24a99ffb18725
Apr 17 16:33:50.585732 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.585684 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl" event={"ID":"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12","Type":"ContainerStarted","Data":"630826d0eab2cd02067b5ced85c9f5d704fefb26e4be4e132051a08a677bb214"}
Apr 17 16:33:50.587209 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.587177 2576 generic.go:358] "Generic (PLEG): container finished" podID="3021cc06-14c0-4d00-b02c-90d9eaecb08f" containerID="a7c523dbb89a742bb9d381834a7913582da96fc6041956f0bd27c840a064c077" exitCode=0
Apr 17 16:33:50.587323 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.587261 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-t7l69" event={"ID":"3021cc06-14c0-4d00-b02c-90d9eaecb08f","Type":"ContainerDied","Data":"a7c523dbb89a742bb9d381834a7913582da96fc6041956f0bd27c840a064c077"}
Apr 17 16:33:50.588397 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.588373 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg" event={"ID":"1c0c6bd4-7556-4400-b3cb-97a101645604","Type":"ContainerStarted","Data":"f12c96bf031273e7805ce6fb948546f1d8cecf219f909ed286b24a99ffb18725"}
Apr 17 16:33:50.706523 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.706435 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"]
Apr 17 16:33:50.711921 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.711895 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.714606 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.714573 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-j5t6r\""
Apr 17 16:33:50.714606 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.714598 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-4m80kvc4m97fe\""
Apr 17 16:33:50.714786 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.714607 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 17 16:33:50.714960 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.714920 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 17 16:33:50.715061 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.714963 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 17 16:33:50.715061 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.714968 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 17 16:33:50.715623 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.715606 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 17 16:33:50.723555 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.723326 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"]
Apr 17 16:33:50.736140 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.736119 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/75373c0d-030c-4f0d-beeb-fc38b539c454-secret-thanos-querier-tls\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.736242 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.736196 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75373c0d-030c-4f0d-beeb-fc38b539c454-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.736300 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.736255 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/75373c0d-030c-4f0d-beeb-fc38b539c454-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.736355 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.736299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75373c0d-030c-4f0d-beeb-fc38b539c454-secret-grpc-tls\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.736355 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.736333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75373c0d-030c-4f0d-beeb-fc38b539c454-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.736471 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.736373 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/75373c0d-030c-4f0d-beeb-fc38b539c454-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.736471 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.736440 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-529pb\" (UniqueName: \"kubernetes.io/projected/75373c0d-030c-4f0d-beeb-fc38b539c454-kube-api-access-529pb\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.736592 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.736526 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75373c0d-030c-4f0d-beeb-fc38b539c454-metrics-client-ca\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.837539 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.837475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75373c0d-030c-4f0d-beeb-fc38b539c454-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.837539 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.837543 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/75373c0d-030c-4f0d-beeb-fc38b539c454-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.837817 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.837572 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75373c0d-030c-4f0d-beeb-fc38b539c454-secret-grpc-tls\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.837817 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.837597 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75373c0d-030c-4f0d-beeb-fc38b539c454-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.837817 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.837638 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/75373c0d-030c-4f0d-beeb-fc38b539c454-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.837817 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.837697 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-529pb\" (UniqueName: \"kubernetes.io/projected/75373c0d-030c-4f0d-beeb-fc38b539c454-kube-api-access-529pb\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.837817 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.837736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75373c0d-030c-4f0d-beeb-fc38b539c454-metrics-client-ca\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.838049 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.837937 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/75373c0d-030c-4f0d-beeb-fc38b539c454-secret-thanos-querier-tls\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.838695 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.838644 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75373c0d-030c-4f0d-beeb-fc38b539c454-metrics-client-ca\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.840893 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.840835 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75373c0d-030c-4f0d-beeb-fc38b539c454-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.840893 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.840836 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/75373c0d-030c-4f0d-beeb-fc38b539c454-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.840893 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.840839 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75373c0d-030c-4f0d-beeb-fc38b539c454-secret-grpc-tls\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.841135 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.841102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/75373c0d-030c-4f0d-beeb-fc38b539c454-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.841296 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.841273 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75373c0d-030c-4f0d-beeb-fc38b539c454-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.841541 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.841518 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/75373c0d-030c-4f0d-beeb-fc38b539c454-secret-thanos-querier-tls\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:50.846678 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:50.846654 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-529pb\" (UniqueName: \"kubernetes.io/projected/75373c0d-030c-4f0d-beeb-fc38b539c454-kube-api-access-529pb\") pod \"thanos-querier-7d7d8b6597-4hx86\" (UID: \"75373c0d-030c-4f0d-beeb-fc38b539c454\") " pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:51.037807 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.037728 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"
Apr 17 16:33:51.298682 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.298629 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" podUID="bdecf6bd-ce4a-4be2-8117-e763391f52a2" containerName="registry" containerID="cri-o://1ea63228e25c3c4f1219a5a85582638a6d1a3213f813f00eba93e89c566e7c50" gracePeriod=30
Apr 17 16:33:51.370585 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.370550 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7d7d8b6597-4hx86"]
Apr 17 16:33:51.383653 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:33:51.383619 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75373c0d_030c_4f0d_beeb_fc38b539c454.slice/crio-22a8893140e5893c9c6c085abbdd759cbcd7243bc67eb231cae92d4b3f54d4d2 WatchSource:0}: Error finding container 22a8893140e5893c9c6c085abbdd759cbcd7243bc67eb231cae92d4b3f54d4d2: Status 404 returned error can't find the container with id 22a8893140e5893c9c6c085abbdd759cbcd7243bc67eb231cae92d4b3f54d4d2
Apr 17 16:33:51.554669 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.554598 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76"
Apr 17 16:33:51.596637 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.595941 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-t7l69" event={"ID":"3021cc06-14c0-4d00-b02c-90d9eaecb08f","Type":"ContainerStarted","Data":"7abe9b717b0d4dc70ea8678b9313c48e56ec263e011faeb40e8956cf131aeb0c"}
Apr 17 16:33:51.596637 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.595987 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-t7l69" event={"ID":"3021cc06-14c0-4d00-b02c-90d9eaecb08f","Type":"ContainerStarted","Data":"228067fd700eab13457397d763944f04b7070c2701feb8cc2c5ae6ba1600dc45"}
Apr 17 16:33:51.601137 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.601102 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg" event={"ID":"1c0c6bd4-7556-4400-b3cb-97a101645604","Type":"ContainerStarted","Data":"dec8a576c0d4acbe4d8a1bf4148b2a218cd1491c7b20ff6807fb4a258f0a182e"}
Apr 17 16:33:51.601270 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.601146 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg" event={"ID":"1c0c6bd4-7556-4400-b3cb-97a101645604","Type":"ContainerStarted","Data":"d1a97c90c43294bfcc7432b5fa635db93e5442e1249d978f80679c2e434bca1a"}
Apr 17 16:33:51.605146 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.605122 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl" event={"ID":"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12","Type":"ContainerStarted","Data":"769a73a9bdd9968d81939451e4342653e1cb5e7b0f69a6f5dd7dbab1fd59f3cb"}
Apr 17 16:33:51.605334 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.605315 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl" event={"ID":"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12","Type":"ContainerStarted","Data":"f8cc63f3246ac0d711f91bf152492fcfaed687f10a1755be4983d0be6f6c2618"}
Apr 17 16:33:51.605436 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.605424 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl" event={"ID":"bb8e36fe-6c83-40c6-8e4f-7a439d8fee12","Type":"ContainerStarted","Data":"74e172e82f5df85c5306764425554ef48636ad6e266312fd24d8395545339e55"}
Apr 17 16:33:51.606884 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.606864 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86" event={"ID":"75373c0d-030c-4f0d-beeb-fc38b539c454","Type":"ContainerStarted","Data":"22a8893140e5893c9c6c085abbdd759cbcd7243bc67eb231cae92d4b3f54d4d2"}
Apr 17 16:33:51.609026 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.608951 2576 generic.go:358] "Generic (PLEG): container finished" podID="bdecf6bd-ce4a-4be2-8117-e763391f52a2" containerID="1ea63228e25c3c4f1219a5a85582638a6d1a3213f813f00eba93e89c566e7c50" exitCode=0
Apr 17 16:33:51.609179 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.609030 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" event={"ID":"bdecf6bd-ce4a-4be2-8117-e763391f52a2","Type":"ContainerDied","Data":"1ea63228e25c3c4f1219a5a85582638a6d1a3213f813f00eba93e89c566e7c50"}
Apr 17 16:33:51.609179 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.609062 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76" event={"ID":"bdecf6bd-ce4a-4be2-8117-e763391f52a2","Type":"ContainerDied","Data":"1af4f3e0f4d57f988d0278dc8fbe638b965456e51cd1f6735c7d30bcd0cd6829"}
Apr 17 16:33:51.609179 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.609083 2576 scope.go:117] "RemoveContainer" containerID="1ea63228e25c3c4f1219a5a85582638a6d1a3213f813f00eba93e89c566e7c50"
Apr 17 16:33:51.609430 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.609218 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d9bf8cf54-j9p76"
Apr 17 16:33:51.617865 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.617825 2576 scope.go:117] "RemoveContainer" containerID="1ea63228e25c3c4f1219a5a85582638a6d1a3213f813f00eba93e89c566e7c50"
Apr 17 16:33:51.618136 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:51.618100 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ea63228e25c3c4f1219a5a85582638a6d1a3213f813f00eba93e89c566e7c50\": container with ID starting with 1ea63228e25c3c4f1219a5a85582638a6d1a3213f813f00eba93e89c566e7c50 not found: ID does not exist" containerID="1ea63228e25c3c4f1219a5a85582638a6d1a3213f813f00eba93e89c566e7c50"
Apr 17 16:33:51.618229 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.618137 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ea63228e25c3c4f1219a5a85582638a6d1a3213f813f00eba93e89c566e7c50"} err="failed to get container status \"1ea63228e25c3c4f1219a5a85582638a6d1a3213f813f00eba93e89c566e7c50\": rpc error: code = NotFound desc = could not find container \"1ea63228e25c3c4f1219a5a85582638a6d1a3213f813f00eba93e89c566e7c50\": container with ID starting with 1ea63228e25c3c4f1219a5a85582638a6d1a3213f813f00eba93e89c566e7c50 not found: ID does not exist"
Apr 17 16:33:51.620920 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.620880 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-t7l69" podStartSLOduration=2.755496192 podStartE2EDuration="3.620868311s" podCreationTimestamp="2026-04-17 16:33:48 +0000 UTC" firstStartedPulling="2026-04-17 16:33:49.045931205 +0000 UTC m=+148.511114045" lastFinishedPulling="2026-04-17 16:33:49.911303324 +0000 UTC m=+149.376486164" observedRunningTime="2026-04-17 16:33:51.619788341 +0000 UTC m=+151.084971203" watchObservedRunningTime="2026-04-17 16:33:51.620868311 +0000 UTC m=+151.086051172"
Apr 17 16:33:51.645316 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.645295 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bdecf6bd-ce4a-4be2-8117-e763391f52a2-installation-pull-secrets\") pod \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") "
Apr 17 16:33:51.645438 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.645357 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bdecf6bd-ce4a-4be2-8117-e763391f52a2-trusted-ca\") pod \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") "
Apr 17 16:33:51.645438 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.645390 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-bound-sa-token\") pod \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") "
Apr 17 16:33:51.646514 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.645921 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bdecf6bd-ce4a-4be2-8117-e763391f52a2-ca-trust-extracted\") pod \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") "
Apr 17 16:33:51.646609 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.645852 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdecf6bd-ce4a-4be2-8117-e763391f52a2-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bdecf6bd-ce4a-4be2-8117-e763391f52a2" (UID: "bdecf6bd-ce4a-4be2-8117-e763391f52a2"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:33:51.646997 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.646972 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "bdecf6bd-ce4a-4be2-8117-e763391f52a2" (UID: "bdecf6bd-ce4a-4be2-8117-e763391f52a2"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:33:51.647082 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.647016 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-certificates\") pod \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") "
Apr 17 16:33:51.647082 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.647066 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bdecf6bd-ce4a-4be2-8117-e763391f52a2-image-registry-private-configuration\") pod \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") "
Apr 17 16:33:51.647442 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.647420 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls\") pod \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") "
Apr 17 16:33:51.647904 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.647864 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdecf6bd-ce4a-4be2-8117-e763391f52a2-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "bdecf6bd-ce4a-4be2-8117-e763391f52a2" (UID: "bdecf6bd-ce4a-4be2-8117-e763391f52a2"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:33:51.648404 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.648167 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mklph\" (UniqueName: \"kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-kube-api-access-mklph\") pod \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\" (UID: \"bdecf6bd-ce4a-4be2-8117-e763391f52a2\") "
Apr 17 16:33:51.648827 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.648804 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bdecf6bd-ce4a-4be2-8117-e763391f52a2" (UID: "bdecf6bd-ce4a-4be2-8117-e763391f52a2"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:33:51.651130 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.651095 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "bdecf6bd-ce4a-4be2-8117-e763391f52a2" (UID: "bdecf6bd-ce4a-4be2-8117-e763391f52a2"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:33:51.652823 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.651307 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-kube-api-access-mklph" (OuterVolumeSpecName: "kube-api-access-mklph") pod "bdecf6bd-ce4a-4be2-8117-e763391f52a2" (UID: "bdecf6bd-ce4a-4be2-8117-e763391f52a2"). InnerVolumeSpecName "kube-api-access-mklph". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:33:51.652823 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.651678 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bdecf6bd-ce4a-4be2-8117-e763391f52a2-installation-pull-secrets\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:33:51.652823 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.651700 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bdecf6bd-ce4a-4be2-8117-e763391f52a2-trusted-ca\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:33:51.652823 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.651715 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-bound-sa-token\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:33:51.652823 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.651731 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-certificates\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:33:51.653105 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.652979 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdecf6bd-ce4a-4be2-8117-e763391f52a2-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "bdecf6bd-ce4a-4be2-8117-e763391f52a2" (UID: "bdecf6bd-ce4a-4be2-8117-e763391f52a2"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:33:51.653105 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.653048 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-tkskl" podStartSLOduration=2.231438985 podStartE2EDuration="3.653033085s" podCreationTimestamp="2026-04-17 16:33:48 +0000 UTC" firstStartedPulling="2026-04-17 16:33:49.865063718 +0000 UTC m=+149.330246558" lastFinishedPulling="2026-04-17 16:33:51.28665782 +0000 UTC m=+150.751840658" observedRunningTime="2026-04-17 16:33:51.652274708 +0000 UTC m=+151.117457569" watchObservedRunningTime="2026-04-17 16:33:51.653033085 +0000 UTC m=+151.118215947"
Apr 17 16:33:51.657297 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.657271 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdecf6bd-ce4a-4be2-8117-e763391f52a2-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "bdecf6bd-ce4a-4be2-8117-e763391f52a2" (UID: "bdecf6bd-ce4a-4be2-8117-e763391f52a2"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:33:51.752546 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.752488 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bdecf6bd-ce4a-4be2-8117-e763391f52a2-ca-trust-extracted\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:33:51.752693 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.752551 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bdecf6bd-ce4a-4be2-8117-e763391f52a2-image-registry-private-configuration\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:33:51.752693 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.752568 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-registry-tls\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:33:51.752693 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.752583 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mklph\" (UniqueName: \"kubernetes.io/projected/bdecf6bd-ce4a-4be2-8117-e763391f52a2-kube-api-access-mklph\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:33:51.943464 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.943435 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5d9bf8cf54-j9p76"]
Apr 17 16:33:51.946568 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:51.946543 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5d9bf8cf54-j9p76"]
Apr 17 16:33:52.614507 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:52.614454 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg" event={"ID":"1c0c6bd4-7556-4400-b3cb-97a101645604","Type":"ContainerStarted","Data":"1dd532a1272018200808daf73f3b80644a73dde8b14fd0ceda6eb37b88f28315"}
Apr 17 16:33:52.644745 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:52.644695 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ztfmg" podStartSLOduration=3.282097407 podStartE2EDuration="4.644679357s" podCreationTimestamp="2026-04-17 16:33:48 +0000 UTC" firstStartedPulling="2026-04-17 16:33:50.677322985 +0000 UTC m=+150.142505838" lastFinishedPulling="2026-04-17 16:33:52.039904946 +0000 UTC m=+151.505087788" observedRunningTime="2026-04-17 16:33:52.642527201 +0000 UTC m=+152.107710061" watchObservedRunningTime="2026-04-17 16:33:52.644679357 +0000 UTC m=+152.109862217"
Apr 17 16:33:53.078161 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.078125 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7c9859c58c-vspmm"]
Apr 17 16:33:53.078533 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.078510 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdecf6bd-ce4a-4be2-8117-e763391f52a2" containerName="registry"
Apr 17 16:33:53.078533 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.078532 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdecf6bd-ce4a-4be2-8117-e763391f52a2" containerName="registry"
Apr 17 16:33:53.078682 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.078594 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bdecf6bd-ce4a-4be2-8117-e763391f52a2" containerName="registry"
Apr 17 16:33:53.081769 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.081747 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.084538 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.084513 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 16:33:53.084538 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.084525 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 17 16:33:53.084696 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.084543 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 17 16:33:53.085676 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.085642 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-9bpk2\"" Apr 17 16:33:53.085844 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.085828 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 17 16:33:53.085921 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.085876 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-9slu9i32rk266\"" Apr 17 16:33:53.094608 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.093150 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7c9859c58c-vspmm"] Apr 17 16:33:53.143174 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.143137 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdecf6bd-ce4a-4be2-8117-e763391f52a2" path="/var/lib/kubelet/pods/bdecf6bd-ce4a-4be2-8117-e763391f52a2/volumes" Apr 17 16:33:53.164881 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.164851 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw5b2\" (UniqueName: \"kubernetes.io/projected/d7174470-3052-4a86-bf2e-7b684aca84fe-kube-api-access-nw5b2\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.165044 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.164925 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d7174470-3052-4a86-bf2e-7b684aca84fe-metrics-server-audit-profiles\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.165044 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.164977 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d7174470-3052-4a86-bf2e-7b684aca84fe-secret-metrics-server-tls\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.165044 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.165002 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7174470-3052-4a86-bf2e-7b684aca84fe-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.165044 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.165030 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" 
(UniqueName: \"kubernetes.io/secret/d7174470-3052-4a86-bf2e-7b684aca84fe-secret-metrics-server-client-certs\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.165219 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.165121 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d7174470-3052-4a86-bf2e-7b684aca84fe-audit-log\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.165219 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.165173 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7174470-3052-4a86-bf2e-7b684aca84fe-client-ca-bundle\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.266378 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.266344 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d7174470-3052-4a86-bf2e-7b684aca84fe-metrics-server-audit-profiles\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.266560 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.266408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d7174470-3052-4a86-bf2e-7b684aca84fe-secret-metrics-server-tls\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " 
pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.266560 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.266432 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7174470-3052-4a86-bf2e-7b684aca84fe-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.266560 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.266455 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d7174470-3052-4a86-bf2e-7b684aca84fe-secret-metrics-server-client-certs\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.266560 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.266545 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d7174470-3052-4a86-bf2e-7b684aca84fe-audit-log\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.266759 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.266571 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7174470-3052-4a86-bf2e-7b684aca84fe-client-ca-bundle\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.266759 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.266633 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nw5b2\" (UniqueName: \"kubernetes.io/projected/d7174470-3052-4a86-bf2e-7b684aca84fe-kube-api-access-nw5b2\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.267467 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.267413 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d7174470-3052-4a86-bf2e-7b684aca84fe-metrics-server-audit-profiles\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.267603 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.267570 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d7174470-3052-4a86-bf2e-7b684aca84fe-audit-log\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.268085 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.268059 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7174470-3052-4a86-bf2e-7b684aca84fe-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.270273 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.270247 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7174470-3052-4a86-bf2e-7b684aca84fe-client-ca-bundle\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " 
pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.270387 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.270285 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d7174470-3052-4a86-bf2e-7b684aca84fe-secret-metrics-server-client-certs\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.270768 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.270747 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d7174470-3052-4a86-bf2e-7b684aca84fe-secret-metrics-server-tls\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.278489 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.278456 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw5b2\" (UniqueName: \"kubernetes.io/projected/d7174470-3052-4a86-bf2e-7b684aca84fe-kube-api-access-nw5b2\") pod \"metrics-server-7c9859c58c-vspmm\" (UID: \"d7174470-3052-4a86-bf2e-7b684aca84fe\") " pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.294617 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.294583 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-bj7s7"] Apr 17 16:33:53.299114 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.299092 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bj7s7" Apr 17 16:33:53.302538 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.302514 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 16:33:53.302828 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.302807 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-559p9\"" Apr 17 16:33:53.308779 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.308755 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-bj7s7"] Apr 17 16:33:53.367919 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.367836 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ba63d4bf-a505-45d2-8a8b-b6f74ad23868-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-bj7s7\" (UID: \"ba63d4bf-a505-45d2-8a8b-b6f74ad23868\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bj7s7" Apr 17 16:33:53.395715 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.395663 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:33:53.468796 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.468760 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ba63d4bf-a505-45d2-8a8b-b6f74ad23868-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-bj7s7\" (UID: \"ba63d4bf-a505-45d2-8a8b-b6f74ad23868\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bj7s7" Apr 17 16:33:53.468952 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:53.468895 2576 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 17 16:33:53.469014 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:53.468966 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba63d4bf-a505-45d2-8a8b-b6f74ad23868-monitoring-plugin-cert podName:ba63d4bf-a505-45d2-8a8b-b6f74ad23868 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:53.968949227 +0000 UTC m=+153.434132066 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/ba63d4bf-a505-45d2-8a8b-b6f74ad23868-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-bj7s7" (UID: "ba63d4bf-a505-45d2-8a8b-b6f74ad23868") : secret "monitoring-plugin-cert" not found Apr 17 16:33:53.573659 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.573630 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7c9859c58c-vspmm"] Apr 17 16:33:53.576756 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:33:53.576728 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7174470_3052_4a86_bf2e_7b684aca84fe.slice/crio-132d476e8a90dbb93196f08d8cd36c8603daaab648a9f545bafb86c071a80762 WatchSource:0}: Error finding container 132d476e8a90dbb93196f08d8cd36c8603daaab648a9f545bafb86c071a80762: Status 404 returned error can't find the container with id 132d476e8a90dbb93196f08d8cd36c8603daaab648a9f545bafb86c071a80762 Apr 17 16:33:53.619601 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.619533 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" event={"ID":"d7174470-3052-4a86-bf2e-7b684aca84fe","Type":"ContainerStarted","Data":"132d476e8a90dbb93196f08d8cd36c8603daaab648a9f545bafb86c071a80762"} Apr 17 16:33:53.621153 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.621108 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86" event={"ID":"75373c0d-030c-4f0d-beeb-fc38b539c454","Type":"ContainerStarted","Data":"ef0667aa98c0feda18acd34db48b732a09af70b7de3126271b0c2ff6636d08db"} Apr 17 16:33:53.973490 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.973452 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/ba63d4bf-a505-45d2-8a8b-b6f74ad23868-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-bj7s7\" (UID: \"ba63d4bf-a505-45d2-8a8b-b6f74ad23868\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bj7s7" Apr 17 16:33:53.975894 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:53.975873 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ba63d4bf-a505-45d2-8a8b-b6f74ad23868-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-bj7s7\" (UID: \"ba63d4bf-a505-45d2-8a8b-b6f74ad23868\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bj7s7" Apr 17 16:33:54.209909 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.209868 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bj7s7" Apr 17 16:33:54.351984 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.351947 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-bj7s7"] Apr 17 16:33:54.361431 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:33:54.361183 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba63d4bf_a505_45d2_8a8b_b6f74ad23868.slice/crio-27d6154b80951dbbdd9bc949a53c455c899fed551ae56e65b2d208a63ef6b5fc WatchSource:0}: Error finding container 27d6154b80951dbbdd9bc949a53c455c899fed551ae56e65b2d208a63ef6b5fc: Status 404 returned error can't find the container with id 27d6154b80951dbbdd9bc949a53c455c899fed551ae56e65b2d208a63ef6b5fc Apr 17 16:33:54.625708 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.625623 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bj7s7" 
event={"ID":"ba63d4bf-a505-45d2-8a8b-b6f74ad23868","Type":"ContainerStarted","Data":"27d6154b80951dbbdd9bc949a53c455c899fed551ae56e65b2d208a63ef6b5fc"} Apr 17 16:33:54.627716 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.627685 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86" event={"ID":"75373c0d-030c-4f0d-beeb-fc38b539c454","Type":"ContainerStarted","Data":"ebab896dc23f3e27c62660d9df0f9c36c83b62b6ff1e87f52b19bd8539defaf7"} Apr 17 16:33:54.627840 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.627718 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86" event={"ID":"75373c0d-030c-4f0d-beeb-fc38b539c454","Type":"ContainerStarted","Data":"1de45e00424788844b572dc4d875f0e433f0c796c8a7b0a3da3cf1d6894a9fce"} Apr 17 16:33:54.894220 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.894083 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:33:54.898600 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.898574 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:54.903114 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.903079 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 16:33:54.903244 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.903145 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 16:33:54.903426 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.903405 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 16:33:54.903570 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.903519 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-2ktmc\"" Apr 17 16:33:54.903630 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.903565 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 16:33:54.903800 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.903778 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 16:33:54.903983 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.903957 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 16:33:54.904067 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.904017 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-emith5b2q69lv\"" Apr 17 16:33:54.904277 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.904233 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 16:33:54.904277 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.904247 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 16:33:54.904428 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.904305 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 16:33:54.904428 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.904367 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 16:33:54.910169 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.910138 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 16:33:54.918804 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.918779 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 16:33:54.919439 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.919412 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:33:54.983152 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.983119 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a6bdf597-226a-42e5-a554-67152d8c8f03-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:54.983290 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.983158 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:54.983290 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.983193 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:54.983290 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.983211 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:54.983290 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.983260 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:54.983290 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.983288 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-config\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:54.983455 ip-10-0-134-142 
kubenswrapper[2576]: I0417 16:33:54.983307 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a6bdf597-226a-42e5-a554-67152d8c8f03-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:54.983455 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.983326 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:54.983455 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.983356 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:54.983455 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.983387 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6bdf597-226a-42e5-a554-67152d8c8f03-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:54.983455 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.983405 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tfzc\" (UniqueName: 
\"kubernetes.io/projected/a6bdf597-226a-42e5-a554-67152d8c8f03-kube-api-access-2tfzc\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:54.983455 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.983421 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:54.983455 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.983438 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a6bdf597-226a-42e5-a554-67152d8c8f03-config-out\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:54.983455 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.983453 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-web-config\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:54.983697 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.983467 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6bdf597-226a-42e5-a554-67152d8c8f03-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:54.983697 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.983572 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6bdf597-226a-42e5-a554-67152d8c8f03-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:54.983697 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.983593 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a6bdf597-226a-42e5-a554-67152d8c8f03-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:54.983697 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:54.983615 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6bdf597-226a-42e5-a554-67152d8c8f03-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.084199 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.084162 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-web-config\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.084313 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.084204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6bdf597-226a-42e5-a554-67152d8c8f03-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.084313 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.084250 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6bdf597-226a-42e5-a554-67152d8c8f03-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.084313 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.084275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a6bdf597-226a-42e5-a554-67152d8c8f03-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.084471 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.084315 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6bdf597-226a-42e5-a554-67152d8c8f03-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.084471 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.084364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a6bdf597-226a-42e5-a554-67152d8c8f03-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.084471 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.084405 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.084471 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.084459 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.084716 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.084487 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.084716 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.084540 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.084716 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.084566 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-config\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.084716 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.084591 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a6bdf597-226a-42e5-a554-67152d8c8f03-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.084716 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.084621 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.084716 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.084672 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.084716 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.084708 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6bdf597-226a-42e5-a554-67152d8c8f03-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.085026 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.084740 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2tfzc\" (UniqueName: \"kubernetes.io/projected/a6bdf597-226a-42e5-a554-67152d8c8f03-kube-api-access-2tfzc\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.085026 ip-10-0-134-142 
kubenswrapper[2576]: I0417 16:33:55.084784 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.085026 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.084818 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a6bdf597-226a-42e5-a554-67152d8c8f03-config-out\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.085184 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.085112 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6bdf597-226a-42e5-a554-67152d8c8f03-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.085955 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.085631 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6bdf597-226a-42e5-a554-67152d8c8f03-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.085955 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.085941 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6bdf597-226a-42e5-a554-67152d8c8f03-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.086640 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.086616 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6bdf597-226a-42e5-a554-67152d8c8f03-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.089452 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.088718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a6bdf597-226a-42e5-a554-67152d8c8f03-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.089452 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.089424 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a6bdf597-226a-42e5-a554-67152d8c8f03-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.090652 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.090388 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-config\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.090652 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.090612 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: 
\"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.090817 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.090680 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.091804 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.091779 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a6bdf597-226a-42e5-a554-67152d8c8f03-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.091964 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.091946 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a6bdf597-226a-42e5-a554-67152d8c8f03-config-out\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.092032 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.092011 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.092105 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.092069 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-web-config\") pod \"prometheus-k8s-0\" (UID: 
\"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.092105 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.092096 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.092524 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.092486 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.092977 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.092954 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.093709 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.093676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a6bdf597-226a-42e5-a554-67152d8c8f03-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.100048 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.100028 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tfzc\" (UniqueName: \"kubernetes.io/projected/a6bdf597-226a-42e5-a554-67152d8c8f03-kube-api-access-2tfzc\") 
pod \"prometheus-k8s-0\" (UID: \"a6bdf597-226a-42e5-a554-67152d8c8f03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.211172 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.211140 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:55.357784 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.357720 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:33:55.360911 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:33:55.360873 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6bdf597_226a_42e5_a554_67152d8c8f03.slice/crio-bbe3d78cdc575db98372986ea17cf055b689615d3fc62dc1a3dbde0019612f87 WatchSource:0}: Error finding container bbe3d78cdc575db98372986ea17cf055b689615d3fc62dc1a3dbde0019612f87: Status 404 returned error can't find the container with id bbe3d78cdc575db98372986ea17cf055b689615d3fc62dc1a3dbde0019612f87 Apr 17 16:33:55.634469 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.634429 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86" event={"ID":"75373c0d-030c-4f0d-beeb-fc38b539c454","Type":"ContainerStarted","Data":"18ed0bb1e4d9407e44b36e59dd4add04d03ce1d20cd0f88abde3ce9b94d2441c"} Apr 17 16:33:55.634937 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.634475 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86" event={"ID":"75373c0d-030c-4f0d-beeb-fc38b539c454","Type":"ContainerStarted","Data":"7c56a9a198968659123efb8482c5a5f18ddd84ac16b3bc439f84154ea3a51f57"} Apr 17 16:33:55.634937 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.634514 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86" 
event={"ID":"75373c0d-030c-4f0d-beeb-fc38b539c454","Type":"ContainerStarted","Data":"433287ba1774f16680f0837875b6019b246b2d89accc556007717b51f15c2859"} Apr 17 16:33:55.634937 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.634629 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86" Apr 17 16:33:55.636361 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.636332 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" event={"ID":"d7174470-3052-4a86-bf2e-7b684aca84fe","Type":"ContainerStarted","Data":"027cf8c6156f0a0aaf1bd73d475173bac8172654c86dba69e17f2653bd69ea87"} Apr 17 16:33:55.637709 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.637677 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a6bdf597-226a-42e5-a554-67152d8c8f03","Type":"ContainerStarted","Data":"bbe3d78cdc575db98372986ea17cf055b689615d3fc62dc1a3dbde0019612f87"} Apr 17 16:33:55.678446 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.678377 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86" podStartSLOduration=2.077442442 podStartE2EDuration="5.678359561s" podCreationTimestamp="2026-04-17 16:33:50 +0000 UTC" firstStartedPulling="2026-04-17 16:33:51.403342598 +0000 UTC m=+150.868525436" lastFinishedPulling="2026-04-17 16:33:55.004259705 +0000 UTC m=+154.469442555" observedRunningTime="2026-04-17 16:33:55.678267553 +0000 UTC m=+155.143450416" watchObservedRunningTime="2026-04-17 16:33:55.678359561 +0000 UTC m=+155.143542424" Apr 17 16:33:55.716248 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:55.716186 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" podStartSLOduration=1.28781587 podStartE2EDuration="2.716165067s" 
podCreationTimestamp="2026-04-17 16:33:53 +0000 UTC" firstStartedPulling="2026-04-17 16:33:53.578700813 +0000 UTC m=+153.043883656" lastFinishedPulling="2026-04-17 16:33:55.007050011 +0000 UTC m=+154.472232853" observedRunningTime="2026-04-17 16:33:55.714177713 +0000 UTC m=+155.179485462" watchObservedRunningTime="2026-04-17 16:33:55.716165067 +0000 UTC m=+155.181347928" Apr 17 16:33:56.642676 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:56.642636 2576 generic.go:358] "Generic (PLEG): container finished" podID="a6bdf597-226a-42e5-a554-67152d8c8f03" containerID="5dca2ce220481242d71250762a9479df4e9603f7003b4df6cdea25c809cd9b8d" exitCode=0 Apr 17 16:33:56.643096 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:56.642778 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a6bdf597-226a-42e5-a554-67152d8c8f03","Type":"ContainerDied","Data":"5dca2ce220481242d71250762a9479df4e9603f7003b4df6cdea25c809cd9b8d"} Apr 17 16:33:56.645373 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:56.645340 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bj7s7" event={"ID":"ba63d4bf-a505-45d2-8a8b-b6f74ad23868","Type":"ContainerStarted","Data":"f4433ce7fd6510c393dbbc9f0f9a354e940d6ca55b051d177fe9ea11b68421d1"} Apr 17 16:33:56.645838 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:56.645801 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bj7s7" Apr 17 16:33:56.650530 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:56.650458 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bj7s7" Apr 17 16:33:56.684822 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:56.684791 2576 patch_prober.go:28] interesting pod/image-registry-d78769859-djn2h container/registry namespace/openshift-image-registry: Liveness probe 
status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 16:33:56.684967 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:56.684843 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-d78769859-djn2h" podUID="d1f4421a-5f1c-468c-9392-a80b17d865af" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:33:56.693924 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:56.693879 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bj7s7" podStartSLOduration=1.656804308 podStartE2EDuration="3.693865687s" podCreationTimestamp="2026-04-17 16:33:53 +0000 UTC" firstStartedPulling="2026-04-17 16:33:54.36390449 +0000 UTC m=+153.829087335" lastFinishedPulling="2026-04-17 16:33:56.400965862 +0000 UTC m=+155.866148714" observedRunningTime="2026-04-17 16:33:56.691978293 +0000 UTC m=+156.157161154" watchObservedRunningTime="2026-04-17 16:33:56.693865687 +0000 UTC m=+156.159048598" Apr 17 16:33:57.443044 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:57.443000 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-k5nlz" podUID="deee814d-1106-4662-bd60-3fc176424f4c" Apr 17 16:33:57.456200 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:33:57.456161 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-9vg62" podUID="13a6aa7e-beeb-4695-9196-0c33a34de454" Apr 17 16:33:57.648606 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:57.648568 2576 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k5nlz" Apr 17 16:33:58.518732 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:58.518700 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-d78769859-djn2h" Apr 17 16:33:59.658336 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:59.658304 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a6bdf597-226a-42e5-a554-67152d8c8f03","Type":"ContainerStarted","Data":"6282af43956e9132907ac7434180d307335e9ba8f916a7653149b240c6c94772"} Apr 17 16:33:59.658336 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:59.658338 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a6bdf597-226a-42e5-a554-67152d8c8f03","Type":"ContainerStarted","Data":"ee67e5ac8d3997eb3114623a8abf7040c827a469b20548767d6abadeb677547b"} Apr 17 16:33:59.658757 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:59.658350 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a6bdf597-226a-42e5-a554-67152d8c8f03","Type":"ContainerStarted","Data":"53e9202fe2e3ee26fbd9a2fbabddb0d69fb626f3ab24242ed3c79c93795019a6"} Apr 17 16:33:59.658757 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:59.658363 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a6bdf597-226a-42e5-a554-67152d8c8f03","Type":"ContainerStarted","Data":"85e413f489b442473e91ba7af96e56520082967e284d8d9700c60bd20aa134f4"} Apr 17 16:33:59.658757 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:59.658375 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a6bdf597-226a-42e5-a554-67152d8c8f03","Type":"ContainerStarted","Data":"0306d8bf974d8f3c1720232536ae25599159f36f01469fb0f6c64ed5834d897c"} Apr 17 
16:33:59.658757 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:59.658387 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a6bdf597-226a-42e5-a554-67152d8c8f03","Type":"ContainerStarted","Data":"81c102d1c6fd19b8c81690d659992476531f2c0724ce3dee49382d7906106081"} Apr 17 16:33:59.689920 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:33:59.689862 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.856239255 podStartE2EDuration="5.689842696s" podCreationTimestamp="2026-04-17 16:33:54 +0000 UTC" firstStartedPulling="2026-04-17 16:33:55.363640865 +0000 UTC m=+154.828823708" lastFinishedPulling="2026-04-17 16:33:59.197244308 +0000 UTC m=+158.662427149" observedRunningTime="2026-04-17 16:33:59.687695879 +0000 UTC m=+159.152878740" watchObservedRunningTime="2026-04-17 16:33:59.689842696 +0000 UTC m=+159.155025558" Apr 17 16:34:00.211262 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:00.211226 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:01.651694 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:01.651667 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7d7d8b6597-4hx86" Apr 17 16:34:02.355616 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:02.355579 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls\") pod \"dns-default-9vg62\" (UID: \"13a6aa7e-beeb-4695-9196-0c33a34de454\") " pod="openshift-dns/dns-default-9vg62" Apr 17 16:34:02.355831 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:02.355642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert\") pod \"ingress-canary-k5nlz\" (UID: \"deee814d-1106-4662-bd60-3fc176424f4c\") " pod="openshift-ingress-canary/ingress-canary-k5nlz" Apr 17 16:34:02.358050 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:02.358025 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13a6aa7e-beeb-4695-9196-0c33a34de454-metrics-tls\") pod \"dns-default-9vg62\" (UID: \"13a6aa7e-beeb-4695-9196-0c33a34de454\") " pod="openshift-dns/dns-default-9vg62" Apr 17 16:34:02.358166 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:02.358061 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/deee814d-1106-4662-bd60-3fc176424f4c-cert\") pod \"ingress-canary-k5nlz\" (UID: \"deee814d-1106-4662-bd60-3fc176424f4c\") " pod="openshift-ingress-canary/ingress-canary-k5nlz" Apr 17 16:34:02.452141 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:02.452111 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qsxk4\"" Apr 17 16:34:02.460323 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:02.460299 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k5nlz" Apr 17 16:34:02.579635 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:02.579613 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k5nlz"] Apr 17 16:34:02.582006 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:34:02.581968 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeee814d_1106_4662_bd60_3fc176424f4c.slice/crio-c2ae448864030989adc3592a8e1881357fd08f1ccf5e1e766506473426b71c45 WatchSource:0}: Error finding container c2ae448864030989adc3592a8e1881357fd08f1ccf5e1e766506473426b71c45: Status 404 returned error can't find the container with id c2ae448864030989adc3592a8e1881357fd08f1ccf5e1e766506473426b71c45 Apr 17 16:34:02.667350 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:02.667314 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k5nlz" event={"ID":"deee814d-1106-4662-bd60-3fc176424f4c","Type":"ContainerStarted","Data":"c2ae448864030989adc3592a8e1881357fd08f1ccf5e1e766506473426b71c45"} Apr 17 16:34:04.674298 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:04.674266 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k5nlz" event={"ID":"deee814d-1106-4662-bd60-3fc176424f4c","Type":"ContainerStarted","Data":"8c4045364aed8b00fa80854f8900999ff7fc1aa8bc325ca606cf8f2adcf7992b"} Apr 17 16:34:04.691541 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:04.691436 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-k5nlz" podStartSLOduration=129.270225782 podStartE2EDuration="2m10.69141731s" podCreationTimestamp="2026-04-17 16:31:54 +0000 UTC" firstStartedPulling="2026-04-17 16:34:02.583931346 +0000 UTC m=+162.049114188" lastFinishedPulling="2026-04-17 16:34:04.005122872 +0000 UTC m=+163.470305716" 
observedRunningTime="2026-04-17 16:34:04.690318891 +0000 UTC m=+164.155501751" watchObservedRunningTime="2026-04-17 16:34:04.69141731 +0000 UTC m=+164.156600170" Apr 17 16:34:11.141975 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:11.141906 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9vg62" Apr 17 16:34:11.144704 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:11.144685 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-78gjs\"" Apr 17 16:34:11.153102 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:11.153084 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9vg62" Apr 17 16:34:11.274480 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:11.274452 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9vg62"] Apr 17 16:34:11.276579 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:34:11.276549 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13a6aa7e_beeb_4695_9196_0c33a34de454.slice/crio-157cad0824b107366debf20ec4462b4c9a679c76dfabbeeb229d2a27f18214a4 WatchSource:0}: Error finding container 157cad0824b107366debf20ec4462b4c9a679c76dfabbeeb229d2a27f18214a4: Status 404 returned error can't find the container with id 157cad0824b107366debf20ec4462b4c9a679c76dfabbeeb229d2a27f18214a4 Apr 17 16:34:11.696940 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:11.696896 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9vg62" event={"ID":"13a6aa7e-beeb-4695-9196-0c33a34de454","Type":"ContainerStarted","Data":"157cad0824b107366debf20ec4462b4c9a679c76dfabbeeb229d2a27f18214a4"} Apr 17 16:34:12.701066 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:12.700982 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9vg62" 
event={"ID":"13a6aa7e-beeb-4695-9196-0c33a34de454","Type":"ContainerStarted","Data":"5d922b9ad0b254f1cbc966892b0e256cee5212df40f9d882469ced271a93b382"} Apr 17 16:34:12.701066 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:12.701019 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9vg62" event={"ID":"13a6aa7e-beeb-4695-9196-0c33a34de454","Type":"ContainerStarted","Data":"7a77b6748113c1fe4fe83a7097b96792c03b0106dd1e0f777dd4b83367d13453"} Apr 17 16:34:12.701487 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:12.701184 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-9vg62" Apr 17 16:34:12.721977 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:12.721928 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9vg62" podStartSLOduration=137.548442904 podStartE2EDuration="2m18.721911264s" podCreationTimestamp="2026-04-17 16:31:54 +0000 UTC" firstStartedPulling="2026-04-17 16:34:11.278526552 +0000 UTC m=+170.743709394" lastFinishedPulling="2026-04-17 16:34:12.451994898 +0000 UTC m=+171.917177754" observedRunningTime="2026-04-17 16:34:12.720618209 +0000 UTC m=+172.185801069" watchObservedRunningTime="2026-04-17 16:34:12.721911264 +0000 UTC m=+172.187094161" Apr 17 16:34:13.395761 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:13.395733 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:34:13.395761 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:13.395772 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:34:22.706195 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:22.706165 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9vg62" Apr 17 16:34:33.401961 ip-10-0-134-142 
kubenswrapper[2576]: I0417 16:34:33.401926 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:34:33.405775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:33.405751 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7c9859c58c-vspmm" Apr 17 16:34:34.769275 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:34.769240 2576 generic.go:358] "Generic (PLEG): container finished" podID="90c4edf8-ec15-4ea5-b1c6-ede355520321" containerID="90e84b2413b48eb00e5b6c852781097bf65a40ca238bce273a64f73209606055" exitCode=0 Apr 17 16:34:34.769677 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:34.769309 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ctq4m" event={"ID":"90c4edf8-ec15-4ea5-b1c6-ede355520321","Type":"ContainerDied","Data":"90e84b2413b48eb00e5b6c852781097bf65a40ca238bce273a64f73209606055"} Apr 17 16:34:34.769677 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:34.769638 2576 scope.go:117] "RemoveContainer" containerID="90e84b2413b48eb00e5b6c852781097bf65a40ca238bce273a64f73209606055" Apr 17 16:34:35.775342 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:35.775308 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ctq4m" event={"ID":"90c4edf8-ec15-4ea5-b1c6-ede355520321","Type":"ContainerStarted","Data":"168b4ef9a2e55eb61b820a34ebbe7294a4acc1ff2d3d41c5ccd0543ea81e42c6"} Apr 17 16:34:55.211985 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:55.211941 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:55.228071 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:55.228042 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:55.850161 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:34:55.850133 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:51.523777 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:35:51.523742 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-snx5l"] Apr 17 16:35:51.527334 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:35:51.527312 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-snx5l" Apr 17 16:35:51.529860 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:35:51.529838 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 16:35:51.545993 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:35:51.545969 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-snx5l"] Apr 17 16:35:51.568285 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:35:51.568253 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ecab24eb-d60d-4495-8c68-2f114d0a30f9-kubelet-config\") pod \"global-pull-secret-syncer-snx5l\" (UID: \"ecab24eb-d60d-4495-8c68-2f114d0a30f9\") " pod="kube-system/global-pull-secret-syncer-snx5l" Apr 17 16:35:51.568410 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:35:51.568307 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ecab24eb-d60d-4495-8c68-2f114d0a30f9-original-pull-secret\") pod \"global-pull-secret-syncer-snx5l\" (UID: \"ecab24eb-d60d-4495-8c68-2f114d0a30f9\") " pod="kube-system/global-pull-secret-syncer-snx5l" Apr 17 16:35:51.568410 ip-10-0-134-142 kubenswrapper[2576]: I0417 
16:35:51.568396 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ecab24eb-d60d-4495-8c68-2f114d0a30f9-dbus\") pod \"global-pull-secret-syncer-snx5l\" (UID: \"ecab24eb-d60d-4495-8c68-2f114d0a30f9\") " pod="kube-system/global-pull-secret-syncer-snx5l" Apr 17 16:35:51.668830 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:35:51.668790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ecab24eb-d60d-4495-8c68-2f114d0a30f9-original-pull-secret\") pod \"global-pull-secret-syncer-snx5l\" (UID: \"ecab24eb-d60d-4495-8c68-2f114d0a30f9\") " pod="kube-system/global-pull-secret-syncer-snx5l" Apr 17 16:35:51.669029 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:35:51.668852 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ecab24eb-d60d-4495-8c68-2f114d0a30f9-dbus\") pod \"global-pull-secret-syncer-snx5l\" (UID: \"ecab24eb-d60d-4495-8c68-2f114d0a30f9\") " pod="kube-system/global-pull-secret-syncer-snx5l" Apr 17 16:35:51.669029 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:35:51.668919 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ecab24eb-d60d-4495-8c68-2f114d0a30f9-kubelet-config\") pod \"global-pull-secret-syncer-snx5l\" (UID: \"ecab24eb-d60d-4495-8c68-2f114d0a30f9\") " pod="kube-system/global-pull-secret-syncer-snx5l" Apr 17 16:35:51.669029 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:35:51.669010 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ecab24eb-d60d-4495-8c68-2f114d0a30f9-kubelet-config\") pod \"global-pull-secret-syncer-snx5l\" (UID: \"ecab24eb-d60d-4495-8c68-2f114d0a30f9\") " pod="kube-system/global-pull-secret-syncer-snx5l" Apr 
17 16:35:51.669197 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:35:51.669043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ecab24eb-d60d-4495-8c68-2f114d0a30f9-dbus\") pod \"global-pull-secret-syncer-snx5l\" (UID: \"ecab24eb-d60d-4495-8c68-2f114d0a30f9\") " pod="kube-system/global-pull-secret-syncer-snx5l" Apr 17 16:35:51.670994 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:35:51.670970 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ecab24eb-d60d-4495-8c68-2f114d0a30f9-original-pull-secret\") pod \"global-pull-secret-syncer-snx5l\" (UID: \"ecab24eb-d60d-4495-8c68-2f114d0a30f9\") " pod="kube-system/global-pull-secret-syncer-snx5l" Apr 17 16:35:51.836094 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:35:51.836003 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-snx5l" Apr 17 16:35:51.956792 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:35:51.956593 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-snx5l"] Apr 17 16:35:51.959781 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:35:51.959755 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecab24eb_d60d_4495_8c68_2f114d0a30f9.slice/crio-104f8ff594dd40158eb5a0d82147085b626aea1ee7a9fce639be16e71d02cea2 WatchSource:0}: Error finding container 104f8ff594dd40158eb5a0d82147085b626aea1ee7a9fce639be16e71d02cea2: Status 404 returned error can't find the container with id 104f8ff594dd40158eb5a0d82147085b626aea1ee7a9fce639be16e71d02cea2 Apr 17 16:35:52.009669 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:35:52.009635 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-snx5l" 
event={"ID":"ecab24eb-d60d-4495-8c68-2f114d0a30f9","Type":"ContainerStarted","Data":"104f8ff594dd40158eb5a0d82147085b626aea1ee7a9fce639be16e71d02cea2"} Apr 17 16:35:56.026097 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:35:56.026053 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-snx5l" event={"ID":"ecab24eb-d60d-4495-8c68-2f114d0a30f9","Type":"ContainerStarted","Data":"eaeb4817556279c6aadf2774abf27465c677aef2cdabba271a429e23bdfa1fb7"} Apr 17 16:35:56.041514 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:35:56.041442 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-snx5l" podStartSLOduration=1.12122647 podStartE2EDuration="5.041428802s" podCreationTimestamp="2026-04-17 16:35:51 +0000 UTC" firstStartedPulling="2026-04-17 16:35:51.96179132 +0000 UTC m=+271.426974159" lastFinishedPulling="2026-04-17 16:35:55.881993652 +0000 UTC m=+275.347176491" observedRunningTime="2026-04-17 16:35:56.040806001 +0000 UTC m=+275.505988862" watchObservedRunningTime="2026-04-17 16:35:56.041428802 +0000 UTC m=+275.506611694" Apr 17 16:36:21.034290 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:36:21.034254 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/ovn-acl-logging/0.log" Apr 17 16:36:21.034952 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:36:21.034929 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/ovn-acl-logging/0.log" Apr 17 16:36:21.048664 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:36:21.048635 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 16:38:34.537879 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.537831 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss"] Apr 17 
16:38:34.540275 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.540258 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss" Apr 17 16:38:34.556104 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.556073 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-rdcg4\"" Apr 17 16:38:34.556104 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.556092 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 16:38:34.556317 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.556120 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 16:38:34.556317 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.556074 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:38:34.556317 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.556126 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 16:38:34.557205 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.557187 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 16:38:34.575840 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.575813 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss"] Apr 17 16:38:34.629976 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.629938 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd251306-7990-42ff-a8aa-3d288f473b17-cert\") 
pod \"lws-controller-manager-5f68f6fcb9-kvwss\" (UID: \"fd251306-7990-42ff-a8aa-3d288f473b17\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss" Apr 17 16:38:34.629976 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.629975 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49dbk\" (UniqueName: \"kubernetes.io/projected/fd251306-7990-42ff-a8aa-3d288f473b17-kube-api-access-49dbk\") pod \"lws-controller-manager-5f68f6fcb9-kvwss\" (UID: \"fd251306-7990-42ff-a8aa-3d288f473b17\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss" Apr 17 16:38:34.630177 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.630064 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fd251306-7990-42ff-a8aa-3d288f473b17-manager-config\") pod \"lws-controller-manager-5f68f6fcb9-kvwss\" (UID: \"fd251306-7990-42ff-a8aa-3d288f473b17\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss" Apr 17 16:38:34.630177 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.630104 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fd251306-7990-42ff-a8aa-3d288f473b17-metrics-cert\") pod \"lws-controller-manager-5f68f6fcb9-kvwss\" (UID: \"fd251306-7990-42ff-a8aa-3d288f473b17\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss" Apr 17 16:38:34.731088 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.731041 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd251306-7990-42ff-a8aa-3d288f473b17-cert\") pod \"lws-controller-manager-5f68f6fcb9-kvwss\" (UID: \"fd251306-7990-42ff-a8aa-3d288f473b17\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss" Apr 17 
16:38:34.731266 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.731095 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49dbk\" (UniqueName: \"kubernetes.io/projected/fd251306-7990-42ff-a8aa-3d288f473b17-kube-api-access-49dbk\") pod \"lws-controller-manager-5f68f6fcb9-kvwss\" (UID: \"fd251306-7990-42ff-a8aa-3d288f473b17\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss" Apr 17 16:38:34.731266 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.731164 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fd251306-7990-42ff-a8aa-3d288f473b17-manager-config\") pod \"lws-controller-manager-5f68f6fcb9-kvwss\" (UID: \"fd251306-7990-42ff-a8aa-3d288f473b17\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss" Apr 17 16:38:34.731266 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.731201 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fd251306-7990-42ff-a8aa-3d288f473b17-metrics-cert\") pod \"lws-controller-manager-5f68f6fcb9-kvwss\" (UID: \"fd251306-7990-42ff-a8aa-3d288f473b17\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss" Apr 17 16:38:34.731890 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.731862 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fd251306-7990-42ff-a8aa-3d288f473b17-manager-config\") pod \"lws-controller-manager-5f68f6fcb9-kvwss\" (UID: \"fd251306-7990-42ff-a8aa-3d288f473b17\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss" Apr 17 16:38:34.733750 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.733721 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/fd251306-7990-42ff-a8aa-3d288f473b17-metrics-cert\") pod \"lws-controller-manager-5f68f6fcb9-kvwss\" (UID: \"fd251306-7990-42ff-a8aa-3d288f473b17\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss" Apr 17 16:38:34.733895 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.733874 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd251306-7990-42ff-a8aa-3d288f473b17-cert\") pod \"lws-controller-manager-5f68f6fcb9-kvwss\" (UID: \"fd251306-7990-42ff-a8aa-3d288f473b17\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss" Apr 17 16:38:34.743694 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.743667 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49dbk\" (UniqueName: \"kubernetes.io/projected/fd251306-7990-42ff-a8aa-3d288f473b17-kube-api-access-49dbk\") pod \"lws-controller-manager-5f68f6fcb9-kvwss\" (UID: \"fd251306-7990-42ff-a8aa-3d288f473b17\") " pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss" Apr 17 16:38:34.849795 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.849710 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss" Apr 17 16:38:34.997933 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:38:34.997889 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd251306_7990_42ff_a8aa_3d288f473b17.slice/crio-ed03e9e18b20222fdd1c0aa24e01cd976d6375f2369deeb854d092bd729ae74b WatchSource:0}: Error finding container ed03e9e18b20222fdd1c0aa24e01cd976d6375f2369deeb854d092bd729ae74b: Status 404 returned error can't find the container with id ed03e9e18b20222fdd1c0aa24e01cd976d6375f2369deeb854d092bd729ae74b Apr 17 16:38:34.999797 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:34.999776 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:38:35.000273 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:35.000171 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss"] Apr 17 16:38:35.504279 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:35.504200 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss" event={"ID":"fd251306-7990-42ff-a8aa-3d288f473b17","Type":"ContainerStarted","Data":"ed03e9e18b20222fdd1c0aa24e01cd976d6375f2369deeb854d092bd729ae74b"} Apr 17 16:38:37.513890 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:37.513849 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss" event={"ID":"fd251306-7990-42ff-a8aa-3d288f473b17","Type":"ContainerStarted","Data":"724e871615cd9a5eafa431ed651e55f6791cd67b3c28ce35ec85d4f48dea988a"} Apr 17 16:38:37.514248 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:37.513929 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss" Apr 17 
16:38:37.537671 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:37.537570 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss" podStartSLOduration=1.249478956 podStartE2EDuration="3.537554793s" podCreationTimestamp="2026-04-17 16:38:34 +0000 UTC" firstStartedPulling="2026-04-17 16:38:34.999908377 +0000 UTC m=+434.465091226" lastFinishedPulling="2026-04-17 16:38:37.28798422 +0000 UTC m=+436.753167063" observedRunningTime="2026-04-17 16:38:37.536212846 +0000 UTC m=+437.001395708" watchObservedRunningTime="2026-04-17 16:38:37.537554793 +0000 UTC m=+437.002737654" Apr 17 16:38:48.519552 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:48.519525 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5f68f6fcb9-kvwss" Apr 17 16:38:56.605282 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.605245 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"] Apr 17 16:38:56.608140 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.608118 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7" Apr 17 16:38:56.620945 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.620902 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-r8c55\"" Apr 17 16:38:56.621069 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.621009 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 16:38:56.636207 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.636173 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"] Apr 17 16:38:56.725961 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.725927 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0f143604-a24b-451b-8dff-83c446e9941f-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7" Apr 17 16:38:56.726129 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.725968 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0f143604-a24b-451b-8dff-83c446e9941f-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7" Apr 17 16:38:56.726129 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.726044 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/0f143604-a24b-451b-8dff-83c446e9941f-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7" Apr 17 16:38:56.726129 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.726088 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0f143604-a24b-451b-8dff-83c446e9941f-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7" Apr 17 16:38:56.726129 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.726110 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0f143604-a24b-451b-8dff-83c446e9941f-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7" Apr 17 16:38:56.726129 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.726126 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0f143604-a24b-451b-8dff-83c446e9941f-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7" Apr 17 16:38:56.726315 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.726162 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: 
\"kubernetes.io/downward-api/0f143604-a24b-451b-8dff-83c446e9941f-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7" Apr 17 16:38:56.726315 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.726185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0f143604-a24b-451b-8dff-83c446e9941f-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7" Apr 17 16:38:56.726315 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.726211 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxlws\" (UniqueName: \"kubernetes.io/projected/0f143604-a24b-451b-8dff-83c446e9941f-kube-api-access-rxlws\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7" Apr 17 16:38:56.826947 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.826913 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0f143604-a24b-451b-8dff-83c446e9941f-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7" Apr 17 16:38:56.826947 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.826952 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/0f143604-a24b-451b-8dff-83c446e9941f-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:38:56.827192 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.826971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0f143604-a24b-451b-8dff-83c446e9941f-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:38:56.827192 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.826994 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0f143604-a24b-451b-8dff-83c446e9941f-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:38:56.827192 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.827014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0f143604-a24b-451b-8dff-83c446e9941f-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:38:56.827192 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.827173 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxlws\" (UniqueName: \"kubernetes.io/projected/0f143604-a24b-451b-8dff-83c446e9941f-kube-api-access-rxlws\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:38:56.827349 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.827265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0f143604-a24b-451b-8dff-83c446e9941f-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:38:56.827349 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.827310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0f143604-a24b-451b-8dff-83c446e9941f-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:38:56.827435 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.827360 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0f143604-a24b-451b-8dff-83c446e9941f-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:38:56.827435 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.827388 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0f143604-a24b-451b-8dff-83c446e9941f-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:38:56.827564 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.827309 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0f143604-a24b-451b-8dff-83c446e9941f-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:38:56.827681 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.827658 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0f143604-a24b-451b-8dff-83c446e9941f-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:38:56.827745 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.827658 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0f143604-a24b-451b-8dff-83c446e9941f-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:38:56.827951 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.827932 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0f143604-a24b-451b-8dff-83c446e9941f-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:38:56.829448 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.829419 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0f143604-a24b-451b-8dff-83c446e9941f-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:38:56.829580 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.829553 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0f143604-a24b-451b-8dff-83c446e9941f-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:38:56.836511 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.836443 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxlws\" (UniqueName: \"kubernetes.io/projected/0f143604-a24b-451b-8dff-83c446e9941f-kube-api-access-rxlws\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:38:56.836900 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.836875 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0f143604-a24b-451b-8dff-83c446e9941f-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-rzkt7\" (UID: \"0f143604-a24b-451b-8dff-83c446e9941f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:38:56.918598 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:56.918562 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:38:57.045613 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:57.045439 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"]
Apr 17 16:38:57.048776 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:38:57.048743 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f143604_a24b_451b_8dff_83c446e9941f.slice/crio-ad93385d8a28c2fc66d7fff60a51d69bb9f8a5cf22385178319510500ccaf676 WatchSource:0}: Error finding container ad93385d8a28c2fc66d7fff60a51d69bb9f8a5cf22385178319510500ccaf676: Status 404 returned error can't find the container with id ad93385d8a28c2fc66d7fff60a51d69bb9f8a5cf22385178319510500ccaf676
Apr 17 16:38:57.575473 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:38:57.575435 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7" event={"ID":"0f143604-a24b-451b-8dff-83c446e9941f","Type":"ContainerStarted","Data":"ad93385d8a28c2fc66d7fff60a51d69bb9f8a5cf22385178319510500ccaf676"}
Apr 17 16:39:13.285391 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:13.285355 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 16:39:13.285699 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:13.285426 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 16:39:13.285699 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:13.285456 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 16:39:13.631491 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:13.631409 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7" event={"ID":"0f143604-a24b-451b-8dff-83c446e9941f","Type":"ContainerStarted","Data":"72602d0d90145f1e393143eed349a105a1c80cc748d7ba7787bda89ad7f8e2db"}
Apr 17 16:39:13.661961 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:13.661893 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7" podStartSLOduration=1.427468157 podStartE2EDuration="17.661873732s" podCreationTimestamp="2026-04-17 16:38:56 +0000 UTC" firstStartedPulling="2026-04-17 16:38:57.050685104 +0000 UTC m=+456.515867948" lastFinishedPulling="2026-04-17 16:39:13.285090665 +0000 UTC m=+472.750273523" observedRunningTime="2026-04-17 16:39:13.654111 +0000 UTC m=+473.119293871" watchObservedRunningTime="2026-04-17 16:39:13.661873732 +0000 UTC m=+473.127056593"
Apr 17 16:39:13.919341 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:13.919309 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:39:14.923304 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:14.923274 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:39:15.638411 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:15.638380 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:39:15.639258 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:15.639240 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-rzkt7"
Apr 17 16:39:37.431788 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:37.431752 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-wzmd5"]
Apr 17 16:39:37.434107 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:37.434087 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-wzmd5"
Apr 17 16:39:37.436700 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:37.436679 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 17 16:39:37.436794 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:37.436679 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 17 16:39:37.437285 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:37.437268 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-qcrlv\""
Apr 17 16:39:37.448581 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:37.448552 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-wzmd5"]
Apr 17 16:39:37.479015 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:37.478981 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmdz7\" (UniqueName: \"kubernetes.io/projected/54d5dc22-bd5b-4ebf-a65b-b25604ea654b-kube-api-access-rmdz7\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-wzmd5\" (UID: \"54d5dc22-bd5b-4ebf-a65b-b25604ea654b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-wzmd5"
Apr 17 16:39:37.479173 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:37.479078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/54d5dc22-bd5b-4ebf-a65b-b25604ea654b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-wzmd5\" (UID: \"54d5dc22-bd5b-4ebf-a65b-b25604ea654b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-wzmd5"
Apr 17 16:39:37.580211 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:37.580176 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmdz7\" (UniqueName: \"kubernetes.io/projected/54d5dc22-bd5b-4ebf-a65b-b25604ea654b-kube-api-access-rmdz7\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-wzmd5\" (UID: \"54d5dc22-bd5b-4ebf-a65b-b25604ea654b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-wzmd5"
Apr 17 16:39:37.580378 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:37.580254 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/54d5dc22-bd5b-4ebf-a65b-b25604ea654b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-wzmd5\" (UID: \"54d5dc22-bd5b-4ebf-a65b-b25604ea654b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-wzmd5"
Apr 17 16:39:37.580663 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:37.580645 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/54d5dc22-bd5b-4ebf-a65b-b25604ea654b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-wzmd5\" (UID: \"54d5dc22-bd5b-4ebf-a65b-b25604ea654b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-wzmd5"
Apr 17 16:39:37.588766 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:37.588742 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmdz7\" (UniqueName: \"kubernetes.io/projected/54d5dc22-bd5b-4ebf-a65b-b25604ea654b-kube-api-access-rmdz7\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-wzmd5\" (UID: \"54d5dc22-bd5b-4ebf-a65b-b25604ea654b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-wzmd5"
Apr 17 16:39:37.744800 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:37.744714 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-wzmd5"
Apr 17 16:39:37.896200 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:37.896164 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-wzmd5"]
Apr 17 16:39:37.900513 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:39:37.900464 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54d5dc22_bd5b_4ebf_a65b_b25604ea654b.slice/crio-c1f26a8002116b33217b4ceb02ef17c2b1ea83e453f1746af0021f2a7eb26dbe WatchSource:0}: Error finding container c1f26a8002116b33217b4ceb02ef17c2b1ea83e453f1746af0021f2a7eb26dbe: Status 404 returned error can't find the container with id c1f26a8002116b33217b4ceb02ef17c2b1ea83e453f1746af0021f2a7eb26dbe
Apr 17 16:39:38.709176 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:38.709132 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-wzmd5" event={"ID":"54d5dc22-bd5b-4ebf-a65b-b25604ea654b","Type":"ContainerStarted","Data":"c1f26a8002116b33217b4ceb02ef17c2b1ea83e453f1746af0021f2a7eb26dbe"}
Apr 17 16:39:42.725525 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:42.725472 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-wzmd5" event={"ID":"54d5dc22-bd5b-4ebf-a65b-b25604ea654b","Type":"ContainerStarted","Data":"eca35ab4ff45c8caac5dceab267a4cee4e2ea448f6b6c5c425aab20c9b80a82b"}
Apr 17 16:39:42.725921 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:42.725576 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-wzmd5"
Apr 17 16:39:42.748976 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:42.748924 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-wzmd5" podStartSLOduration=1.787686449 podStartE2EDuration="5.748911122s" podCreationTimestamp="2026-04-17 16:39:37 +0000 UTC" firstStartedPulling="2026-04-17 16:39:37.902876326 +0000 UTC m=+497.368059166" lastFinishedPulling="2026-04-17 16:39:41.864100988 +0000 UTC m=+501.329283839" observedRunningTime="2026-04-17 16:39:42.746718588 +0000 UTC m=+502.211901451" watchObservedRunningTime="2026-04-17 16:39:42.748911122 +0000 UTC m=+502.214093982"
Apr 17 16:39:53.731933 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:39:53.731903 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-wzmd5"
Apr 17 16:40:29.371300 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:29.371217 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-ff9c5"]
Apr 17 16:40:29.379132 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:29.379096 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-ff9c5"
Apr 17 16:40:29.384019 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:29.382430 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-cld7r\""
Apr 17 16:40:29.387643 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:29.387614 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-ff9c5"]
Apr 17 16:40:29.432713 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:29.432674 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2mht\" (UniqueName: \"kubernetes.io/projected/bc40074f-6c9b-44a1-b2ba-8f9733a9d881-kube-api-access-w2mht\") pod \"authorino-674b59b84c-ff9c5\" (UID: \"bc40074f-6c9b-44a1-b2ba-8f9733a9d881\") " pod="kuadrant-system/authorino-674b59b84c-ff9c5"
Apr 17 16:40:29.533708 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:29.533667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2mht\" (UniqueName: \"kubernetes.io/projected/bc40074f-6c9b-44a1-b2ba-8f9733a9d881-kube-api-access-w2mht\") pod \"authorino-674b59b84c-ff9c5\" (UID: \"bc40074f-6c9b-44a1-b2ba-8f9733a9d881\") " pod="kuadrant-system/authorino-674b59b84c-ff9c5"
Apr 17 16:40:29.542119 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:29.542088 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2mht\" (UniqueName: \"kubernetes.io/projected/bc40074f-6c9b-44a1-b2ba-8f9733a9d881-kube-api-access-w2mht\") pod \"authorino-674b59b84c-ff9c5\" (UID: \"bc40074f-6c9b-44a1-b2ba-8f9733a9d881\") " pod="kuadrant-system/authorino-674b59b84c-ff9c5"
Apr 17 16:40:29.588391 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:29.588341 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-nbxw5"]
Apr 17 16:40:29.591922 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:29.591906 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-nbxw5"
Apr 17 16:40:29.597252 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:29.597229 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-nbxw5"]
Apr 17 16:40:29.635109 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:29.635036 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xglmp\" (UniqueName: \"kubernetes.io/projected/c7153d50-6e58-4e74-9713-04b0b8c9e6cf-kube-api-access-xglmp\") pod \"authorino-79cbc94b89-nbxw5\" (UID: \"c7153d50-6e58-4e74-9713-04b0b8c9e6cf\") " pod="kuadrant-system/authorino-79cbc94b89-nbxw5"
Apr 17 16:40:29.705612 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:29.705575 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-ff9c5"
Apr 17 16:40:29.735917 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:29.735883 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xglmp\" (UniqueName: \"kubernetes.io/projected/c7153d50-6e58-4e74-9713-04b0b8c9e6cf-kube-api-access-xglmp\") pod \"authorino-79cbc94b89-nbxw5\" (UID: \"c7153d50-6e58-4e74-9713-04b0b8c9e6cf\") " pod="kuadrant-system/authorino-79cbc94b89-nbxw5"
Apr 17 16:40:29.743388 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:29.743361 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xglmp\" (UniqueName: \"kubernetes.io/projected/c7153d50-6e58-4e74-9713-04b0b8c9e6cf-kube-api-access-xglmp\") pod \"authorino-79cbc94b89-nbxw5\" (UID: \"c7153d50-6e58-4e74-9713-04b0b8c9e6cf\") " pod="kuadrant-system/authorino-79cbc94b89-nbxw5"
Apr 17 16:40:29.847630 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:29.847603 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-ff9c5"]
Apr 17 16:40:29.849988 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:40:29.849961 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc40074f_6c9b_44a1_b2ba_8f9733a9d881.slice/crio-ed1175fa61d41cd094c2dcb7294536fcdcc015ea0c3492ab0d4d7421b041abf6 WatchSource:0}: Error finding container ed1175fa61d41cd094c2dcb7294536fcdcc015ea0c3492ab0d4d7421b041abf6: Status 404 returned error can't find the container with id ed1175fa61d41cd094c2dcb7294536fcdcc015ea0c3492ab0d4d7421b041abf6
Apr 17 16:40:29.877342 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:29.877310 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-ff9c5" event={"ID":"bc40074f-6c9b-44a1-b2ba-8f9733a9d881","Type":"ContainerStarted","Data":"ed1175fa61d41cd094c2dcb7294536fcdcc015ea0c3492ab0d4d7421b041abf6"}
Apr 17 16:40:29.901547 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:29.901515 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-nbxw5"
Apr 17 16:40:30.019481 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:30.019455 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-nbxw5"]
Apr 17 16:40:30.021709 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:40:30.021681 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7153d50_6e58_4e74_9713_04b0b8c9e6cf.slice/crio-7e8645430a55ee12460094e51557b7c27c862e8b8006f5a8298c460450854937 WatchSource:0}: Error finding container 7e8645430a55ee12460094e51557b7c27c862e8b8006f5a8298c460450854937: Status 404 returned error can't find the container with id 7e8645430a55ee12460094e51557b7c27c862e8b8006f5a8298c460450854937
Apr 17 16:40:30.883458 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:30.883418 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-nbxw5" event={"ID":"c7153d50-6e58-4e74-9713-04b0b8c9e6cf","Type":"ContainerStarted","Data":"7e8645430a55ee12460094e51557b7c27c862e8b8006f5a8298c460450854937"}
Apr 17 16:40:31.887918 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:31.887812 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-ff9c5" event={"ID":"bc40074f-6c9b-44a1-b2ba-8f9733a9d881","Type":"ContainerStarted","Data":"da4b73b2a99dbf4936eef52808d037095ee259362a2aaa2a9013d929f0d4d8ee"}
Apr 17 16:40:31.889126 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:31.889104 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-nbxw5" event={"ID":"c7153d50-6e58-4e74-9713-04b0b8c9e6cf","Type":"ContainerStarted","Data":"d67831e8a6a32ed321b6f42ea5402ebb85d77e20c2de098d32486183d4a49e02"}
Apr 17 16:40:31.903331 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:31.903279 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-ff9c5" podStartSLOduration=1.200882696 podStartE2EDuration="2.903265073s" podCreationTimestamp="2026-04-17 16:40:29 +0000 UTC" firstStartedPulling="2026-04-17 16:40:29.851591529 +0000 UTC m=+549.316774369" lastFinishedPulling="2026-04-17 16:40:31.553973902 +0000 UTC m=+551.019156746" observedRunningTime="2026-04-17 16:40:31.901186126 +0000 UTC m=+551.366368987" watchObservedRunningTime="2026-04-17 16:40:31.903265073 +0000 UTC m=+551.368447934"
Apr 17 16:40:31.916618 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:31.916568 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-nbxw5" podStartSLOduration=1.389784501 podStartE2EDuration="2.916550499s" podCreationTimestamp="2026-04-17 16:40:29 +0000 UTC" firstStartedPulling="2026-04-17 16:40:30.022907812 +0000 UTC m=+549.488090650" lastFinishedPulling="2026-04-17 16:40:31.549673798 +0000 UTC m=+551.014856648" observedRunningTime="2026-04-17 16:40:31.914518821 +0000 UTC m=+551.379701684" watchObservedRunningTime="2026-04-17 16:40:31.916550499 +0000 UTC m=+551.381733362"
Apr 17 16:40:31.939264 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:31.939229 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-ff9c5"]
Apr 17 16:40:33.895832 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:33.895795 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-ff9c5" podUID="bc40074f-6c9b-44a1-b2ba-8f9733a9d881" containerName="authorino" containerID="cri-o://da4b73b2a99dbf4936eef52808d037095ee259362a2aaa2a9013d929f0d4d8ee" gracePeriod=30
Apr 17 16:40:34.133520 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:34.133471 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-ff9c5"
Apr 17 16:40:34.178638 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:34.178604 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2mht\" (UniqueName: \"kubernetes.io/projected/bc40074f-6c9b-44a1-b2ba-8f9733a9d881-kube-api-access-w2mht\") pod \"bc40074f-6c9b-44a1-b2ba-8f9733a9d881\" (UID: \"bc40074f-6c9b-44a1-b2ba-8f9733a9d881\") "
Apr 17 16:40:34.180810 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:34.180769 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc40074f-6c9b-44a1-b2ba-8f9733a9d881-kube-api-access-w2mht" (OuterVolumeSpecName: "kube-api-access-w2mht") pod "bc40074f-6c9b-44a1-b2ba-8f9733a9d881" (UID: "bc40074f-6c9b-44a1-b2ba-8f9733a9d881"). InnerVolumeSpecName "kube-api-access-w2mht". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:40:34.279858 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:34.279826 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w2mht\" (UniqueName: \"kubernetes.io/projected/bc40074f-6c9b-44a1-b2ba-8f9733a9d881-kube-api-access-w2mht\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:40:34.900183 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:34.900140 2576 generic.go:358] "Generic (PLEG): container finished" podID="bc40074f-6c9b-44a1-b2ba-8f9733a9d881" containerID="da4b73b2a99dbf4936eef52808d037095ee259362a2aaa2a9013d929f0d4d8ee" exitCode=0
Apr 17 16:40:34.900615 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:34.900192 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-ff9c5"
Apr 17 16:40:34.900615 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:34.900227 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-ff9c5" event={"ID":"bc40074f-6c9b-44a1-b2ba-8f9733a9d881","Type":"ContainerDied","Data":"da4b73b2a99dbf4936eef52808d037095ee259362a2aaa2a9013d929f0d4d8ee"}
Apr 17 16:40:34.900615 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:34.900269 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-ff9c5" event={"ID":"bc40074f-6c9b-44a1-b2ba-8f9733a9d881","Type":"ContainerDied","Data":"ed1175fa61d41cd094c2dcb7294536fcdcc015ea0c3492ab0d4d7421b041abf6"}
Apr 17 16:40:34.900615 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:34.900288 2576 scope.go:117] "RemoveContainer" containerID="da4b73b2a99dbf4936eef52808d037095ee259362a2aaa2a9013d929f0d4d8ee"
Apr 17 16:40:34.908722 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:34.908701 2576 scope.go:117] "RemoveContainer" containerID="da4b73b2a99dbf4936eef52808d037095ee259362a2aaa2a9013d929f0d4d8ee"
Apr 17 16:40:34.908957 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:40:34.908937 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da4b73b2a99dbf4936eef52808d037095ee259362a2aaa2a9013d929f0d4d8ee\": container with ID starting with da4b73b2a99dbf4936eef52808d037095ee259362a2aaa2a9013d929f0d4d8ee not found: ID does not exist" containerID="da4b73b2a99dbf4936eef52808d037095ee259362a2aaa2a9013d929f0d4d8ee"
Apr 17 16:40:34.909012 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:34.908968 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da4b73b2a99dbf4936eef52808d037095ee259362a2aaa2a9013d929f0d4d8ee"} err="failed to get container status \"da4b73b2a99dbf4936eef52808d037095ee259362a2aaa2a9013d929f0d4d8ee\": rpc error: code = NotFound desc = could not find container \"da4b73b2a99dbf4936eef52808d037095ee259362a2aaa2a9013d929f0d4d8ee\": container with ID starting with da4b73b2a99dbf4936eef52808d037095ee259362a2aaa2a9013d929f0d4d8ee not found: ID does not exist"
Apr 17 16:40:34.921245 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:34.921217 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-ff9c5"]
Apr 17 16:40:34.925010 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:34.924982 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-ff9c5"]
Apr 17 16:40:35.142303 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:35.142271 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc40074f-6c9b-44a1-b2ba-8f9733a9d881" path="/var/lib/kubelet/pods/bc40074f-6c9b-44a1-b2ba-8f9733a9d881/volumes"
Apr 17 16:40:54.337056 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:54.337018 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-nbxw5"]
Apr 17 16:40:54.337783 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:54.337747 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-nbxw5" podUID="c7153d50-6e58-4e74-9713-04b0b8c9e6cf" containerName="authorino" containerID="cri-o://d67831e8a6a32ed321b6f42ea5402ebb85d77e20c2de098d32486183d4a49e02" gracePeriod=30
Apr 17 16:40:54.577065 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:54.577042 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-nbxw5"
Apr 17 16:40:54.762701 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:54.762666 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xglmp\" (UniqueName: \"kubernetes.io/projected/c7153d50-6e58-4e74-9713-04b0b8c9e6cf-kube-api-access-xglmp\") pod \"c7153d50-6e58-4e74-9713-04b0b8c9e6cf\" (UID: \"c7153d50-6e58-4e74-9713-04b0b8c9e6cf\") "
Apr 17 16:40:54.764860 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:54.764832 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7153d50-6e58-4e74-9713-04b0b8c9e6cf-kube-api-access-xglmp" (OuterVolumeSpecName: "kube-api-access-xglmp") pod "c7153d50-6e58-4e74-9713-04b0b8c9e6cf" (UID: "c7153d50-6e58-4e74-9713-04b0b8c9e6cf"). InnerVolumeSpecName "kube-api-access-xglmp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:40:54.863804 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:54.863770 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xglmp\" (UniqueName: \"kubernetes.io/projected/c7153d50-6e58-4e74-9713-04b0b8c9e6cf-kube-api-access-xglmp\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:40:54.971420 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:54.971387 2576 generic.go:358] "Generic (PLEG): container finished" podID="c7153d50-6e58-4e74-9713-04b0b8c9e6cf" containerID="d67831e8a6a32ed321b6f42ea5402ebb85d77e20c2de098d32486183d4a49e02" exitCode=0
Apr 17 16:40:54.971604 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:54.971433 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-nbxw5"
Apr 17 16:40:54.971604 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:54.971470 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-nbxw5" event={"ID":"c7153d50-6e58-4e74-9713-04b0b8c9e6cf","Type":"ContainerDied","Data":"d67831e8a6a32ed321b6f42ea5402ebb85d77e20c2de098d32486183d4a49e02"}
Apr 17 16:40:54.971604 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:54.971521 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-nbxw5" event={"ID":"c7153d50-6e58-4e74-9713-04b0b8c9e6cf","Type":"ContainerDied","Data":"7e8645430a55ee12460094e51557b7c27c862e8b8006f5a8298c460450854937"}
Apr 17 16:40:54.971604 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:54.971542 2576 scope.go:117] "RemoveContainer" containerID="d67831e8a6a32ed321b6f42ea5402ebb85d77e20c2de098d32486183d4a49e02"
Apr 17 16:40:54.979971 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:54.979948 2576 scope.go:117] "RemoveContainer" containerID="d67831e8a6a32ed321b6f42ea5402ebb85d77e20c2de098d32486183d4a49e02"
Apr 17 16:40:54.980201 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:40:54.980184 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d67831e8a6a32ed321b6f42ea5402ebb85d77e20c2de098d32486183d4a49e02\": container with ID starting with d67831e8a6a32ed321b6f42ea5402ebb85d77e20c2de098d32486183d4a49e02 not found: ID does not exist" containerID="d67831e8a6a32ed321b6f42ea5402ebb85d77e20c2de098d32486183d4a49e02"
Apr 17 16:40:54.980250 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:54.980212 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d67831e8a6a32ed321b6f42ea5402ebb85d77e20c2de098d32486183d4a49e02"} err="failed to get container status \"d67831e8a6a32ed321b6f42ea5402ebb85d77e20c2de098d32486183d4a49e02\": rpc error: code = NotFound desc = could not find container \"d67831e8a6a32ed321b6f42ea5402ebb85d77e20c2de098d32486183d4a49e02\": container with ID starting with d67831e8a6a32ed321b6f42ea5402ebb85d77e20c2de098d32486183d4a49e02 not found: ID does not exist"
Apr 17 16:40:54.991391 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:54.991368 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-nbxw5"]
Apr 17 16:40:55.000387 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:55.000366 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-nbxw5"]
Apr 17 16:40:55.142827 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:40:55.142750 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7153d50-6e58-4e74-9713-04b0b8c9e6cf" path="/var/lib/kubelet/pods/c7153d50-6e58-4e74-9713-04b0b8c9e6cf/volumes"
Apr 17 16:41:10.368989 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.368955 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-2tzmq"]
Apr 17 16:41:10.369448 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.369353 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc40074f-6c9b-44a1-b2ba-8f9733a9d881" containerName="authorino"
Apr 17 16:41:10.369448 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.369365 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc40074f-6c9b-44a1-b2ba-8f9733a9d881" containerName="authorino"
Apr 17 16:41:10.369448 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.369381 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7153d50-6e58-4e74-9713-04b0b8c9e6cf" containerName="authorino"
Apr 17 16:41:10.369448 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.369386 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7153d50-6e58-4e74-9713-04b0b8c9e6cf" containerName="authorino"
Apr 17 16:41:10.369448 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.369441 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc40074f-6c9b-44a1-b2ba-8f9733a9d881" containerName="authorino"
Apr 17 16:41:10.369622 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.369453 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7153d50-6e58-4e74-9713-04b0b8c9e6cf" containerName="authorino"
Apr 17 16:41:10.372219 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.372203 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-85bb65f8c4-2tzmq"
Apr 17 16:41:10.374757 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.374725 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 17 16:41:10.374889 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.374813 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 17 16:41:10.374940 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.374900 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-bhr4j\""
Apr 17 16:41:10.374989 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.374966 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 17 16:41:10.383818 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.383793 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-2tzmq"]
Apr 17 16:41:10.403927 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.403889 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afbd20eb-5bb2-409b-91b8-819a55c6ab1b-cert\") pod \"kserve-controller-manager-85bb65f8c4-2tzmq\" (UID: \"afbd20eb-5bb2-409b-91b8-819a55c6ab1b\") "
pod="kserve/kserve-controller-manager-85bb65f8c4-2tzmq" Apr 17 16:41:10.404102 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.403945 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6stn\" (UniqueName: \"kubernetes.io/projected/afbd20eb-5bb2-409b-91b8-819a55c6ab1b-kube-api-access-c6stn\") pod \"kserve-controller-manager-85bb65f8c4-2tzmq\" (UID: \"afbd20eb-5bb2-409b-91b8-819a55c6ab1b\") " pod="kserve/kserve-controller-manager-85bb65f8c4-2tzmq" Apr 17 16:41:10.423259 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.423217 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-fgpbh"] Apr 17 16:41:10.427129 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.427103 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-fgpbh" Apr 17 16:41:10.429635 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.429615 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 16:41:10.429738 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.429651 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-lwlxq\"" Apr 17 16:41:10.438528 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.438487 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-fgpbh"] Apr 17 16:41:10.504509 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.504453 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1b018a95-ab05-4f83-af69-a729e521004f-data\") pod \"seaweedfs-86cc847c5c-fgpbh\" (UID: \"1b018a95-ab05-4f83-af69-a729e521004f\") " pod="kserve/seaweedfs-86cc847c5c-fgpbh" Apr 17 16:41:10.504710 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.504519 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afbd20eb-5bb2-409b-91b8-819a55c6ab1b-cert\") pod \"kserve-controller-manager-85bb65f8c4-2tzmq\" (UID: \"afbd20eb-5bb2-409b-91b8-819a55c6ab1b\") " pod="kserve/kserve-controller-manager-85bb65f8c4-2tzmq" Apr 17 16:41:10.504710 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.504582 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6stn\" (UniqueName: \"kubernetes.io/projected/afbd20eb-5bb2-409b-91b8-819a55c6ab1b-kube-api-access-c6stn\") pod \"kserve-controller-manager-85bb65f8c4-2tzmq\" (UID: \"afbd20eb-5bb2-409b-91b8-819a55c6ab1b\") " pod="kserve/kserve-controller-manager-85bb65f8c4-2tzmq" Apr 17 16:41:10.504710 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:41:10.504619 2576 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 17 16:41:10.504710 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.504648 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c5w4\" (UniqueName: \"kubernetes.io/projected/1b018a95-ab05-4f83-af69-a729e521004f-kube-api-access-5c5w4\") pod \"seaweedfs-86cc847c5c-fgpbh\" (UID: \"1b018a95-ab05-4f83-af69-a729e521004f\") " pod="kserve/seaweedfs-86cc847c5c-fgpbh" Apr 17 16:41:10.504710 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:41:10.504676 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afbd20eb-5bb2-409b-91b8-819a55c6ab1b-cert podName:afbd20eb-5bb2-409b-91b8-819a55c6ab1b nodeName:}" failed. No retries permitted until 2026-04-17 16:41:11.004654745 +0000 UTC m=+590.469837584 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/afbd20eb-5bb2-409b-91b8-819a55c6ab1b-cert") pod "kserve-controller-manager-85bb65f8c4-2tzmq" (UID: "afbd20eb-5bb2-409b-91b8-819a55c6ab1b") : secret "kserve-webhook-server-cert" not found Apr 17 16:41:10.514182 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.514153 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6stn\" (UniqueName: \"kubernetes.io/projected/afbd20eb-5bb2-409b-91b8-819a55c6ab1b-kube-api-access-c6stn\") pod \"kserve-controller-manager-85bb65f8c4-2tzmq\" (UID: \"afbd20eb-5bb2-409b-91b8-819a55c6ab1b\") " pod="kserve/kserve-controller-manager-85bb65f8c4-2tzmq" Apr 17 16:41:10.606753 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.606131 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5c5w4\" (UniqueName: \"kubernetes.io/projected/1b018a95-ab05-4f83-af69-a729e521004f-kube-api-access-5c5w4\") pod \"seaweedfs-86cc847c5c-fgpbh\" (UID: \"1b018a95-ab05-4f83-af69-a729e521004f\") " pod="kserve/seaweedfs-86cc847c5c-fgpbh" Apr 17 16:41:10.606753 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.606221 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1b018a95-ab05-4f83-af69-a729e521004f-data\") pod \"seaweedfs-86cc847c5c-fgpbh\" (UID: \"1b018a95-ab05-4f83-af69-a729e521004f\") " pod="kserve/seaweedfs-86cc847c5c-fgpbh" Apr 17 16:41:10.606753 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.606706 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1b018a95-ab05-4f83-af69-a729e521004f-data\") pod \"seaweedfs-86cc847c5c-fgpbh\" (UID: \"1b018a95-ab05-4f83-af69-a729e521004f\") " pod="kserve/seaweedfs-86cc847c5c-fgpbh" Apr 17 16:41:10.616742 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.616717 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5c5w4\" (UniqueName: \"kubernetes.io/projected/1b018a95-ab05-4f83-af69-a729e521004f-kube-api-access-5c5w4\") pod \"seaweedfs-86cc847c5c-fgpbh\" (UID: \"1b018a95-ab05-4f83-af69-a729e521004f\") " pod="kserve/seaweedfs-86cc847c5c-fgpbh" Apr 17 16:41:10.736432 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.736382 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-fgpbh" Apr 17 16:41:10.857010 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:10.856943 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-fgpbh"] Apr 17 16:41:10.859721 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:41:10.859693 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b018a95_ab05_4f83_af69_a729e521004f.slice/crio-c58d3bf124f22350710f29fa13602bf61656a1ba971efcfba924926d5f091314 WatchSource:0}: Error finding container c58d3bf124f22350710f29fa13602bf61656a1ba971efcfba924926d5f091314: Status 404 returned error can't find the container with id c58d3bf124f22350710f29fa13602bf61656a1ba971efcfba924926d5f091314 Apr 17 16:41:11.010812 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:11.010709 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afbd20eb-5bb2-409b-91b8-819a55c6ab1b-cert\") pod \"kserve-controller-manager-85bb65f8c4-2tzmq\" (UID: \"afbd20eb-5bb2-409b-91b8-819a55c6ab1b\") " pod="kserve/kserve-controller-manager-85bb65f8c4-2tzmq" Apr 17 16:41:11.013061 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:11.013042 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afbd20eb-5bb2-409b-91b8-819a55c6ab1b-cert\") pod \"kserve-controller-manager-85bb65f8c4-2tzmq\" (UID: \"afbd20eb-5bb2-409b-91b8-819a55c6ab1b\") " 
pod="kserve/kserve-controller-manager-85bb65f8c4-2tzmq" Apr 17 16:41:11.026930 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:11.026889 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-fgpbh" event={"ID":"1b018a95-ab05-4f83-af69-a729e521004f","Type":"ContainerStarted","Data":"c58d3bf124f22350710f29fa13602bf61656a1ba971efcfba924926d5f091314"} Apr 17 16:41:11.283292 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:11.283217 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-85bb65f8c4-2tzmq" Apr 17 16:41:11.514860 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:11.514827 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-2tzmq"] Apr 17 16:41:11.517443 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:41:11.517407 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafbd20eb_5bb2_409b_91b8_819a55c6ab1b.slice/crio-73987f0a8331b9c04f98a53d2a11735f802d8f8bc2295e128f4d7037d0b104d7 WatchSource:0}: Error finding container 73987f0a8331b9c04f98a53d2a11735f802d8f8bc2295e128f4d7037d0b104d7: Status 404 returned error can't find the container with id 73987f0a8331b9c04f98a53d2a11735f802d8f8bc2295e128f4d7037d0b104d7 Apr 17 16:41:12.034075 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:12.033998 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85bb65f8c4-2tzmq" event={"ID":"afbd20eb-5bb2-409b-91b8-819a55c6ab1b","Type":"ContainerStarted","Data":"73987f0a8331b9c04f98a53d2a11735f802d8f8bc2295e128f4d7037d0b104d7"} Apr 17 16:41:15.047375 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:15.047292 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85bb65f8c4-2tzmq" 
event={"ID":"afbd20eb-5bb2-409b-91b8-819a55c6ab1b","Type":"ContainerStarted","Data":"326d96081b92f7e7f3a60e5fbe4df7b837a45b0f5ca41fc6dc5c8644c81fbf52"} Apr 17 16:41:15.047777 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:15.047410 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-85bb65f8c4-2tzmq" Apr 17 16:41:15.048803 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:15.048777 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-fgpbh" event={"ID":"1b018a95-ab05-4f83-af69-a729e521004f","Type":"ContainerStarted","Data":"26c4ddb38c427b5bb226b2c67e54bdb60bcf6453ded2f20e3d53427a281a6fc7"} Apr 17 16:41:15.048971 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:15.048953 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-fgpbh" Apr 17 16:41:15.064707 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:15.064656 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-85bb65f8c4-2tzmq" podStartSLOduration=1.7204991330000001 podStartE2EDuration="5.064643457s" podCreationTimestamp="2026-04-17 16:41:10 +0000 UTC" firstStartedPulling="2026-04-17 16:41:11.519177214 +0000 UTC m=+590.984360059" lastFinishedPulling="2026-04-17 16:41:14.863321345 +0000 UTC m=+594.328504383" observedRunningTime="2026-04-17 16:41:15.062806817 +0000 UTC m=+594.527989672" watchObservedRunningTime="2026-04-17 16:41:15.064643457 +0000 UTC m=+594.529826318" Apr 17 16:41:15.080824 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:15.080773 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-fgpbh" podStartSLOduration=1.019494029 podStartE2EDuration="5.080756772s" podCreationTimestamp="2026-04-17 16:41:10 +0000 UTC" firstStartedPulling="2026-04-17 16:41:10.861109656 +0000 UTC m=+590.326292507" lastFinishedPulling="2026-04-17 
16:41:14.922372397 +0000 UTC m=+594.387555250" observedRunningTime="2026-04-17 16:41:15.079245698 +0000 UTC m=+594.544428561" watchObservedRunningTime="2026-04-17 16:41:15.080756772 +0000 UTC m=+594.545939630" Apr 17 16:41:21.054129 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:21.054096 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-fgpbh" Apr 17 16:41:21.069484 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:21.069463 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/ovn-acl-logging/0.log" Apr 17 16:41:21.069961 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:21.069944 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/ovn-acl-logging/0.log" Apr 17 16:41:46.057431 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:46.057394 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-85bb65f8c4-2tzmq" Apr 17 16:41:47.658592 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:47.658553 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-2tzmq"] Apr 17 16:41:47.659002 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:47.658786 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-85bb65f8c4-2tzmq" podUID="afbd20eb-5bb2-409b-91b8-819a55c6ab1b" containerName="manager" containerID="cri-o://326d96081b92f7e7f3a60e5fbe4df7b837a45b0f5ca41fc6dc5c8644c81fbf52" gracePeriod=10 Apr 17 16:41:47.685672 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:47.685645 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-pxv2p"] Apr 17 16:41:47.689240 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:47.689223 2576 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-85bb65f8c4-pxv2p" Apr 17 16:41:47.695192 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:47.695168 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-pxv2p"] Apr 17 16:41:47.844627 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:47.844597 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl787\" (UniqueName: \"kubernetes.io/projected/1e07626f-ef69-4fac-934a-f33712143ec4-kube-api-access-vl787\") pod \"kserve-controller-manager-85bb65f8c4-pxv2p\" (UID: \"1e07626f-ef69-4fac-934a-f33712143ec4\") " pod="kserve/kserve-controller-manager-85bb65f8c4-pxv2p" Apr 17 16:41:47.844791 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:47.844647 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1e07626f-ef69-4fac-934a-f33712143ec4-cert\") pod \"kserve-controller-manager-85bb65f8c4-pxv2p\" (UID: \"1e07626f-ef69-4fac-934a-f33712143ec4\") " pod="kserve/kserve-controller-manager-85bb65f8c4-pxv2p" Apr 17 16:41:47.896708 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:47.896685 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85bb65f8c4-2tzmq" Apr 17 16:41:47.945690 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:47.945606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vl787\" (UniqueName: \"kubernetes.io/projected/1e07626f-ef69-4fac-934a-f33712143ec4-kube-api-access-vl787\") pod \"kserve-controller-manager-85bb65f8c4-pxv2p\" (UID: \"1e07626f-ef69-4fac-934a-f33712143ec4\") " pod="kserve/kserve-controller-manager-85bb65f8c4-pxv2p" Apr 17 16:41:47.945690 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:47.945663 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1e07626f-ef69-4fac-934a-f33712143ec4-cert\") pod \"kserve-controller-manager-85bb65f8c4-pxv2p\" (UID: \"1e07626f-ef69-4fac-934a-f33712143ec4\") " pod="kserve/kserve-controller-manager-85bb65f8c4-pxv2p" Apr 17 16:41:47.948011 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:47.947987 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1e07626f-ef69-4fac-934a-f33712143ec4-cert\") pod \"kserve-controller-manager-85bb65f8c4-pxv2p\" (UID: \"1e07626f-ef69-4fac-934a-f33712143ec4\") " pod="kserve/kserve-controller-manager-85bb65f8c4-pxv2p" Apr 17 16:41:47.954003 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:47.953976 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl787\" (UniqueName: \"kubernetes.io/projected/1e07626f-ef69-4fac-934a-f33712143ec4-kube-api-access-vl787\") pod \"kserve-controller-manager-85bb65f8c4-pxv2p\" (UID: \"1e07626f-ef69-4fac-934a-f33712143ec4\") " pod="kserve/kserve-controller-manager-85bb65f8c4-pxv2p" Apr 17 16:41:48.036727 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:48.036691 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85bb65f8c4-pxv2p" Apr 17 16:41:48.046658 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:48.046633 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6stn\" (UniqueName: \"kubernetes.io/projected/afbd20eb-5bb2-409b-91b8-819a55c6ab1b-kube-api-access-c6stn\") pod \"afbd20eb-5bb2-409b-91b8-819a55c6ab1b\" (UID: \"afbd20eb-5bb2-409b-91b8-819a55c6ab1b\") " Apr 17 16:41:48.046760 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:48.046727 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afbd20eb-5bb2-409b-91b8-819a55c6ab1b-cert\") pod \"afbd20eb-5bb2-409b-91b8-819a55c6ab1b\" (UID: \"afbd20eb-5bb2-409b-91b8-819a55c6ab1b\") " Apr 17 16:41:48.048785 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:48.048758 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afbd20eb-5bb2-409b-91b8-819a55c6ab1b-cert" (OuterVolumeSpecName: "cert") pod "afbd20eb-5bb2-409b-91b8-819a55c6ab1b" (UID: "afbd20eb-5bb2-409b-91b8-819a55c6ab1b"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:41:48.049190 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:48.049156 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afbd20eb-5bb2-409b-91b8-819a55c6ab1b-kube-api-access-c6stn" (OuterVolumeSpecName: "kube-api-access-c6stn") pod "afbd20eb-5bb2-409b-91b8-819a55c6ab1b" (UID: "afbd20eb-5bb2-409b-91b8-819a55c6ab1b"). InnerVolumeSpecName "kube-api-access-c6stn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:41:48.148063 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:48.148033 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c6stn\" (UniqueName: \"kubernetes.io/projected/afbd20eb-5bb2-409b-91b8-819a55c6ab1b-kube-api-access-c6stn\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:41:48.148063 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:48.148063 2576 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afbd20eb-5bb2-409b-91b8-819a55c6ab1b-cert\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:41:48.158840 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:48.158814 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-pxv2p"] Apr 17 16:41:48.160876 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:41:48.160847 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e07626f_ef69_4fac_934a_f33712143ec4.slice/crio-4a96ed36433a66fa3a0b04613c94110dc8820839f2d19571ebb79b5acec40e6b WatchSource:0}: Error finding container 4a96ed36433a66fa3a0b04613c94110dc8820839f2d19571ebb79b5acec40e6b: Status 404 returned error can't find the container with id 4a96ed36433a66fa3a0b04613c94110dc8820839f2d19571ebb79b5acec40e6b Apr 17 16:41:48.165189 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:48.165164 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85bb65f8c4-pxv2p" event={"ID":"1e07626f-ef69-4fac-934a-f33712143ec4","Type":"ContainerStarted","Data":"4a96ed36433a66fa3a0b04613c94110dc8820839f2d19571ebb79b5acec40e6b"} Apr 17 16:41:48.166336 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:48.166312 2576 generic.go:358] "Generic (PLEG): container finished" podID="afbd20eb-5bb2-409b-91b8-819a55c6ab1b" 
containerID="326d96081b92f7e7f3a60e5fbe4df7b837a45b0f5ca41fc6dc5c8644c81fbf52" exitCode=0 Apr 17 16:41:48.166439 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:48.166349 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85bb65f8c4-2tzmq" event={"ID":"afbd20eb-5bb2-409b-91b8-819a55c6ab1b","Type":"ContainerDied","Data":"326d96081b92f7e7f3a60e5fbe4df7b837a45b0f5ca41fc6dc5c8644c81fbf52"} Apr 17 16:41:48.166439 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:48.166368 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85bb65f8c4-2tzmq" event={"ID":"afbd20eb-5bb2-409b-91b8-819a55c6ab1b","Type":"ContainerDied","Data":"73987f0a8331b9c04f98a53d2a11735f802d8f8bc2295e128f4d7037d0b104d7"} Apr 17 16:41:48.166439 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:48.166378 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-85bb65f8c4-2tzmq" Apr 17 16:41:48.166439 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:48.166389 2576 scope.go:117] "RemoveContainer" containerID="326d96081b92f7e7f3a60e5fbe4df7b837a45b0f5ca41fc6dc5c8644c81fbf52" Apr 17 16:41:48.175232 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:48.175218 2576 scope.go:117] "RemoveContainer" containerID="326d96081b92f7e7f3a60e5fbe4df7b837a45b0f5ca41fc6dc5c8644c81fbf52" Apr 17 16:41:48.175574 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:41:48.175484 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"326d96081b92f7e7f3a60e5fbe4df7b837a45b0f5ca41fc6dc5c8644c81fbf52\": container with ID starting with 326d96081b92f7e7f3a60e5fbe4df7b837a45b0f5ca41fc6dc5c8644c81fbf52 not found: ID does not exist" containerID="326d96081b92f7e7f3a60e5fbe4df7b837a45b0f5ca41fc6dc5c8644c81fbf52" Apr 17 16:41:48.175574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:48.175548 2576 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"326d96081b92f7e7f3a60e5fbe4df7b837a45b0f5ca41fc6dc5c8644c81fbf52"} err="failed to get container status \"326d96081b92f7e7f3a60e5fbe4df7b837a45b0f5ca41fc6dc5c8644c81fbf52\": rpc error: code = NotFound desc = could not find container \"326d96081b92f7e7f3a60e5fbe4df7b837a45b0f5ca41fc6dc5c8644c81fbf52\": container with ID starting with 326d96081b92f7e7f3a60e5fbe4df7b837a45b0f5ca41fc6dc5c8644c81fbf52 not found: ID does not exist" Apr 17 16:41:48.188012 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:48.187988 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-2tzmq"] Apr 17 16:41:48.194029 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:48.194007 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-2tzmq"] Apr 17 16:41:49.143295 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:49.143260 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afbd20eb-5bb2-409b-91b8-819a55c6ab1b" path="/var/lib/kubelet/pods/afbd20eb-5bb2-409b-91b8-819a55c6ab1b/volumes" Apr 17 16:41:49.171984 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:49.171947 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85bb65f8c4-pxv2p" event={"ID":"1e07626f-ef69-4fac-934a-f33712143ec4","Type":"ContainerStarted","Data":"665c2856f942bf3be13b1a8b5b0fb2e9637c89fad4f3ee3bf829e2dd212e072f"} Apr 17 16:41:49.172153 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:49.172007 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-85bb65f8c4-pxv2p" Apr 17 16:41:49.188269 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:41:49.188215 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-85bb65f8c4-pxv2p" podStartSLOduration=1.869153571 podStartE2EDuration="2.1882013s" 
podCreationTimestamp="2026-04-17 16:41:47 +0000 UTC" firstStartedPulling="2026-04-17 16:41:48.162168411 +0000 UTC m=+627.627351259" lastFinishedPulling="2026-04-17 16:41:48.481216134 +0000 UTC m=+627.946398988" observedRunningTime="2026-04-17 16:41:49.187570508 +0000 UTC m=+628.652753372" watchObservedRunningTime="2026-04-17 16:41:49.1882013 +0000 UTC m=+628.653384161" Apr 17 16:42:20.180117 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:20.180085 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-85bb65f8c4-pxv2p" Apr 17 16:42:21.058627 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:21.058598 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-jgrx2"] Apr 17 16:42:21.059008 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:21.058992 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="afbd20eb-5bb2-409b-91b8-819a55c6ab1b" containerName="manager" Apr 17 16:42:21.059050 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:21.059011 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="afbd20eb-5bb2-409b-91b8-819a55c6ab1b" containerName="manager" Apr 17 16:42:21.059096 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:21.059086 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="afbd20eb-5bb2-409b-91b8-819a55c6ab1b" containerName="manager" Apr 17 16:42:21.062071 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:21.062052 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-jgrx2"
Apr 17 16:42:21.065603 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:21.065578 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-pk829\""
Apr 17 16:42:21.065603 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:21.065600 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 17 16:42:21.071938 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:21.071910 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-jgrx2"]
Apr 17 16:42:21.126322 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:21.126295 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd7mm\" (UniqueName: \"kubernetes.io/projected/f9d8c8af-6615-4eb8-9536-0e74a266f1f1-kube-api-access-qd7mm\") pod \"odh-model-controller-696fc77849-jgrx2\" (UID: \"f9d8c8af-6615-4eb8-9536-0e74a266f1f1\") " pod="kserve/odh-model-controller-696fc77849-jgrx2"
Apr 17 16:42:21.126448 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:21.126371 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9d8c8af-6615-4eb8-9536-0e74a266f1f1-cert\") pod \"odh-model-controller-696fc77849-jgrx2\" (UID: \"f9d8c8af-6615-4eb8-9536-0e74a266f1f1\") " pod="kserve/odh-model-controller-696fc77849-jgrx2"
Apr 17 16:42:21.227434 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:21.227400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qd7mm\" (UniqueName: \"kubernetes.io/projected/f9d8c8af-6615-4eb8-9536-0e74a266f1f1-kube-api-access-qd7mm\") pod \"odh-model-controller-696fc77849-jgrx2\" (UID: \"f9d8c8af-6615-4eb8-9536-0e74a266f1f1\") " pod="kserve/odh-model-controller-696fc77849-jgrx2"
Apr 17 16:42:21.227851 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:21.227451 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9d8c8af-6615-4eb8-9536-0e74a266f1f1-cert\") pod \"odh-model-controller-696fc77849-jgrx2\" (UID: \"f9d8c8af-6615-4eb8-9536-0e74a266f1f1\") " pod="kserve/odh-model-controller-696fc77849-jgrx2"
Apr 17 16:42:21.230052 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:21.230027 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9d8c8af-6615-4eb8-9536-0e74a266f1f1-cert\") pod \"odh-model-controller-696fc77849-jgrx2\" (UID: \"f9d8c8af-6615-4eb8-9536-0e74a266f1f1\") " pod="kserve/odh-model-controller-696fc77849-jgrx2"
Apr 17 16:42:21.235172 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:21.235149 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd7mm\" (UniqueName: \"kubernetes.io/projected/f9d8c8af-6615-4eb8-9536-0e74a266f1f1-kube-api-access-qd7mm\") pod \"odh-model-controller-696fc77849-jgrx2\" (UID: \"f9d8c8af-6615-4eb8-9536-0e74a266f1f1\") " pod="kserve/odh-model-controller-696fc77849-jgrx2"
Apr 17 16:42:21.374758 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:21.374673 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-jgrx2"
Apr 17 16:42:21.514287 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:21.514255 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-jgrx2"]
Apr 17 16:42:21.515808 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:42:21.515758 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9d8c8af_6615_4eb8_9536_0e74a266f1f1.slice/crio-306c76ef7fce9cb056c7603d9f45293d890739d2748c5ec62d27ea75be35137d WatchSource:0}: Error finding container 306c76ef7fce9cb056c7603d9f45293d890739d2748c5ec62d27ea75be35137d: Status 404 returned error can't find the container with id 306c76ef7fce9cb056c7603d9f45293d890739d2748c5ec62d27ea75be35137d
Apr 17 16:42:22.282802 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:22.282765 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-jgrx2" event={"ID":"f9d8c8af-6615-4eb8-9536-0e74a266f1f1","Type":"ContainerStarted","Data":"306c76ef7fce9cb056c7603d9f45293d890739d2748c5ec62d27ea75be35137d"}
Apr 17 16:42:24.290935 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:24.290894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-jgrx2" event={"ID":"f9d8c8af-6615-4eb8-9536-0e74a266f1f1","Type":"ContainerStarted","Data":"a8aceaed3fcab15ce22ff0bc206070468c9a3738d0de1e5ac7dffcd1b48b4f11"}
Apr 17 16:42:24.291325 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:24.290963 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-jgrx2"
Apr 17 16:42:24.307702 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:24.307649 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-jgrx2" podStartSLOduration=0.949154531 podStartE2EDuration="3.30763684s" podCreationTimestamp="2026-04-17 16:42:21 +0000 UTC" firstStartedPulling="2026-04-17 16:42:21.517442069 +0000 UTC m=+660.982624915" lastFinishedPulling="2026-04-17 16:42:23.875924385 +0000 UTC m=+663.341107224" observedRunningTime="2026-04-17 16:42:24.306678994 +0000 UTC m=+663.771861856" watchObservedRunningTime="2026-04-17 16:42:24.30763684 +0000 UTC m=+663.772819700"
Apr 17 16:42:35.297640 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:35.297609 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-jgrx2"
Apr 17 16:42:56.396061 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.396019 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"]
Apr 17 16:42:56.400330 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.400306 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.402661 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.402633 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\""
Apr 17 16:42:56.402770 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.402666 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 17 16:42:56.402770 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.402726 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-487zr\""
Apr 17 16:42:56.402869 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.402839 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 17 16:42:56.411023 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.410998 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"]
Apr 17 16:42:56.453147 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.453107 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/faf927d0-b1ed-4cf1-87fe-cefd8c511440-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.453147 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.453154 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/faf927d0-b1ed-4cf1-87fe-cefd8c511440-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.453416 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.453241 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/faf927d0-b1ed-4cf1-87fe-cefd8c511440-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.453416 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.453311 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxx2m\" (UniqueName: \"kubernetes.io/projected/faf927d0-b1ed-4cf1-87fe-cefd8c511440-kube-api-access-fxx2m\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.453416 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.453346 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/faf927d0-b1ed-4cf1-87fe-cefd8c511440-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.453416 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.453381 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/faf927d0-b1ed-4cf1-87fe-cefd8c511440-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.453593 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.453453 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/faf927d0-b1ed-4cf1-87fe-cefd8c511440-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.453593 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.453470 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/faf927d0-b1ed-4cf1-87fe-cefd8c511440-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.453593 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.453489 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/faf927d0-b1ed-4cf1-87fe-cefd8c511440-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.554195 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.554160 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/faf927d0-b1ed-4cf1-87fe-cefd8c511440-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.554195 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.554198 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/faf927d0-b1ed-4cf1-87fe-cefd8c511440-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.554460 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.554216 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/faf927d0-b1ed-4cf1-87fe-cefd8c511440-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.554460 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.554243 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/faf927d0-b1ed-4cf1-87fe-cefd8c511440-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.554460 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.554265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/faf927d0-b1ed-4cf1-87fe-cefd8c511440-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.554460 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.554289 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/faf927d0-b1ed-4cf1-87fe-cefd8c511440-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.554460 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.554344 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxx2m\" (UniqueName: \"kubernetes.io/projected/faf927d0-b1ed-4cf1-87fe-cefd8c511440-kube-api-access-fxx2m\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.554460 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.554376 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/faf927d0-b1ed-4cf1-87fe-cefd8c511440-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.554460 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.554412 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/faf927d0-b1ed-4cf1-87fe-cefd8c511440-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.554894 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.554635 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/faf927d0-b1ed-4cf1-87fe-cefd8c511440-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.554894 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.554829 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/faf927d0-b1ed-4cf1-87fe-cefd8c511440-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.555063 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.554966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/faf927d0-b1ed-4cf1-87fe-cefd8c511440-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.555118 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.555085 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/faf927d0-b1ed-4cf1-87fe-cefd8c511440-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.555278 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.555261 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/faf927d0-b1ed-4cf1-87fe-cefd8c511440-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.557249 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.557217 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/faf927d0-b1ed-4cf1-87fe-cefd8c511440-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.557362 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.557331 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/faf927d0-b1ed-4cf1-87fe-cefd8c511440-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.562767 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.562738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/faf927d0-b1ed-4cf1-87fe-cefd8c511440-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.562925 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.562904 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxx2m\" (UniqueName: \"kubernetes.io/projected/faf927d0-b1ed-4cf1-87fe-cefd8c511440-kube-api-access-fxx2m\") pod \"router-gateway-1-openshift-default-6c59fbf55c-lp2s8\" (UID: \"faf927d0-b1ed-4cf1-87fe-cefd8c511440\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.713842 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.713805 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:56.841185 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.841154 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"]
Apr 17 16:42:56.844082 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:42:56.844051 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaf927d0_b1ed_4cf1_87fe_cefd8c511440.slice/crio-98c373f47d4a506d9b8da8c02d7036928e289075ff847f87ef6d13927fcc0939 WatchSource:0}: Error finding container 98c373f47d4a506d9b8da8c02d7036928e289075ff847f87ef6d13927fcc0939: Status 404 returned error can't find the container with id 98c373f47d4a506d9b8da8c02d7036928e289075ff847f87ef6d13927fcc0939
Apr 17 16:42:56.847105 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.846655 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 16:42:56.847105 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.846740 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 16:42:56.847105 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:56.846787 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 16:42:57.405427 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:57.405390 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8" event={"ID":"faf927d0-b1ed-4cf1-87fe-cefd8c511440","Type":"ContainerStarted","Data":"bf356d1640a40881b48bf4add6aff5fa5668a40abde8ee9a2f7b58a0964940c4"}
Apr 17 16:42:57.405427 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:57.405428 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8" event={"ID":"faf927d0-b1ed-4cf1-87fe-cefd8c511440","Type":"ContainerStarted","Data":"98c373f47d4a506d9b8da8c02d7036928e289075ff847f87ef6d13927fcc0939"}
Apr 17 16:42:57.427426 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:57.427374 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8" podStartSLOduration=1.427360004 podStartE2EDuration="1.427360004s" podCreationTimestamp="2026-04-17 16:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:42:57.424123528 +0000 UTC m=+696.889306391" watchObservedRunningTime="2026-04-17 16:42:57.427360004 +0000 UTC m=+696.892542866"
Apr 17 16:42:57.714600 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:57.714567 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:57.719192 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:57.719165 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:58.409572 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:58.409540 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:42:58.410725 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:42:58.410707 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-lp2s8"
Apr 17 16:43:08.475615 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.475529 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"]
Apr 17 16:43:08.479232 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.479206 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:08.482789 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.482764 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\""
Apr 17 16:43:08.482912 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.482807 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ld29h\""
Apr 17 16:43:08.490094 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.490067 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"]
Apr 17 16:43:08.575469 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.575422 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:08.575469 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.575471 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:08.575681 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.575529 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:08.575681 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.575603 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnkxg\" (UniqueName: \"kubernetes.io/projected/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-kube-api-access-nnkxg\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:08.575681 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.575656 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:08.575799 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.575776 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:08.676845 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.676804 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:08.676845 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.676843 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:08.677070 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.676867 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:08.677070 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.676992 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnkxg\" (UniqueName: \"kubernetes.io/projected/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-kube-api-access-nnkxg\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:08.677070 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.677043 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:08.677227 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.677127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:08.677284 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.677255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:08.677359 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.677322 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:08.677462 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.677441 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:08.679158 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.679123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:08.679471 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.679452 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:08.685125 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.685107 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnkxg\" (UniqueName: \"kubernetes.io/projected/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-kube-api-access-nnkxg\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:08.792600 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.792478 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:08.933569 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:08.933533 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"]
Apr 17 16:43:08.936681 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:43:08.936647 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b789eac_b7f2_4dc7_b92d_c1bed098d4a0.slice/crio-2bdf5a5aecf714a98564376c4529f2a4de46773fd67ddcff88af8bf1d4ac9f8d WatchSource:0}: Error finding container 2bdf5a5aecf714a98564376c4529f2a4de46773fd67ddcff88af8bf1d4ac9f8d: Status 404 returned error can't find the container with id 2bdf5a5aecf714a98564376c4529f2a4de46773fd67ddcff88af8bf1d4ac9f8d
Apr 17 16:43:09.450142 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:09.450097 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5" event={"ID":"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0","Type":"ContainerStarted","Data":"2bdf5a5aecf714a98564376c4529f2a4de46773fd67ddcff88af8bf1d4ac9f8d"}
Apr 17 16:43:13.471565 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:13.471529 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5" event={"ID":"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0","Type":"ContainerStarted","Data":"66c0b9ebe0977f138a5e733396f894be8d24122289b4f928c4169405fd50602c"}
Apr 17 16:43:17.486566 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:17.486530 2576 generic.go:358] "Generic (PLEG): container finished" podID="9b789eac-b7f2-4dc7-b92d-c1bed098d4a0" containerID="66c0b9ebe0977f138a5e733396f894be8d24122289b4f928c4169405fd50602c" exitCode=0
Apr 17 16:43:17.486937 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:17.486604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5" event={"ID":"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0","Type":"ContainerDied","Data":"66c0b9ebe0977f138a5e733396f894be8d24122289b4f928c4169405fd50602c"}
Apr 17 16:43:19.494854 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:19.494819 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5" event={"ID":"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0","Type":"ContainerStarted","Data":"342d06b51665fbb5222c85e68c6f2183f5f6b3396b47b6c4e378053c9998a2d8"}
Apr 17 16:43:19.513588 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:19.513537 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5" podStartSLOduration=1.907705144 podStartE2EDuration="11.513522507s" podCreationTimestamp="2026-04-17 16:43:08 +0000 UTC" firstStartedPulling="2026-04-17 16:43:08.938557766 +0000 UTC m=+708.403740607" lastFinishedPulling="2026-04-17 16:43:18.544375128 +0000 UTC m=+718.009557970" observedRunningTime="2026-04-17 16:43:19.511364457 +0000 UTC m=+718.976547314" watchObservedRunningTime="2026-04-17 16:43:19.513522507 +0000 UTC m=+718.978705367"
Apr 17 16:43:28.793611 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:28.793573 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:28.794097 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:28.793622 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:28.806555 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:28.806534 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:43:29.537889 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:43:29.537860 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"
Apr 17 16:44:29.222758 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.222719 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h"]
Apr 17 16:44:29.226621 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.226600 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:44:29.229227 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.229204 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-fqv9r\"" Apr 17 16:44:29.229351 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.229317 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 17 16:44:29.238041 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.238020 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h"] Apr 17 16:44:29.346332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.346295 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:44:29.346543 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.346351 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:44:29.346543 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.346385 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:44:29.346543 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.346431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:44:29.346543 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.346523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5jdz\" (UniqueName: \"kubernetes.io/projected/04855268-e070-4a32-bdb9-1d7af0b4d8cd-kube-api-access-z5jdz\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:44:29.346692 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.346567 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:44:29.447917 ip-10-0-134-142 kubenswrapper[2576]: I0417 
16:44:29.447876 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:44:29.448114 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.447943 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5jdz\" (UniqueName: \"kubernetes.io/projected/04855268-e070-4a32-bdb9-1d7af0b4d8cd-kube-api-access-z5jdz\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:44:29.448114 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.447989 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:44:29.448114 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.448039 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:44:29.448114 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.448088 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:44:29.448324 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.448120 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:44:29.448383 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.448326 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:44:29.448437 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.448396 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:44:29.448527 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.448431 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:44:29.448596 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.448491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:44:29.450721 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.450700 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:44:29.455745 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.455728 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5jdz\" (UniqueName: \"kubernetes.io/projected/04855268-e070-4a32-bdb9-1d7af0b4d8cd-kube-api-access-z5jdz\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:44:29.537969 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.537875 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:44:29.671582 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.671555 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h"] Apr 17 16:44:29.673654 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:44:29.673625 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04855268_e070_4a32_bdb9_1d7af0b4d8cd.slice/crio-5a6d7b5cc6c9582baddf1ba380285b139292cfeaf6fd854e03aefec231bf2beb WatchSource:0}: Error finding container 5a6d7b5cc6c9582baddf1ba380285b139292cfeaf6fd854e03aefec231bf2beb: Status 404 returned error can't find the container with id 5a6d7b5cc6c9582baddf1ba380285b139292cfeaf6fd854e03aefec231bf2beb Apr 17 16:44:29.675432 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.675414 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:44:29.731027 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:29.731001 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" event={"ID":"04855268-e070-4a32-bdb9-1d7af0b4d8cd","Type":"ContainerStarted","Data":"5a6d7b5cc6c9582baddf1ba380285b139292cfeaf6fd854e03aefec231bf2beb"} Apr 17 16:44:30.736030 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:30.735941 2576 generic.go:358] "Generic (PLEG): container finished" podID="04855268-e070-4a32-bdb9-1d7af0b4d8cd" containerID="ff57497df483f4fbf00262c71495dd8ce808887df84021cf18039ef9cbf4741b" exitCode=0 Apr 17 16:44:30.736030 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:30.735982 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" 
event={"ID":"04855268-e070-4a32-bdb9-1d7af0b4d8cd","Type":"ContainerDied","Data":"ff57497df483f4fbf00262c71495dd8ce808887df84021cf18039ef9cbf4741b"} Apr 17 16:44:31.748586 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:31.748534 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" event={"ID":"04855268-e070-4a32-bdb9-1d7af0b4d8cd","Type":"ContainerStarted","Data":"603a2d861dafeddac166196bde88057e21c364e7b5f6a06a5428eb746f8b2b72"} Apr 17 16:44:40.803690 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:40.803546 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"] Apr 17 16:44:40.804176 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:40.803935 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5" podUID="9b789eac-b7f2-4dc7-b92d-c1bed098d4a0" containerName="main" containerID="cri-o://342d06b51665fbb5222c85e68c6f2183f5f6b3396b47b6c4e378053c9998a2d8" gracePeriod=30 Apr 17 16:44:41.088787 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.088748 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5" Apr 17 16:44:41.167259 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.167231 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-model-cache\") pod \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " Apr 17 16:44:41.167259 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.167266 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-home\") pod \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " Apr 17 16:44:41.167529 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.167290 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnkxg\" (UniqueName: \"kubernetes.io/projected/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-kube-api-access-nnkxg\") pod \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " Apr 17 16:44:41.167529 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.167368 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-tls-certs\") pod \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " Apr 17 16:44:41.167529 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.167417 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-dshm\") pod \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " Apr 17 16:44:41.167529 ip-10-0-134-142 kubenswrapper[2576]: I0417 
16:44:41.167441 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-kserve-provision-location\") pod \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\" (UID: \"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0\") " Apr 17 16:44:41.167750 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.167561 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-model-cache" (OuterVolumeSpecName: "model-cache") pod "9b789eac-b7f2-4dc7-b92d-c1bed098d4a0" (UID: "9b789eac-b7f2-4dc7-b92d-c1bed098d4a0"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:44:41.167750 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.167577 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-home" (OuterVolumeSpecName: "home") pod "9b789eac-b7f2-4dc7-b92d-c1bed098d4a0" (UID: "9b789eac-b7f2-4dc7-b92d-c1bed098d4a0"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:44:41.169312 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.169263 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-model-cache\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:44:41.169312 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.169285 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-home\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:44:41.170684 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.170641 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9b789eac-b7f2-4dc7-b92d-c1bed098d4a0" (UID: "9b789eac-b7f2-4dc7-b92d-c1bed098d4a0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:44:41.170803 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.170727 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-dshm" (OuterVolumeSpecName: "dshm") pod "9b789eac-b7f2-4dc7-b92d-c1bed098d4a0" (UID: "9b789eac-b7f2-4dc7-b92d-c1bed098d4a0"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:44:41.170941 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.170898 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-kube-api-access-nnkxg" (OuterVolumeSpecName: "kube-api-access-nnkxg") pod "9b789eac-b7f2-4dc7-b92d-c1bed098d4a0" (UID: "9b789eac-b7f2-4dc7-b92d-c1bed098d4a0"). InnerVolumeSpecName "kube-api-access-nnkxg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:44:41.253984 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.253875 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9b789eac-b7f2-4dc7-b92d-c1bed098d4a0" (UID: "9b789eac-b7f2-4dc7-b92d-c1bed098d4a0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:44:41.270429 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.270395 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-tls-certs\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:44:41.270429 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.270427 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-dshm\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:44:41.270692 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.270442 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-kserve-provision-location\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:44:41.270692 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.270458 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nnkxg\" (UniqueName: \"kubernetes.io/projected/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0-kube-api-access-nnkxg\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:44:41.791195 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.791158 2576 generic.go:358] "Generic (PLEG): container finished" podID="9b789eac-b7f2-4dc7-b92d-c1bed098d4a0" 
containerID="342d06b51665fbb5222c85e68c6f2183f5f6b3396b47b6c4e378053c9998a2d8" exitCode=0 Apr 17 16:44:41.791195 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.791189 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5" event={"ID":"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0","Type":"ContainerDied","Data":"342d06b51665fbb5222c85e68c6f2183f5f6b3396b47b6c4e378053c9998a2d8"} Apr 17 16:44:41.791437 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.791242 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5" event={"ID":"9b789eac-b7f2-4dc7-b92d-c1bed098d4a0","Type":"ContainerDied","Data":"2bdf5a5aecf714a98564376c4529f2a4de46773fd67ddcff88af8bf1d4ac9f8d"} Apr 17 16:44:41.791437 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.791250 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5" Apr 17 16:44:41.791437 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.791254 2576 scope.go:117] "RemoveContainer" containerID="342d06b51665fbb5222c85e68c6f2183f5f6b3396b47b6c4e378053c9998a2d8" Apr 17 16:44:41.802307 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.802286 2576 scope.go:117] "RemoveContainer" containerID="66c0b9ebe0977f138a5e733396f894be8d24122289b4f928c4169405fd50602c" Apr 17 16:44:41.813214 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.813022 2576 scope.go:117] "RemoveContainer" containerID="342d06b51665fbb5222c85e68c6f2183f5f6b3396b47b6c4e378053c9998a2d8" Apr 17 16:44:41.813455 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:44:41.813299 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"342d06b51665fbb5222c85e68c6f2183f5f6b3396b47b6c4e378053c9998a2d8\": container with ID starting with 
342d06b51665fbb5222c85e68c6f2183f5f6b3396b47b6c4e378053c9998a2d8 not found: ID does not exist" containerID="342d06b51665fbb5222c85e68c6f2183f5f6b3396b47b6c4e378053c9998a2d8" Apr 17 16:44:41.813455 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.813327 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"342d06b51665fbb5222c85e68c6f2183f5f6b3396b47b6c4e378053c9998a2d8"} err="failed to get container status \"342d06b51665fbb5222c85e68c6f2183f5f6b3396b47b6c4e378053c9998a2d8\": rpc error: code = NotFound desc = could not find container \"342d06b51665fbb5222c85e68c6f2183f5f6b3396b47b6c4e378053c9998a2d8\": container with ID starting with 342d06b51665fbb5222c85e68c6f2183f5f6b3396b47b6c4e378053c9998a2d8 not found: ID does not exist" Apr 17 16:44:41.813455 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.813346 2576 scope.go:117] "RemoveContainer" containerID="66c0b9ebe0977f138a5e733396f894be8d24122289b4f928c4169405fd50602c" Apr 17 16:44:41.813676 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:44:41.813622 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66c0b9ebe0977f138a5e733396f894be8d24122289b4f928c4169405fd50602c\": container with ID starting with 66c0b9ebe0977f138a5e733396f894be8d24122289b4f928c4169405fd50602c not found: ID does not exist" containerID="66c0b9ebe0977f138a5e733396f894be8d24122289b4f928c4169405fd50602c" Apr 17 16:44:41.813676 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.813653 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66c0b9ebe0977f138a5e733396f894be8d24122289b4f928c4169405fd50602c"} err="failed to get container status \"66c0b9ebe0977f138a5e733396f894be8d24122289b4f928c4169405fd50602c\": rpc error: code = NotFound desc = could not find container \"66c0b9ebe0977f138a5e733396f894be8d24122289b4f928c4169405fd50602c\": container with ID starting with 
66c0b9ebe0977f138a5e733396f894be8d24122289b4f928c4169405fd50602c not found: ID does not exist" Apr 17 16:44:41.819557 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.819531 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"] Apr 17 16:44:41.824087 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:41.824066 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-6b875cbc8b84dz5"] Apr 17 16:44:43.144046 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:44:43.144008 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b789eac-b7f2-4dc7-b92d-c1bed098d4a0" path="/var/lib/kubelet/pods/9b789eac-b7f2-4dc7-b92d-c1bed098d4a0/volumes" Apr 17 16:45:00.866374 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:00.866269 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" event={"ID":"04855268-e070-4a32-bdb9-1d7af0b4d8cd","Type":"ContainerStarted","Data":"64d7919752c48dcd335e1abae48269fffdf43ce644efb4f5fb4e20cb7478d2b5"} Apr 17 16:45:00.866860 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:00.866465 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:45:00.886679 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:00.886617 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" podStartSLOduration=2.141453527 podStartE2EDuration="31.886600045s" podCreationTimestamp="2026-04-17 16:44:29 +0000 UTC" firstStartedPulling="2026-04-17 16:44:30.737251466 +0000 UTC m=+790.202434305" lastFinishedPulling="2026-04-17 16:45:00.482397964 +0000 UTC m=+819.947580823" observedRunningTime="2026-04-17 
16:45:00.885378232 +0000 UTC m=+820.350561096" watchObservedRunningTime="2026-04-17 16:45:00.886600045 +0000 UTC m=+820.351782921" Apr 17 16:45:01.873531 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:01.873479 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" podUID="04855268-e070-4a32-bdb9-1d7af0b4d8cd" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 17 16:45:03.271986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:03.271953 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h"] Apr 17 16:45:03.272352 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:03.272312 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" podUID="04855268-e070-4a32-bdb9-1d7af0b4d8cd" containerName="main" containerID="cri-o://603a2d861dafeddac166196bde88057e21c364e7b5f6a06a5428eb746f8b2b72" gracePeriod=30 Apr 17 16:45:03.272572 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:03.272539 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" podUID="04855268-e070-4a32-bdb9-1d7af0b4d8cd" containerName="tokenizer" containerID="cri-o://64d7919752c48dcd335e1abae48269fffdf43ce644efb4f5fb4e20cb7478d2b5" gracePeriod=30 Apr 17 16:45:03.276280 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:03.276229 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" podUID="04855268-e070-4a32-bdb9-1d7af0b4d8cd" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 17 16:45:03.879333 ip-10-0-134-142 
kubenswrapper[2576]: I0417 16:45:03.879298 2576 generic.go:358] "Generic (PLEG): container finished" podID="04855268-e070-4a32-bdb9-1d7af0b4d8cd" containerID="603a2d861dafeddac166196bde88057e21c364e7b5f6a06a5428eb746f8b2b72" exitCode=0 Apr 17 16:45:03.879531 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:03.879352 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" event={"ID":"04855268-e070-4a32-bdb9-1d7af0b4d8cd","Type":"ContainerDied","Data":"603a2d861dafeddac166196bde88057e21c364e7b5f6a06a5428eb746f8b2b72"} Apr 17 16:45:09.538482 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:09.538442 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:45:13.273195 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:45:13.273165 2576 logging.go:55] [core] [Channel #23 SubChannel #24]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.34:9003", ServerName: "10.134.0.34:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.34:9003: connect: connection refused" Apr 17 16:45:14.273184 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:14.273142 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" podUID="04855268-e070-4a32-bdb9-1d7af0b4d8cd" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.34:9003\" within 1s: context deadline exceeded" Apr 17 16:45:23.273078 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:45:23.273050 2576 logging.go:55] [core] [Channel #25 SubChannel #26]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.34:9003", ServerName: "10.134.0.34:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.34:9003: connect: connection refused" Apr 17 16:45:24.273411 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:24.273366 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" podUID="04855268-e070-4a32-bdb9-1d7af0b4d8cd" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.34:9003\" within 1s: context deadline exceeded" Apr 17 16:45:33.273149 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:45:33.273115 2576 logging.go:55] [core] [Channel #27 SubChannel #28]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.34:9003", ServerName: "10.134.0.34:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.34:9003: connect: connection refused" Apr 17 16:45:33.984604 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:33.984579 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h_04855268-e070-4a32-bdb9-1d7af0b4d8cd/tokenizer/0.log" Apr 17 16:45:33.985284 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:33.985258 2576 generic.go:358] "Generic (PLEG): container finished" podID="04855268-e070-4a32-bdb9-1d7af0b4d8cd" containerID="64d7919752c48dcd335e1abae48269fffdf43ce644efb4f5fb4e20cb7478d2b5" exitCode=137 Apr 17 16:45:33.985365 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:33.985324 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" event={"ID":"04855268-e070-4a32-bdb9-1d7af0b4d8cd","Type":"ContainerDied","Data":"64d7919752c48dcd335e1abae48269fffdf43ce644efb4f5fb4e20cb7478d2b5"} Apr 17 16:45:33.985365 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:33.985359 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" event={"ID":"04855268-e070-4a32-bdb9-1d7af0b4d8cd","Type":"ContainerDied","Data":"5a6d7b5cc6c9582baddf1ba380285b139292cfeaf6fd854e03aefec231bf2beb"} Apr 17 16:45:33.985431 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:33.985370 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a6d7b5cc6c9582baddf1ba380285b139292cfeaf6fd854e03aefec231bf2beb" Apr 17 16:45:33.989218 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:33.989199 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h_04855268-e070-4a32-bdb9-1d7af0b4d8cd/tokenizer/0.log" Apr 17 16:45:33.989822 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:33.989808 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:45:34.057157 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:34.057122 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-kserve-provision-location\") pod \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " Apr 17 16:45:34.057346 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:34.057173 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tls-certs\") pod \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " Apr 17 16:45:34.057346 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:34.057195 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tokenizer-tmp\") pod \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " Apr 17 16:45:34.057346 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:34.057223 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5jdz\" (UniqueName: \"kubernetes.io/projected/04855268-e070-4a32-bdb9-1d7af0b4d8cd-kube-api-access-z5jdz\") pod \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " Apr 17 16:45:34.057346 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:34.057248 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tokenizer-cache\") pod \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " Apr 17 16:45:34.057346 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:34.057290 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tokenizer-uds\") pod \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\" (UID: \"04855268-e070-4a32-bdb9-1d7af0b4d8cd\") " Apr 17 16:45:34.057646 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:34.057618 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "04855268-e070-4a32-bdb9-1d7af0b4d8cd" (UID: "04855268-e070-4a32-bdb9-1d7af0b4d8cd"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:45:34.057717 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:34.057598 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "04855268-e070-4a32-bdb9-1d7af0b4d8cd" (UID: "04855268-e070-4a32-bdb9-1d7af0b4d8cd"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:45:34.057784 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:34.057758 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "04855268-e070-4a32-bdb9-1d7af0b4d8cd" (UID: "04855268-e070-4a32-bdb9-1d7af0b4d8cd"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:45:34.058076 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:34.058044 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "04855268-e070-4a32-bdb9-1d7af0b4d8cd" (UID: "04855268-e070-4a32-bdb9-1d7af0b4d8cd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:45:34.059586 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:34.059559 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04855268-e070-4a32-bdb9-1d7af0b4d8cd-kube-api-access-z5jdz" (OuterVolumeSpecName: "kube-api-access-z5jdz") pod "04855268-e070-4a32-bdb9-1d7af0b4d8cd" (UID: "04855268-e070-4a32-bdb9-1d7af0b4d8cd"). InnerVolumeSpecName "kube-api-access-z5jdz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:45:34.059586 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:34.059575 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "04855268-e070-4a32-bdb9-1d7af0b4d8cd" (UID: "04855268-e070-4a32-bdb9-1d7af0b4d8cd"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:45:34.159061 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:34.159034 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-kserve-provision-location\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:45:34.159061 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:34.159062 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tls-certs\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:45:34.159239 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:34.159074 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tokenizer-tmp\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:45:34.159239 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:34.159082 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z5jdz\" (UniqueName: \"kubernetes.io/projected/04855268-e070-4a32-bdb9-1d7af0b4d8cd-kube-api-access-z5jdz\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:45:34.159239 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:34.159091 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tokenizer-cache\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:45:34.159239 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:34.159100 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/04855268-e070-4a32-bdb9-1d7af0b4d8cd-tokenizer-uds\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:45:34.273869 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:34.273826 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" podUID="04855268-e070-4a32-bdb9-1d7af0b4d8cd" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.34:9003\" within 1s: context deadline exceeded" Apr 17 16:45:34.989790 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:34.989760 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h" Apr 17 16:45:35.013848 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:35.013813 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h"] Apr 17 16:45:35.017118 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:35.017093 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c69j75h"] Apr 17 16:45:35.143141 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:35.143106 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04855268-e070-4a32-bdb9-1d7af0b4d8cd" path="/var/lib/kubelet/pods/04855268-e070-4a32-bdb9-1d7af0b4d8cd/volumes" Apr 17 16:45:48.564347 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.564310 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv"] Apr 17 16:45:48.564926 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.564898 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b789eac-b7f2-4dc7-b92d-c1bed098d4a0" containerName="storage-initializer" Apr 17 16:45:48.564926 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.564927 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b789eac-b7f2-4dc7-b92d-c1bed098d4a0" containerName="storage-initializer" Apr 17 16:45:48.565077 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.564947 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04855268-e070-4a32-bdb9-1d7af0b4d8cd" containerName="storage-initializer" Apr 17 16:45:48.565077 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.564956 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="04855268-e070-4a32-bdb9-1d7af0b4d8cd" containerName="storage-initializer" Apr 17 16:45:48.565077 ip-10-0-134-142 
kubenswrapper[2576]: I0417 16:45:48.564970 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04855268-e070-4a32-bdb9-1d7af0b4d8cd" containerName="tokenizer" Apr 17 16:45:48.565077 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.564980 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="04855268-e070-4a32-bdb9-1d7af0b4d8cd" containerName="tokenizer" Apr 17 16:45:48.565077 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.564996 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04855268-e070-4a32-bdb9-1d7af0b4d8cd" containerName="main" Apr 17 16:45:48.565077 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.565004 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="04855268-e070-4a32-bdb9-1d7af0b4d8cd" containerName="main" Apr 17 16:45:48.565077 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.565029 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b789eac-b7f2-4dc7-b92d-c1bed098d4a0" containerName="main" Apr 17 16:45:48.565077 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.565037 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b789eac-b7f2-4dc7-b92d-c1bed098d4a0" containerName="main" Apr 17 16:45:48.565513 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.565126 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b789eac-b7f2-4dc7-b92d-c1bed098d4a0" containerName="main" Apr 17 16:45:48.565513 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.565138 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="04855268-e070-4a32-bdb9-1d7af0b4d8cd" containerName="tokenizer" Apr 17 16:45:48.565513 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.565147 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="04855268-e070-4a32-bdb9-1d7af0b4d8cd" containerName="main" Apr 17 16:45:48.568934 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.568918 2576 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:45:48.571729 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.571702 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ld29h\"" Apr 17 16:45:48.571839 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.571726 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"conv-test-round-trip-kserve-self-signed-certs\"" Apr 17 16:45:48.575601 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.575575 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv"] Apr 17 16:45:48.693963 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.693921 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ngd6\" (UniqueName: \"kubernetes.io/projected/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-kube-api-access-2ngd6\") pod \"conv-test-round-trip-kserve-8f7bd698b-h6fdv\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:45:48.693963 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.693966 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-dshm\") pod \"conv-test-round-trip-kserve-8f7bd698b-h6fdv\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:45:48.694199 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.694068 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-tls-certs\") pod 
\"conv-test-round-trip-kserve-8f7bd698b-h6fdv\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:45:48.694199 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.694166 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-model-cache\") pod \"conv-test-round-trip-kserve-8f7bd698b-h6fdv\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:45:48.694286 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.694204 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-home\") pod \"conv-test-round-trip-kserve-8f7bd698b-h6fdv\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:45:48.694286 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.694244 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-kserve-provision-location\") pod \"conv-test-round-trip-kserve-8f7bd698b-h6fdv\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:45:48.795394 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.795353 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ngd6\" (UniqueName: \"kubernetes.io/projected/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-kube-api-access-2ngd6\") pod \"conv-test-round-trip-kserve-8f7bd698b-h6fdv\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " 
pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:45:48.795394 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.795399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-dshm\") pod \"conv-test-round-trip-kserve-8f7bd698b-h6fdv\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:45:48.795653 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.795548 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-tls-certs\") pod \"conv-test-round-trip-kserve-8f7bd698b-h6fdv\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:45:48.795709 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.795694 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-model-cache\") pod \"conv-test-round-trip-kserve-8f7bd698b-h6fdv\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:45:48.795767 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.795737 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-home\") pod \"conv-test-round-trip-kserve-8f7bd698b-h6fdv\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:45:48.795816 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.795790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-kserve-provision-location\") pod \"conv-test-round-trip-kserve-8f7bd698b-h6fdv\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:45:48.796065 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.796043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-model-cache\") pod \"conv-test-round-trip-kserve-8f7bd698b-h6fdv\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:45:48.796155 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.796138 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-kserve-provision-location\") pod \"conv-test-round-trip-kserve-8f7bd698b-h6fdv\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:45:48.796155 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.796108 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-home\") pod \"conv-test-round-trip-kserve-8f7bd698b-h6fdv\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:45:48.797625 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.797605 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-dshm\") pod \"conv-test-round-trip-kserve-8f7bd698b-h6fdv\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " 
pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:45:48.798096 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.798077 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-tls-certs\") pod \"conv-test-round-trip-kserve-8f7bd698b-h6fdv\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:45:48.802965 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.802942 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ngd6\" (UniqueName: \"kubernetes.io/projected/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-kube-api-access-2ngd6\") pod \"conv-test-round-trip-kserve-8f7bd698b-h6fdv\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:45:48.882590 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:48.882473 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:45:49.013968 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:49.013800 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv"] Apr 17 16:45:49.016412 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:45:49.016382 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93ec4369_55b6_467e_a2e1_bc214b5bb5b3.slice/crio-e2e08a82042f5d26b458644facbaad47caa58ca75a3f19c8819147a82e990cc5 WatchSource:0}: Error finding container e2e08a82042f5d26b458644facbaad47caa58ca75a3f19c8819147a82e990cc5: Status 404 returned error can't find the container with id e2e08a82042f5d26b458644facbaad47caa58ca75a3f19c8819147a82e990cc5 Apr 17 16:45:49.040276 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:49.040246 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" event={"ID":"93ec4369-55b6-467e-a2e1-bc214b5bb5b3","Type":"ContainerStarted","Data":"e2e08a82042f5d26b458644facbaad47caa58ca75a3f19c8819147a82e990cc5"} Apr 17 16:45:50.045518 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:50.045448 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" event={"ID":"93ec4369-55b6-467e-a2e1-bc214b5bb5b3","Type":"ContainerStarted","Data":"0f430d6c325ef0dbb6086472e18c37a5c27d05f4c522a5d8dc71b24016dc25fc"} Apr 17 16:45:55.064754 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:55.064718 2576 generic.go:358] "Generic (PLEG): container finished" podID="93ec4369-55b6-467e-a2e1-bc214b5bb5b3" containerID="0f430d6c325ef0dbb6086472e18c37a5c27d05f4c522a5d8dc71b24016dc25fc" exitCode=0 Apr 17 16:45:55.065140 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:45:55.064794 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" event={"ID":"93ec4369-55b6-467e-a2e1-bc214b5bb5b3","Type":"ContainerDied","Data":"0f430d6c325ef0dbb6086472e18c37a5c27d05f4c522a5d8dc71b24016dc25fc"} Apr 17 16:46:01.182728 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.182669 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb"] Apr 17 16:46:01.218285 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.218013 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb"] Apr 17 16:46:01.218285 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.218130 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:01.223221 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.223102 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-6ggx9\"" Apr 17 16:46:01.223392 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.223273 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 17 16:46:01.322082 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.321822 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:01.322082 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.321975 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:01.322082 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.322017 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:01.322376 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.322097 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmbnx\" (UniqueName: \"kubernetes.io/projected/53881084-45b9-45db-bbe2-a26292e7913b-kube-api-access-xmbnx\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:01.322376 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.322122 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53881084-45b9-45db-bbe2-a26292e7913b-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:01.322376 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.322141 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:01.423462 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.422869 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmbnx\" (UniqueName: \"kubernetes.io/projected/53881084-45b9-45db-bbe2-a26292e7913b-kube-api-access-xmbnx\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:01.423462 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.422932 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53881084-45b9-45db-bbe2-a26292e7913b-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:01.423462 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.422960 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:01.423462 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.423045 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:01.423462 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.423121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:01.423462 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.423148 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:01.423918 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.423471 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:01.423918 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.423572 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-tokenizer-uds\") pod 
\"stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:01.423918 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.423665 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:01.424076 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.423979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:01.428769 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.428740 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53881084-45b9-45db-bbe2-a26292e7913b-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:01.434134 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.434033 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmbnx\" (UniqueName: \"kubernetes.io/projected/53881084-45b9-45db-bbe2-a26292e7913b-kube-api-access-xmbnx\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb\" (UID: 
\"53881084-45b9-45db-bbe2-a26292e7913b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:01.533369 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:01.533329 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:03.952893 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:03.952853 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv"] Apr 17 16:46:07.684922 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:07.684869 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb"] Apr 17 16:46:07.686744 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:46:07.686711 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53881084_45b9_45db_bbe2_a26292e7913b.slice/crio-ce7763f6a0b321540609f46eeb8ec88492c1b59c766228874868fcefac490dca WatchSource:0}: Error finding container ce7763f6a0b321540609f46eeb8ec88492c1b59c766228874868fcefac490dca: Status 404 returned error can't find the container with id ce7763f6a0b321540609f46eeb8ec88492c1b59c766228874868fcefac490dca Apr 17 16:46:08.140755 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:08.140666 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" event={"ID":"53881084-45b9-45db-bbe2-a26292e7913b","Type":"ContainerStarted","Data":"575a4f33483a8cd40b12a9615b8947bea4ffeb247835c88e82e335c983bfe111"} Apr 17 16:46:08.140755 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:08.140710 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" 
event={"ID":"53881084-45b9-45db-bbe2-a26292e7913b","Type":"ContainerStarted","Data":"ce7763f6a0b321540609f46eeb8ec88492c1b59c766228874868fcefac490dca"} Apr 17 16:46:09.145426 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:09.145344 2576 generic.go:358] "Generic (PLEG): container finished" podID="53881084-45b9-45db-bbe2-a26292e7913b" containerID="575a4f33483a8cd40b12a9615b8947bea4ffeb247835c88e82e335c983bfe111" exitCode=0 Apr 17 16:46:09.145426 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:09.145400 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" event={"ID":"53881084-45b9-45db-bbe2-a26292e7913b","Type":"ContainerDied","Data":"575a4f33483a8cd40b12a9615b8947bea4ffeb247835c88e82e335c983bfe111"} Apr 17 16:46:21.190818 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:21.190782 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" event={"ID":"53881084-45b9-45db-bbe2-a26292e7913b","Type":"ContainerStarted","Data":"863e5267077d839df4b2b24adc3e329049ad0e6ad9fd6001b22054b9e6ae1642"} Apr 17 16:46:22.224157 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:22.224121 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/ovn-acl-logging/0.log" Apr 17 16:46:22.224711 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:22.224291 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/ovn-acl-logging/0.log" Apr 17 16:46:23.200025 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:23.199984 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" 
event={"ID":"53881084-45b9-45db-bbe2-a26292e7913b","Type":"ContainerStarted","Data":"7feaea3d7c4954ce163b5141f9cac5b46cc3c1e3cbb4b48db8e3ca4307ef5572"} Apr 17 16:46:23.200231 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:23.200114 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:23.202018 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:23.201987 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" event={"ID":"93ec4369-55b6-467e-a2e1-bc214b5bb5b3","Type":"ContainerStarted","Data":"7580588fe88017deedb4c1c1fc04f338948a5729142b79c9a0ad80a752cd4b01"} Apr 17 16:46:23.202135 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:23.202041 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" podUID="93ec4369-55b6-467e-a2e1-bc214b5bb5b3" containerName="main" containerID="cri-o://7580588fe88017deedb4c1c1fc04f338948a5729142b79c9a0ad80a752cd4b01" gracePeriod=30 Apr 17 16:46:23.223295 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:23.223244 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" podStartSLOduration=22.223228467 podStartE2EDuration="22.223228467s" podCreationTimestamp="2026-04-17 16:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:46:23.22069792 +0000 UTC m=+902.685880785" watchObservedRunningTime="2026-04-17 16:46:23.223228467 +0000 UTC m=+902.688411328" Apr 17 16:46:23.239835 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:23.239778 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" 
podStartSLOduration=7.6620426550000005 podStartE2EDuration="35.239758736s" podCreationTimestamp="2026-04-17 16:45:48 +0000 UTC" firstStartedPulling="2026-04-17 16:45:55.06598429 +0000 UTC m=+874.531167129" lastFinishedPulling="2026-04-17 16:46:22.643700371 +0000 UTC m=+902.108883210" observedRunningTime="2026-04-17 16:46:23.238438571 +0000 UTC m=+902.703621433" watchObservedRunningTime="2026-04-17 16:46:23.239758736 +0000 UTC m=+902.704941633" Apr 17 16:46:28.883362 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:28.883313 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:46:31.534417 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:31.534373 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:31.534417 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:31.534419 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:31.537372 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:31.537343 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:32.235620 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:32.235588 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:53.239488 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.239461 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:46:53.469896 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.469863 2576 log.go:25] "Finished parsing log 
file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-8f7bd698b-h6fdv_93ec4369-55b6-467e-a2e1-bc214b5bb5b3/main/0.log" Apr 17 16:46:53.470294 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.470277 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:46:53.535900 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.535869 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-dshm\") pod \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " Apr 17 16:46:53.535900 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.535903 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-kserve-provision-location\") pod \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " Apr 17 16:46:53.536108 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.535950 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-tls-certs\") pod \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " Apr 17 16:46:53.536108 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.535977 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-home\") pod \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " Apr 17 16:46:53.536108 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.536037 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-2ngd6\" (UniqueName: \"kubernetes.io/projected/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-kube-api-access-2ngd6\") pod \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " Apr 17 16:46:53.536108 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.536055 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-model-cache\") pod \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\" (UID: \"93ec4369-55b6-467e-a2e1-bc214b5bb5b3\") " Apr 17 16:46:53.536383 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.536332 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-home" (OuterVolumeSpecName: "home") pod "93ec4369-55b6-467e-a2e1-bc214b5bb5b3" (UID: "93ec4369-55b6-467e-a2e1-bc214b5bb5b3"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:46:53.536636 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.536608 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-model-cache" (OuterVolumeSpecName: "model-cache") pod "93ec4369-55b6-467e-a2e1-bc214b5bb5b3" (UID: "93ec4369-55b6-467e-a2e1-bc214b5bb5b3"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:46:53.538146 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.538125 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "93ec4369-55b6-467e-a2e1-bc214b5bb5b3" (UID: "93ec4369-55b6-467e-a2e1-bc214b5bb5b3"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:46:53.538410 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.538389 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-dshm" (OuterVolumeSpecName: "dshm") pod "93ec4369-55b6-467e-a2e1-bc214b5bb5b3" (UID: "93ec4369-55b6-467e-a2e1-bc214b5bb5b3"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:46:53.538551 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.538532 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-kube-api-access-2ngd6" (OuterVolumeSpecName: "kube-api-access-2ngd6") pod "93ec4369-55b6-467e-a2e1-bc214b5bb5b3" (UID: "93ec4369-55b6-467e-a2e1-bc214b5bb5b3"). InnerVolumeSpecName "kube-api-access-2ngd6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:46:53.590975 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.590929 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "93ec4369-55b6-467e-a2e1-bc214b5bb5b3" (UID: "93ec4369-55b6-467e-a2e1-bc214b5bb5b3"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:46:53.636797 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.636772 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2ngd6\" (UniqueName: \"kubernetes.io/projected/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-kube-api-access-2ngd6\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:46:53.636797 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.636798 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-model-cache\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:46:53.636968 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.636808 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-dshm\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:46:53.636968 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.636817 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-kserve-provision-location\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:46:53.636968 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.636826 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-tls-certs\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:46:53.636968 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:53.636835 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/93ec4369-55b6-467e-a2e1-bc214b5bb5b3-home\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:46:54.320462 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:54.320434 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-8f7bd698b-h6fdv_93ec4369-55b6-467e-a2e1-bc214b5bb5b3/main/0.log" Apr 17 16:46:54.320944 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:54.320784 2576 generic.go:358] "Generic (PLEG): container finished" podID="93ec4369-55b6-467e-a2e1-bc214b5bb5b3" containerID="7580588fe88017deedb4c1c1fc04f338948a5729142b79c9a0ad80a752cd4b01" exitCode=137 Apr 17 16:46:54.320944 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:54.320874 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" event={"ID":"93ec4369-55b6-467e-a2e1-bc214b5bb5b3","Type":"ContainerDied","Data":"7580588fe88017deedb4c1c1fc04f338948a5729142b79c9a0ad80a752cd4b01"} Apr 17 16:46:54.320944 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:54.320900 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" Apr 17 16:46:54.320944 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:54.320918 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv" event={"ID":"93ec4369-55b6-467e-a2e1-bc214b5bb5b3","Type":"ContainerDied","Data":"e2e08a82042f5d26b458644facbaad47caa58ca75a3f19c8819147a82e990cc5"} Apr 17 16:46:54.320944 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:54.320934 2576 scope.go:117] "RemoveContainer" containerID="7580588fe88017deedb4c1c1fc04f338948a5729142b79c9a0ad80a752cd4b01" Apr 17 16:46:54.329914 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:54.329897 2576 scope.go:117] "RemoveContainer" containerID="0f430d6c325ef0dbb6086472e18c37a5c27d05f4c522a5d8dc71b24016dc25fc" Apr 17 16:46:54.340023 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:54.340006 2576 scope.go:117] "RemoveContainer" containerID="7580588fe88017deedb4c1c1fc04f338948a5729142b79c9a0ad80a752cd4b01" Apr 17 
16:46:54.340241 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:46:54.340224 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7580588fe88017deedb4c1c1fc04f338948a5729142b79c9a0ad80a752cd4b01\": container with ID starting with 7580588fe88017deedb4c1c1fc04f338948a5729142b79c9a0ad80a752cd4b01 not found: ID does not exist" containerID="7580588fe88017deedb4c1c1fc04f338948a5729142b79c9a0ad80a752cd4b01" Apr 17 16:46:54.340279 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:54.340251 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7580588fe88017deedb4c1c1fc04f338948a5729142b79c9a0ad80a752cd4b01"} err="failed to get container status \"7580588fe88017deedb4c1c1fc04f338948a5729142b79c9a0ad80a752cd4b01\": rpc error: code = NotFound desc = could not find container \"7580588fe88017deedb4c1c1fc04f338948a5729142b79c9a0ad80a752cd4b01\": container with ID starting with 7580588fe88017deedb4c1c1fc04f338948a5729142b79c9a0ad80a752cd4b01 not found: ID does not exist" Apr 17 16:46:54.340279 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:54.340275 2576 scope.go:117] "RemoveContainer" containerID="0f430d6c325ef0dbb6086472e18c37a5c27d05f4c522a5d8dc71b24016dc25fc" Apr 17 16:46:54.340534 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:46:54.340506 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f430d6c325ef0dbb6086472e18c37a5c27d05f4c522a5d8dc71b24016dc25fc\": container with ID starting with 0f430d6c325ef0dbb6086472e18c37a5c27d05f4c522a5d8dc71b24016dc25fc not found: ID does not exist" containerID="0f430d6c325ef0dbb6086472e18c37a5c27d05f4c522a5d8dc71b24016dc25fc" Apr 17 16:46:54.340577 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:54.340545 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0f430d6c325ef0dbb6086472e18c37a5c27d05f4c522a5d8dc71b24016dc25fc"} err="failed to get container status \"0f430d6c325ef0dbb6086472e18c37a5c27d05f4c522a5d8dc71b24016dc25fc\": rpc error: code = NotFound desc = could not find container \"0f430d6c325ef0dbb6086472e18c37a5c27d05f4c522a5d8dc71b24016dc25fc\": container with ID starting with 0f430d6c325ef0dbb6086472e18c37a5c27d05f4c522a5d8dc71b24016dc25fc not found: ID does not exist" Apr 17 16:46:54.344897 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:54.344876 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv"] Apr 17 16:46:54.349562 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:54.349540 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-8f7bd698b-h6fdv"] Apr 17 16:46:55.142723 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:46:55.142684 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93ec4369-55b6-467e-a2e1-bc214b5bb5b3" path="/var/lib/kubelet/pods/93ec4369-55b6-467e-a2e1-bc214b5bb5b3/volumes" Apr 17 16:47:30.652395 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.652352 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf"] Apr 17 16:47:30.653066 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.652841 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93ec4369-55b6-467e-a2e1-bc214b5bb5b3" containerName="storage-initializer" Apr 17 16:47:30.653066 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.652855 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ec4369-55b6-467e-a2e1-bc214b5bb5b3" containerName="storage-initializer" Apr 17 16:47:30.653066 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.652882 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="93ec4369-55b6-467e-a2e1-bc214b5bb5b3" containerName="main" Apr 17 16:47:30.653066 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.652890 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ec4369-55b6-467e-a2e1-bc214b5bb5b3" containerName="main" Apr 17 16:47:30.653066 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.652942 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="93ec4369-55b6-467e-a2e1-bc214b5bb5b3" containerName="main" Apr 17 16:47:30.657293 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.657272 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:30.659519 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.659477 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 17 16:47:30.667171 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.667148 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf"] Apr 17 16:47:30.671242 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.671219 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-fbd7f9949-z6pkf\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:30.671385 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.671259 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-dshm\") pod \"custom-route-timeout-test-kserve-fbd7f9949-z6pkf\" (UID: 
\"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:30.671385 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.671287 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dps9s\" (UniqueName: \"kubernetes.io/projected/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-kube-api-access-dps9s\") pod \"custom-route-timeout-test-kserve-fbd7f9949-z6pkf\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:30.671490 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.671382 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-tls-certs\") pod \"custom-route-timeout-test-kserve-fbd7f9949-z6pkf\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:30.671560 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.671522 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-home\") pod \"custom-route-timeout-test-kserve-fbd7f9949-z6pkf\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:30.671560 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.671551 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-model-cache\") pod \"custom-route-timeout-test-kserve-fbd7f9949-z6pkf\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" 
Apr 17 16:47:30.772172 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.772133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-home\") pod \"custom-route-timeout-test-kserve-fbd7f9949-z6pkf\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:30.772172 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.772170 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-model-cache\") pod \"custom-route-timeout-test-kserve-fbd7f9949-z6pkf\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:30.772394 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.772200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-fbd7f9949-z6pkf\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:30.772394 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.772234 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-dshm\") pod \"custom-route-timeout-test-kserve-fbd7f9949-z6pkf\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:30.772394 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.772258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dps9s\" 
(UniqueName: \"kubernetes.io/projected/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-kube-api-access-dps9s\") pod \"custom-route-timeout-test-kserve-fbd7f9949-z6pkf\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:30.772394 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.772330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-tls-certs\") pod \"custom-route-timeout-test-kserve-fbd7f9949-z6pkf\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:30.772679 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.772652 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-home\") pod \"custom-route-timeout-test-kserve-fbd7f9949-z6pkf\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:30.772719 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.772675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-fbd7f9949-z6pkf\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:30.772770 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.772726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-model-cache\") pod \"custom-route-timeout-test-kserve-fbd7f9949-z6pkf\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:30.774688 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.774670 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-dshm\") pod \"custom-route-timeout-test-kserve-fbd7f9949-z6pkf\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:30.774970 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.774951 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-tls-certs\") pod \"custom-route-timeout-test-kserve-fbd7f9949-z6pkf\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:30.781400 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.781370 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dps9s\" (UniqueName: \"kubernetes.io/projected/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-kube-api-access-dps9s\") pod \"custom-route-timeout-test-kserve-fbd7f9949-z6pkf\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:30.967959 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:30.967874 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:31.096667 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:31.096640 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf"] Apr 17 16:47:31.098695 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:47:31.098666 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73a0e96b_1e5f_4541_9c54_7ce7e6f62ad2.slice/crio-488d427dbda14416d07cf5ec943d42510c4486539ca17f199fe421c8a7b86ce3 WatchSource:0}: Error finding container 488d427dbda14416d07cf5ec943d42510c4486539ca17f199fe421c8a7b86ce3: Status 404 returned error can't find the container with id 488d427dbda14416d07cf5ec943d42510c4486539ca17f199fe421c8a7b86ce3 Apr 17 16:47:31.460389 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:31.460350 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" event={"ID":"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2","Type":"ContainerStarted","Data":"95ebb5341089f980a8eb8ce657dee3fa83fc3cffe75689c625c3b762ab7f9349"} Apr 17 16:47:31.460574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:31.460395 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" event={"ID":"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2","Type":"ContainerStarted","Data":"488d427dbda14416d07cf5ec943d42510c4486539ca17f199fe421c8a7b86ce3"} Apr 17 16:47:35.478181 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:35.478072 2576 generic.go:358] "Generic (PLEG): container finished" podID="73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" containerID="95ebb5341089f980a8eb8ce657dee3fa83fc3cffe75689c625c3b762ab7f9349" exitCode=0 Apr 17 16:47:35.478181 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:35.478127 2576 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" event={"ID":"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2","Type":"ContainerDied","Data":"95ebb5341089f980a8eb8ce657dee3fa83fc3cffe75689c625c3b762ab7f9349"} Apr 17 16:47:36.484557 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:36.484511 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" event={"ID":"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2","Type":"ContainerStarted","Data":"af8435686b14fe7616a97c7de72081e4242ff90b7fc54935b3e218008bc9d299"} Apr 17 16:47:36.505292 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:36.505237 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" podStartSLOduration=6.505220283 podStartE2EDuration="6.505220283s" podCreationTimestamp="2026-04-17 16:47:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:47:36.503097198 +0000 UTC m=+975.968280063" watchObservedRunningTime="2026-04-17 16:47:36.505220283 +0000 UTC m=+975.970403143" Apr 17 16:47:40.968986 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:40.968953 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:40.969349 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:40.968996 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:47:40.970780 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:40.970749 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" podUID="73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 17 16:47:50.968455 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:50.968401 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" podUID="73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 17 16:47:52.836626 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:52.836586 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb"] Apr 17 16:47:52.837057 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:52.836923 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" podUID="53881084-45b9-45db-bbe2-a26292e7913b" containerName="main" containerID="cri-o://863e5267077d839df4b2b24adc3e329049ad0e6ad9fd6001b22054b9e6ae1642" gracePeriod=30 Apr 17 16:47:52.837057 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:52.836983 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" podUID="53881084-45b9-45db-bbe2-a26292e7913b" containerName="tokenizer" containerID="cri-o://7feaea3d7c4954ce163b5141f9cac5b46cc3c1e3cbb4b48db8e3ca4307ef5572" gracePeriod=30 Apr 17 16:47:53.238057 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:47:53.238020 2576 logging.go:55] [core] [Channel #74 SubChannel #75]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.36:9003", ServerName: "10.134.0.36:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.36:9003: connect: connection refused" Apr 17 16:47:53.546972 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:53.546895 2576 generic.go:358] "Generic (PLEG): container finished" podID="53881084-45b9-45db-bbe2-a26292e7913b" containerID="863e5267077d839df4b2b24adc3e329049ad0e6ad9fd6001b22054b9e6ae1642" exitCode=0 Apr 17 16:47:53.547105 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:53.546973 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" event={"ID":"53881084-45b9-45db-bbe2-a26292e7913b","Type":"ContainerDied","Data":"863e5267077d839df4b2b24adc3e329049ad0e6ad9fd6001b22054b9e6ae1642"} Apr 17 16:47:54.196054 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.196028 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:47:54.238302 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.238208 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" podUID="53881084-45b9-45db-bbe2-a26292e7913b" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.36:9003\" within 1s: context deadline exceeded" Apr 17 16:47:54.310489 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.310451 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-kserve-provision-location\") pod \"53881084-45b9-45db-bbe2-a26292e7913b\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " Apr 17 16:47:54.310697 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.310553 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-tokenizer-cache\") pod \"53881084-45b9-45db-bbe2-a26292e7913b\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " Apr 17 16:47:54.310697 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.310578 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmbnx\" (UniqueName: \"kubernetes.io/projected/53881084-45b9-45db-bbe2-a26292e7913b-kube-api-access-xmbnx\") pod \"53881084-45b9-45db-bbe2-a26292e7913b\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " Apr 17 16:47:54.310697 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.310671 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-tokenizer-uds\") pod \"53881084-45b9-45db-bbe2-a26292e7913b\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " Apr 17 16:47:54.310697 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.310693 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-tokenizer-tmp\") pod \"53881084-45b9-45db-bbe2-a26292e7913b\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " Apr 17 16:47:54.310906 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.310739 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53881084-45b9-45db-bbe2-a26292e7913b-tls-certs\") pod \"53881084-45b9-45db-bbe2-a26292e7913b\" (UID: \"53881084-45b9-45db-bbe2-a26292e7913b\") " Apr 17 16:47:54.310906 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.310814 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod 
"53881084-45b9-45db-bbe2-a26292e7913b" (UID: "53881084-45b9-45db-bbe2-a26292e7913b"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:47:54.311016 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.310894 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "53881084-45b9-45db-bbe2-a26292e7913b" (UID: "53881084-45b9-45db-bbe2-a26292e7913b"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:47:54.311100 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.311073 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-tokenizer-uds\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:47:54.311167 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.311114 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-tokenizer-cache\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:47:54.311167 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.311083 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "53881084-45b9-45db-bbe2-a26292e7913b" (UID: "53881084-45b9-45db-bbe2-a26292e7913b"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:47:54.311391 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.311364 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "53881084-45b9-45db-bbe2-a26292e7913b" (UID: "53881084-45b9-45db-bbe2-a26292e7913b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:47:54.312828 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.312802 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53881084-45b9-45db-bbe2-a26292e7913b-kube-api-access-xmbnx" (OuterVolumeSpecName: "kube-api-access-xmbnx") pod "53881084-45b9-45db-bbe2-a26292e7913b" (UID: "53881084-45b9-45db-bbe2-a26292e7913b"). InnerVolumeSpecName "kube-api-access-xmbnx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:47:54.312917 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.312862 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53881084-45b9-45db-bbe2-a26292e7913b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "53881084-45b9-45db-bbe2-a26292e7913b" (UID: "53881084-45b9-45db-bbe2-a26292e7913b"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:47:54.411708 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.411670 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xmbnx\" (UniqueName: \"kubernetes.io/projected/53881084-45b9-45db-bbe2-a26292e7913b-kube-api-access-xmbnx\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:47:54.411708 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.411703 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-tokenizer-tmp\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:47:54.411708 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.411716 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53881084-45b9-45db-bbe2-a26292e7913b-tls-certs\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:47:54.411936 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.411729 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53881084-45b9-45db-bbe2-a26292e7913b-kserve-provision-location\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:47:54.551755 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.551656 2576 generic.go:358] "Generic (PLEG): container finished" podID="53881084-45b9-45db-bbe2-a26292e7913b" containerID="7feaea3d7c4954ce163b5141f9cac5b46cc3c1e3cbb4b48db8e3ca4307ef5572" exitCode=0 Apr 17 16:47:54.551755 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.551712 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" event={"ID":"53881084-45b9-45db-bbe2-a26292e7913b","Type":"ContainerDied","Data":"7feaea3d7c4954ce163b5141f9cac5b46cc3c1e3cbb4b48db8e3ca4307ef5572"} Apr 17 16:47:54.551755 
ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.551746 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" event={"ID":"53881084-45b9-45db-bbe2-a26292e7913b","Type":"ContainerDied","Data":"ce7763f6a0b321540609f46eeb8ec88492c1b59c766228874868fcefac490dca"} Apr 17 16:47:54.551755 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.551753 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb" Apr 17 16:47:54.552049 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.551767 2576 scope.go:117] "RemoveContainer" containerID="7feaea3d7c4954ce163b5141f9cac5b46cc3c1e3cbb4b48db8e3ca4307ef5572" Apr 17 16:47:54.562107 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.562090 2576 scope.go:117] "RemoveContainer" containerID="863e5267077d839df4b2b24adc3e329049ad0e6ad9fd6001b22054b9e6ae1642" Apr 17 16:47:54.571590 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.571576 2576 scope.go:117] "RemoveContainer" containerID="575a4f33483a8cd40b12a9615b8947bea4ffeb247835c88e82e335c983bfe111" Apr 17 16:47:54.574863 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.574839 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb"] Apr 17 16:47:54.576429 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.576406 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-zkvlb"] Apr 17 16:47:54.580196 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.580177 2576 scope.go:117] "RemoveContainer" containerID="7feaea3d7c4954ce163b5141f9cac5b46cc3c1e3cbb4b48db8e3ca4307ef5572" Apr 17 16:47:54.580444 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:47:54.580426 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"7feaea3d7c4954ce163b5141f9cac5b46cc3c1e3cbb4b48db8e3ca4307ef5572\": container with ID starting with 7feaea3d7c4954ce163b5141f9cac5b46cc3c1e3cbb4b48db8e3ca4307ef5572 not found: ID does not exist" containerID="7feaea3d7c4954ce163b5141f9cac5b46cc3c1e3cbb4b48db8e3ca4307ef5572" Apr 17 16:47:54.580553 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.580454 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7feaea3d7c4954ce163b5141f9cac5b46cc3c1e3cbb4b48db8e3ca4307ef5572"} err="failed to get container status \"7feaea3d7c4954ce163b5141f9cac5b46cc3c1e3cbb4b48db8e3ca4307ef5572\": rpc error: code = NotFound desc = could not find container \"7feaea3d7c4954ce163b5141f9cac5b46cc3c1e3cbb4b48db8e3ca4307ef5572\": container with ID starting with 7feaea3d7c4954ce163b5141f9cac5b46cc3c1e3cbb4b48db8e3ca4307ef5572 not found: ID does not exist" Apr 17 16:47:54.580553 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.580473 2576 scope.go:117] "RemoveContainer" containerID="863e5267077d839df4b2b24adc3e329049ad0e6ad9fd6001b22054b9e6ae1642" Apr 17 16:47:54.580720 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:47:54.580702 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"863e5267077d839df4b2b24adc3e329049ad0e6ad9fd6001b22054b9e6ae1642\": container with ID starting with 863e5267077d839df4b2b24adc3e329049ad0e6ad9fd6001b22054b9e6ae1642 not found: ID does not exist" containerID="863e5267077d839df4b2b24adc3e329049ad0e6ad9fd6001b22054b9e6ae1642" Apr 17 16:47:54.580794 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.580722 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"863e5267077d839df4b2b24adc3e329049ad0e6ad9fd6001b22054b9e6ae1642"} err="failed to get container status \"863e5267077d839df4b2b24adc3e329049ad0e6ad9fd6001b22054b9e6ae1642\": rpc error: code = NotFound desc 
= could not find container \"863e5267077d839df4b2b24adc3e329049ad0e6ad9fd6001b22054b9e6ae1642\": container with ID starting with 863e5267077d839df4b2b24adc3e329049ad0e6ad9fd6001b22054b9e6ae1642 not found: ID does not exist" Apr 17 16:47:54.580794 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.580735 2576 scope.go:117] "RemoveContainer" containerID="575a4f33483a8cd40b12a9615b8947bea4ffeb247835c88e82e335c983bfe111" Apr 17 16:47:54.580962 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:47:54.580947 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"575a4f33483a8cd40b12a9615b8947bea4ffeb247835c88e82e335c983bfe111\": container with ID starting with 575a4f33483a8cd40b12a9615b8947bea4ffeb247835c88e82e335c983bfe111 not found: ID does not exist" containerID="575a4f33483a8cd40b12a9615b8947bea4ffeb247835c88e82e335c983bfe111" Apr 17 16:47:54.581017 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:54.580966 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"575a4f33483a8cd40b12a9615b8947bea4ffeb247835c88e82e335c983bfe111"} err="failed to get container status \"575a4f33483a8cd40b12a9615b8947bea4ffeb247835c88e82e335c983bfe111\": rpc error: code = NotFound desc = could not find container \"575a4f33483a8cd40b12a9615b8947bea4ffeb247835c88e82e335c983bfe111\": container with ID starting with 575a4f33483a8cd40b12a9615b8947bea4ffeb247835c88e82e335c983bfe111 not found: ID does not exist" Apr 17 16:47:55.143559 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:47:55.143524 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53881084-45b9-45db-bbe2-a26292e7913b" path="/var/lib/kubelet/pods/53881084-45b9-45db-bbe2-a26292e7913b/volumes" Apr 17 16:48:00.969189 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:00.969138 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" 
podUID="73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 17 16:48:08.930702 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:08.930646 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7"] Apr 17 16:48:08.931286 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:08.931241 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53881084-45b9-45db-bbe2-a26292e7913b" containerName="tokenizer" Apr 17 16:48:08.931286 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:08.931263 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="53881084-45b9-45db-bbe2-a26292e7913b" containerName="tokenizer" Apr 17 16:48:08.931286 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:08.931276 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53881084-45b9-45db-bbe2-a26292e7913b" containerName="main" Apr 17 16:48:08.931286 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:08.931285 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="53881084-45b9-45db-bbe2-a26292e7913b" containerName="main" Apr 17 16:48:08.931533 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:08.931318 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53881084-45b9-45db-bbe2-a26292e7913b" containerName="storage-initializer" Apr 17 16:48:08.931533 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:08.931328 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="53881084-45b9-45db-bbe2-a26292e7913b" containerName="storage-initializer" Apr 17 16:48:08.931533 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:08.931403 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="53881084-45b9-45db-bbe2-a26292e7913b" containerName="main" Apr 17 16:48:08.931533 ip-10-0-134-142 
kubenswrapper[2576]: I0417 16:48:08.931414 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="53881084-45b9-45db-bbe2-a26292e7913b" containerName="tokenizer" Apr 17 16:48:08.938655 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:08.938627 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:08.941828 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:08.941804 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-4pf7t\"" Apr 17 16:48:08.941993 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:08.941904 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 17 16:48:08.942898 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:08.942873 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:08.943079 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:08.943059 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8e49685e-7767-4ff7-b2a8-d28a70162388-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:08.943244 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:08.943226 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:08.943366 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:08.943351 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:08.943506 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:08.943478 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:08.943633 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:08.943618 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl9bf\" (UniqueName: \"kubernetes.io/projected/8e49685e-7767-4ff7-b2a8-d28a70162388-kube-api-access-hl9bf\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:08.945823 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:08.945802 2576 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7"] Apr 17 16:48:09.044254 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:09.044221 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:09.044254 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:09.044265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8e49685e-7767-4ff7-b2a8-d28a70162388-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:09.044542 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:09.044320 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:09.044542 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:09.044348 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:09.044542 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:09.044397 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:09.044542 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:09.044424 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hl9bf\" (UniqueName: \"kubernetes.io/projected/8e49685e-7767-4ff7-b2a8-d28a70162388-kube-api-access-hl9bf\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:09.044817 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:09.044793 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:09.044907 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:09.044840 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:09.044907 
ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:09.044874 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:09.045016 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:09.044909 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:09.047068 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:09.047044 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8e49685e-7767-4ff7-b2a8-d28a70162388-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:09.052969 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:09.052939 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl9bf\" (UniqueName: \"kubernetes.io/projected/8e49685e-7767-4ff7-b2a8-d28a70162388-kube-api-access-hl9bf\") pod \"stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:09.251667 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:09.251572 2576 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:09.390382 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:09.390344 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7"] Apr 17 16:48:09.395186 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:48:09.395155 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e49685e_7767_4ff7_b2a8_d28a70162388.slice/crio-64f2afb9d686839be23c905ac86a0f462f4f59f9ecab16b1a4465dcd37d4b2dd WatchSource:0}: Error finding container 64f2afb9d686839be23c905ac86a0f462f4f59f9ecab16b1a4465dcd37d4b2dd: Status 404 returned error can't find the container with id 64f2afb9d686839be23c905ac86a0f462f4f59f9ecab16b1a4465dcd37d4b2dd Apr 17 16:48:09.608306 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:09.608217 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" event={"ID":"8e49685e-7767-4ff7-b2a8-d28a70162388","Type":"ContainerStarted","Data":"b20504b7a5aac4c5025a2bbfcd8bac2de38fae67f0408a340f24c54f27e76d7d"} Apr 17 16:48:09.608306 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:09.608258 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" event={"ID":"8e49685e-7767-4ff7-b2a8-d28a70162388","Type":"ContainerStarted","Data":"64f2afb9d686839be23c905ac86a0f462f4f59f9ecab16b1a4465dcd37d4b2dd"} Apr 17 16:48:10.613377 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:10.613337 2576 generic.go:358] "Generic (PLEG): container finished" podID="8e49685e-7767-4ff7-b2a8-d28a70162388" containerID="b20504b7a5aac4c5025a2bbfcd8bac2de38fae67f0408a340f24c54f27e76d7d" exitCode=0 Apr 17 16:48:10.613771 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:10.613446 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" event={"ID":"8e49685e-7767-4ff7-b2a8-d28a70162388","Type":"ContainerDied","Data":"b20504b7a5aac4c5025a2bbfcd8bac2de38fae67f0408a340f24c54f27e76d7d"} Apr 17 16:48:10.969122 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:10.969077 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" podUID="73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 17 16:48:11.619898 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:11.619863 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" event={"ID":"8e49685e-7767-4ff7-b2a8-d28a70162388","Type":"ContainerStarted","Data":"ef82cde4353bc385450f2c63e601c46d430db54bd5db818b51cf848ffd386536"} Apr 17 16:48:11.619898 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:11.619901 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" event={"ID":"8e49685e-7767-4ff7-b2a8-d28a70162388","Type":"ContainerStarted","Data":"e702976a723888ec687875f6889081de96bf16dd5085694d43b214c559bbc616"} Apr 17 16:48:11.620327 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:11.620030 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:11.643020 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:11.642954 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" podStartSLOduration=3.642931622 podStartE2EDuration="3.642931622s" 
podCreationTimestamp="2026-04-17 16:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:48:11.639996732 +0000 UTC m=+1011.105179592" watchObservedRunningTime="2026-04-17 16:48:11.642931622 +0000 UTC m=+1011.108114484" Apr 17 16:48:19.252643 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:19.252604 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:19.253121 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:19.252656 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:19.255465 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:19.255439 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:19.649740 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:19.649714 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:20.968449 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:20.968401 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" podUID="73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 17 16:48:30.968701 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:30.968652 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" podUID="73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" containerName="main" probeResult="failure" 
output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 17 16:48:40.653826 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:40.653793 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:48:40.968522 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:40.968392 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" podUID="73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 17 16:48:50.968677 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:48:50.968637 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" podUID="73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 17 16:49:00.968961 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:00.968911 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" podUID="73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 17 16:49:10.978683 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:10.978605 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:49:10.986680 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:10.986649 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:49:16.722820 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:16.722775 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf"] Apr 17 16:49:16.723300 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:16.723034 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" podUID="73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" containerName="main" containerID="cri-o://af8435686b14fe7616a97c7de72081e4242ff90b7fc54935b3e218008bc9d299" gracePeriod=30 Apr 17 16:49:46.964025 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:46.964002 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-fbd7f9949-z6pkf_73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2/main/0.log" Apr 17 16:49:46.964362 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:46.964343 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:49:46.979328 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:46.979261 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-fbd7f9949-z6pkf_73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2/main/0.log" Apr 17 16:49:46.979625 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:46.979601 2576 generic.go:358] "Generic (PLEG): container finished" podID="73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" containerID="af8435686b14fe7616a97c7de72081e4242ff90b7fc54935b3e218008bc9d299" exitCode=137 Apr 17 16:49:46.979688 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:46.979676 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" Apr 17 16:49:46.979725 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:46.979680 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" event={"ID":"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2","Type":"ContainerDied","Data":"af8435686b14fe7616a97c7de72081e4242ff90b7fc54935b3e218008bc9d299"} Apr 17 16:49:46.979725 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:46.979718 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf" event={"ID":"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2","Type":"ContainerDied","Data":"488d427dbda14416d07cf5ec943d42510c4486539ca17f199fe421c8a7b86ce3"} Apr 17 16:49:46.979787 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:46.979733 2576 scope.go:117] "RemoveContainer" containerID="af8435686b14fe7616a97c7de72081e4242ff90b7fc54935b3e218008bc9d299" Apr 17 16:49:46.999820 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:46.999795 2576 scope.go:117] "RemoveContainer" containerID="95ebb5341089f980a8eb8ce657dee3fa83fc3cffe75689c625c3b762ab7f9349" Apr 17 16:49:47.053350 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.053318 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dps9s\" (UniqueName: \"kubernetes.io/projected/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-kube-api-access-dps9s\") pod \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " Apr 17 16:49:47.053555 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.053387 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-kserve-provision-location\") pod \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " 
Apr 17 16:49:47.053555 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.053415 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-model-cache\") pod \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " Apr 17 16:49:47.053555 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.053445 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-home\") pod \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " Apr 17 16:49:47.053555 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.053536 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-dshm\") pod \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " Apr 17 16:49:47.053848 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.053562 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-tls-certs\") pod \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\" (UID: \"73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2\") " Apr 17 16:49:47.053848 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.053743 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-model-cache" (OuterVolumeSpecName: "model-cache") pod "73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" (UID: "73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:49:47.053953 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.053917 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-home" (OuterVolumeSpecName: "home") pod "73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" (UID: "73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:49:47.054006 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.053932 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-model-cache\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:49:47.055557 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.055532 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-kube-api-access-dps9s" (OuterVolumeSpecName: "kube-api-access-dps9s") pod "73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" (UID: "73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2"). InnerVolumeSpecName "kube-api-access-dps9s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:49:47.056103 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.056079 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-dshm" (OuterVolumeSpecName: "dshm") pod "73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" (UID: "73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:49:47.056178 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.056113 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" (UID: "73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:49:47.063845 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.063828 2576 scope.go:117] "RemoveContainer" containerID="af8435686b14fe7616a97c7de72081e4242ff90b7fc54935b3e218008bc9d299" Apr 17 16:49:47.064202 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:49:47.064164 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af8435686b14fe7616a97c7de72081e4242ff90b7fc54935b3e218008bc9d299\": container with ID starting with af8435686b14fe7616a97c7de72081e4242ff90b7fc54935b3e218008bc9d299 not found: ID does not exist" containerID="af8435686b14fe7616a97c7de72081e4242ff90b7fc54935b3e218008bc9d299" Apr 17 16:49:47.064278 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.064207 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af8435686b14fe7616a97c7de72081e4242ff90b7fc54935b3e218008bc9d299"} err="failed to get container status \"af8435686b14fe7616a97c7de72081e4242ff90b7fc54935b3e218008bc9d299\": rpc error: code = NotFound desc = could not find container \"af8435686b14fe7616a97c7de72081e4242ff90b7fc54935b3e218008bc9d299\": container with ID starting with af8435686b14fe7616a97c7de72081e4242ff90b7fc54935b3e218008bc9d299 not found: ID does not exist" Apr 17 16:49:47.064278 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.064235 2576 scope.go:117] "RemoveContainer" containerID="95ebb5341089f980a8eb8ce657dee3fa83fc3cffe75689c625c3b762ab7f9349" Apr 17 
16:49:47.064564 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:49:47.064537 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95ebb5341089f980a8eb8ce657dee3fa83fc3cffe75689c625c3b762ab7f9349\": container with ID starting with 95ebb5341089f980a8eb8ce657dee3fa83fc3cffe75689c625c3b762ab7f9349 not found: ID does not exist" containerID="95ebb5341089f980a8eb8ce657dee3fa83fc3cffe75689c625c3b762ab7f9349" Apr 17 16:49:47.064642 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.064581 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95ebb5341089f980a8eb8ce657dee3fa83fc3cffe75689c625c3b762ab7f9349"} err="failed to get container status \"95ebb5341089f980a8eb8ce657dee3fa83fc3cffe75689c625c3b762ab7f9349\": rpc error: code = NotFound desc = could not find container \"95ebb5341089f980a8eb8ce657dee3fa83fc3cffe75689c625c3b762ab7f9349\": container with ID starting with 95ebb5341089f980a8eb8ce657dee3fa83fc3cffe75689c625c3b762ab7f9349 not found: ID does not exist" Apr 17 16:49:47.113165 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.113100 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" (UID: "73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:49:47.155685 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.155652 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-dshm\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:49:47.155828 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.155726 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-tls-certs\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:49:47.155828 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.155746 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dps9s\" (UniqueName: \"kubernetes.io/projected/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-kube-api-access-dps9s\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:49:47.155828 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.155758 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-kserve-provision-location\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:49:47.155828 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.155768 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2-home\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:49:47.299012 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.298931 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf"] Apr 17 16:49:47.303370 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:47.303335 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-fbd7f9949-z6pkf"] Apr 17 16:49:49.142359 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:49:49.142326 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" path="/var/lib/kubelet/pods/73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2/volumes" Apr 17 16:50:06.462540 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:06.462484 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7"] Apr 17 16:50:06.462971 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:06.462827 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" podUID="8e49685e-7767-4ff7-b2a8-d28a70162388" containerName="main" containerID="cri-o://e702976a723888ec687875f6889081de96bf16dd5085694d43b214c559bbc616" gracePeriod=30 Apr 17 16:50:06.462971 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:06.462890 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" podUID="8e49685e-7767-4ff7-b2a8-d28a70162388" containerName="tokenizer" containerID="cri-o://ef82cde4353bc385450f2c63e601c46d430db54bd5db818b51cf848ffd386536" gracePeriod=30 Apr 17 16:50:07.062886 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.062850 2576 generic.go:358] "Generic (PLEG): container finished" podID="8e49685e-7767-4ff7-b2a8-d28a70162388" containerID="e702976a723888ec687875f6889081de96bf16dd5085694d43b214c559bbc616" exitCode=0 Apr 17 16:50:07.063081 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.062913 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" 
event={"ID":"8e49685e-7767-4ff7-b2a8-d28a70162388","Type":"ContainerDied","Data":"e702976a723888ec687875f6889081de96bf16dd5085694d43b214c559bbc616"} Apr 17 16:50:07.740978 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.740956 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:50:07.848720 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.848624 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-tokenizer-cache\") pod \"8e49685e-7767-4ff7-b2a8-d28a70162388\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " Apr 17 16:50:07.848720 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.848716 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-tokenizer-tmp\") pod \"8e49685e-7767-4ff7-b2a8-d28a70162388\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " Apr 17 16:50:07.848953 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.848764 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8e49685e-7767-4ff7-b2a8-d28a70162388-tls-certs\") pod \"8e49685e-7767-4ff7-b2a8-d28a70162388\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " Apr 17 16:50:07.848953 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.848792 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl9bf\" (UniqueName: \"kubernetes.io/projected/8e49685e-7767-4ff7-b2a8-d28a70162388-kube-api-access-hl9bf\") pod \"8e49685e-7767-4ff7-b2a8-d28a70162388\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " Apr 17 16:50:07.848953 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.848818 2576 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-tokenizer-uds\") pod \"8e49685e-7767-4ff7-b2a8-d28a70162388\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " Apr 17 16:50:07.848953 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.848865 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-kserve-provision-location\") pod \"8e49685e-7767-4ff7-b2a8-d28a70162388\" (UID: \"8e49685e-7767-4ff7-b2a8-d28a70162388\") " Apr 17 16:50:07.849117 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.848943 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "8e49685e-7767-4ff7-b2a8-d28a70162388" (UID: "8e49685e-7767-4ff7-b2a8-d28a70162388"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:50:07.849117 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.849056 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "8e49685e-7767-4ff7-b2a8-d28a70162388" (UID: "8e49685e-7767-4ff7-b2a8-d28a70162388"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:50:07.849199 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.849107 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "8e49685e-7767-4ff7-b2a8-d28a70162388" (UID: "8e49685e-7767-4ff7-b2a8-d28a70162388"). 
InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:50:07.849199 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.849186 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-tokenizer-uds\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:50:07.849274 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.849205 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-tokenizer-cache\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:50:07.849632 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.849610 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8e49685e-7767-4ff7-b2a8-d28a70162388" (UID: "8e49685e-7767-4ff7-b2a8-d28a70162388"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:50:07.850935 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.850915 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e49685e-7767-4ff7-b2a8-d28a70162388-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8e49685e-7767-4ff7-b2a8-d28a70162388" (UID: "8e49685e-7767-4ff7-b2a8-d28a70162388"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:50:07.850975 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.850936 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e49685e-7767-4ff7-b2a8-d28a70162388-kube-api-access-hl9bf" (OuterVolumeSpecName: "kube-api-access-hl9bf") pod "8e49685e-7767-4ff7-b2a8-d28a70162388" (UID: "8e49685e-7767-4ff7-b2a8-d28a70162388"). InnerVolumeSpecName "kube-api-access-hl9bf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:50:07.949907 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.949865 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-kserve-provision-location\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:50:07.949907 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.949899 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8e49685e-7767-4ff7-b2a8-d28a70162388-tokenizer-tmp\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:50:07.949907 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.949910 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8e49685e-7767-4ff7-b2a8-d28a70162388-tls-certs\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:50:07.949907 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:07.949918 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hl9bf\" (UniqueName: \"kubernetes.io/projected/8e49685e-7767-4ff7-b2a8-d28a70162388-kube-api-access-hl9bf\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:50:08.070590 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:08.070559 2576 generic.go:358] "Generic (PLEG): container finished" podID="8e49685e-7767-4ff7-b2a8-d28a70162388" 
containerID="ef82cde4353bc385450f2c63e601c46d430db54bd5db818b51cf848ffd386536" exitCode=0 Apr 17 16:50:08.070771 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:08.070632 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" event={"ID":"8e49685e-7767-4ff7-b2a8-d28a70162388","Type":"ContainerDied","Data":"ef82cde4353bc385450f2c63e601c46d430db54bd5db818b51cf848ffd386536"} Apr 17 16:50:08.070771 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:08.070640 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" Apr 17 16:50:08.070771 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:08.070658 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7" event={"ID":"8e49685e-7767-4ff7-b2a8-d28a70162388","Type":"ContainerDied","Data":"64f2afb9d686839be23c905ac86a0f462f4f59f9ecab16b1a4465dcd37d4b2dd"} Apr 17 16:50:08.070771 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:08.070674 2576 scope.go:117] "RemoveContainer" containerID="ef82cde4353bc385450f2c63e601c46d430db54bd5db818b51cf848ffd386536" Apr 17 16:50:08.079373 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:08.079354 2576 scope.go:117] "RemoveContainer" containerID="e702976a723888ec687875f6889081de96bf16dd5085694d43b214c559bbc616" Apr 17 16:50:08.086838 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:08.086816 2576 scope.go:117] "RemoveContainer" containerID="b20504b7a5aac4c5025a2bbfcd8bac2de38fae67f0408a340f24c54f27e76d7d" Apr 17 16:50:08.091264 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:08.091237 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7"] Apr 17 16:50:08.094309 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:08.094289 2576 kubelet.go:2547] 
"SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f8b84d599-ndlt7"] Apr 17 16:50:08.094851 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:08.094834 2576 scope.go:117] "RemoveContainer" containerID="ef82cde4353bc385450f2c63e601c46d430db54bd5db818b51cf848ffd386536" Apr 17 16:50:08.095119 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:50:08.095102 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef82cde4353bc385450f2c63e601c46d430db54bd5db818b51cf848ffd386536\": container with ID starting with ef82cde4353bc385450f2c63e601c46d430db54bd5db818b51cf848ffd386536 not found: ID does not exist" containerID="ef82cde4353bc385450f2c63e601c46d430db54bd5db818b51cf848ffd386536" Apr 17 16:50:08.095171 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:08.095126 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef82cde4353bc385450f2c63e601c46d430db54bd5db818b51cf848ffd386536"} err="failed to get container status \"ef82cde4353bc385450f2c63e601c46d430db54bd5db818b51cf848ffd386536\": rpc error: code = NotFound desc = could not find container \"ef82cde4353bc385450f2c63e601c46d430db54bd5db818b51cf848ffd386536\": container with ID starting with ef82cde4353bc385450f2c63e601c46d430db54bd5db818b51cf848ffd386536 not found: ID does not exist" Apr 17 16:50:08.095171 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:08.095147 2576 scope.go:117] "RemoveContainer" containerID="e702976a723888ec687875f6889081de96bf16dd5085694d43b214c559bbc616" Apr 17 16:50:08.095368 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:50:08.095350 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e702976a723888ec687875f6889081de96bf16dd5085694d43b214c559bbc616\": container with ID starting with e702976a723888ec687875f6889081de96bf16dd5085694d43b214c559bbc616 not found: ID 
does not exist" containerID="e702976a723888ec687875f6889081de96bf16dd5085694d43b214c559bbc616" Apr 17 16:50:08.095405 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:08.095376 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e702976a723888ec687875f6889081de96bf16dd5085694d43b214c559bbc616"} err="failed to get container status \"e702976a723888ec687875f6889081de96bf16dd5085694d43b214c559bbc616\": rpc error: code = NotFound desc = could not find container \"e702976a723888ec687875f6889081de96bf16dd5085694d43b214c559bbc616\": container with ID starting with e702976a723888ec687875f6889081de96bf16dd5085694d43b214c559bbc616 not found: ID does not exist" Apr 17 16:50:08.095405 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:08.095393 2576 scope.go:117] "RemoveContainer" containerID="b20504b7a5aac4c5025a2bbfcd8bac2de38fae67f0408a340f24c54f27e76d7d" Apr 17 16:50:08.095736 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:50:08.095719 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b20504b7a5aac4c5025a2bbfcd8bac2de38fae67f0408a340f24c54f27e76d7d\": container with ID starting with b20504b7a5aac4c5025a2bbfcd8bac2de38fae67f0408a340f24c54f27e76d7d not found: ID does not exist" containerID="b20504b7a5aac4c5025a2bbfcd8bac2de38fae67f0408a340f24c54f27e76d7d" Apr 17 16:50:08.095789 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:08.095739 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b20504b7a5aac4c5025a2bbfcd8bac2de38fae67f0408a340f24c54f27e76d7d"} err="failed to get container status \"b20504b7a5aac4c5025a2bbfcd8bac2de38fae67f0408a340f24c54f27e76d7d\": rpc error: code = NotFound desc = could not find container \"b20504b7a5aac4c5025a2bbfcd8bac2de38fae67f0408a340f24c54f27e76d7d\": container with ID starting with b20504b7a5aac4c5025a2bbfcd8bac2de38fae67f0408a340f24c54f27e76d7d not found: ID does not 
exist" Apr 17 16:50:09.143510 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:50:09.143475 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e49685e-7767-4ff7-b2a8-d28a70162388" path="/var/lib/kubelet/pods/8e49685e-7767-4ff7-b2a8-d28a70162388/volumes" Apr 17 16:51:16.573056 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.573020 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz"] Apr 17 16:51:16.573579 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.573561 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e49685e-7767-4ff7-b2a8-d28a70162388" containerName="storage-initializer" Apr 17 16:51:16.573636 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.573581 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e49685e-7767-4ff7-b2a8-d28a70162388" containerName="storage-initializer" Apr 17 16:51:16.573636 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.573602 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" containerName="storage-initializer" Apr 17 16:51:16.573636 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.573618 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" containerName="storage-initializer" Apr 17 16:51:16.573636 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.573629 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e49685e-7767-4ff7-b2a8-d28a70162388" containerName="main" Apr 17 16:51:16.573636 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.573634 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e49685e-7767-4ff7-b2a8-d28a70162388" containerName="main" Apr 17 16:51:16.573783 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.573642 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" containerName="main" Apr 17 16:51:16.573783 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.573648 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" containerName="main" Apr 17 16:51:16.573783 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.573656 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e49685e-7767-4ff7-b2a8-d28a70162388" containerName="tokenizer" Apr 17 16:51:16.573783 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.573661 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e49685e-7767-4ff7-b2a8-d28a70162388" containerName="tokenizer" Apr 17 16:51:16.573783 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.573716 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="73a0e96b-1e5f-4541-9c54-7ce7e6f62ad2" containerName="main" Apr 17 16:51:16.573783 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.573724 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e49685e-7767-4ff7-b2a8-d28a70162388" containerName="tokenizer" Apr 17 16:51:16.573783 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.573732 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e49685e-7767-4ff7-b2a8-d28a70162388" containerName="main" Apr 17 16:51:16.576900 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.576879 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:16.580697 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.580675 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\"" Apr 17 16:51:16.580839 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.580677 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ld29h\"" Apr 17 16:51:16.588802 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.588769 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz"] Apr 17 16:51:16.673156 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.673119 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:16.673337 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.673182 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6ns8\" (UniqueName: \"kubernetes.io/projected/23d2289e-b814-4824-89bf-48e716ed2e90-kube-api-access-x6ns8\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:16.673337 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.673279 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:16.673337 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.673328 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:16.673462 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.673347 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/23d2289e-b814-4824-89bf-48e716ed2e90-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:16.673462 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.673418 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:16.773923 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.773887 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" 
(UniqueName: \"kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:16.773923 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.773934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6ns8\" (UniqueName: \"kubernetes.io/projected/23d2289e-b814-4824-89bf-48e716ed2e90-kube-api-access-x6ns8\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:16.774152 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.773987 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:16.774152 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.774022 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:16.774249 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.774232 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/23d2289e-b814-4824-89bf-48e716ed2e90-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:16.774308 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.774292 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:16.774376 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.774331 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:16.774446 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.774387 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:16.774446 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.774421 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-model-cache\") pod 
\"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:16.776482 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.776454 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:16.776706 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.776691 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/23d2289e-b814-4824-89bf-48e716ed2e90-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:16.781658 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.781636 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6ns8\" (UniqueName: \"kubernetes.io/projected/23d2289e-b814-4824-89bf-48e716ed2e90-kube-api-access-x6ns8\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:16.891961 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:16.891857 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:17.052031 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:17.032722 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz"] Apr 17 16:51:17.054905 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:17.053456 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:51:17.308054 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:17.308019 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" event={"ID":"23d2289e-b814-4824-89bf-48e716ed2e90","Type":"ContainerStarted","Data":"b27687c96fb184750e8ce3a16031eff15709675b9ca4b4c8809e9d002507a62b"} Apr 17 16:51:17.308054 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:17.308055 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" event={"ID":"23d2289e-b814-4824-89bf-48e716ed2e90","Type":"ContainerStarted","Data":"e9dfd442d7f70b5bbdd32d44a03ef8270edb70134ba4dc79d6620b49436e9c2a"} Apr 17 16:51:21.154013 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:21.153981 2576 scope.go:117] "RemoveContainer" containerID="64d7919752c48dcd335e1abae48269fffdf43ce644efb4f5fb4e20cb7478d2b5" Apr 17 16:51:21.183225 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:21.183207 2576 scope.go:117] "RemoveContainer" containerID="603a2d861dafeddac166196bde88057e21c364e7b5f6a06a5428eb746f8b2b72" Apr 17 16:51:21.190097 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:21.190079 2576 scope.go:117] "RemoveContainer" containerID="ff57497df483f4fbf00262c71495dd8ce808887df84021cf18039ef9cbf4741b" Apr 17 16:51:21.322225 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:21.322189 2576 generic.go:358] "Generic (PLEG): 
container finished" podID="23d2289e-b814-4824-89bf-48e716ed2e90" containerID="b27687c96fb184750e8ce3a16031eff15709675b9ca4b4c8809e9d002507a62b" exitCode=0 Apr 17 16:51:21.322382 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:21.322263 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" event={"ID":"23d2289e-b814-4824-89bf-48e716ed2e90","Type":"ContainerDied","Data":"b27687c96fb184750e8ce3a16031eff15709675b9ca4b4c8809e9d002507a62b"} Apr 17 16:51:22.251637 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:22.251611 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/ovn-acl-logging/0.log" Apr 17 16:51:22.253171 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:22.253146 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/ovn-acl-logging/0.log" Apr 17 16:51:22.327752 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:22.327716 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" event={"ID":"23d2289e-b814-4824-89bf-48e716ed2e90","Type":"ContainerStarted","Data":"33df8cf5bbcaca697261effd527f2f8bf173990af82c539e04a6bc47f57cbbeb"} Apr 17 16:51:22.348706 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:22.348640 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" podStartSLOduration=6.348623654 podStartE2EDuration="6.348623654s" podCreationTimestamp="2026-04-17 16:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:51:22.34622797 +0000 UTC m=+1201.811410830" watchObservedRunningTime="2026-04-17 
16:51:22.348623654 +0000 UTC m=+1201.813806515" Apr 17 16:51:26.892918 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:26.892881 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:26.892918 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:26.892928 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:51:26.894402 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:26.894365 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" podUID="23d2289e-b814-4824-89bf-48e716ed2e90" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 17 16:51:36.893022 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:36.892975 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" podUID="23d2289e-b814-4824-89bf-48e716ed2e90" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 17 16:51:46.893146 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:46.893102 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" podUID="23d2289e-b814-4824-89bf-48e716ed2e90" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 17 16:51:56.187551 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.187491 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z"] Apr 17 16:51:56.191390 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.191365 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:51:56.194043 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.194022 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 17 16:51:56.194151 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.194047 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-dv5pw\"" Apr 17 16:51:56.208567 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.208533 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z"] Apr 17 16:51:56.351922 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.351887 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjjmr\" (UniqueName: \"kubernetes.io/projected/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-kube-api-access-qjjmr\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:51:56.351922 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.351923 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z\" (UID: 
\"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:51:56.352123 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.351963 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:51:56.352123 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.352048 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:51:56.352123 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.352112 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:51:56.352222 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.352142 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z\" (UID: 
\"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:51:56.453312 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.453225 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:51:56.453312 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.453272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:51:56.453312 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.453301 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:51:56.453640 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.453319 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:51:56.453640 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.453384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjjmr\" (UniqueName: \"kubernetes.io/projected/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-kube-api-access-qjjmr\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:51:56.453640 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.453413 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:51:56.453806 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.453771 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:51:56.453874 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.453814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:51:56.453928 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.453883 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:51:56.455685 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.455659 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:51:56.455908 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.455891 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:51:56.464050 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.464020 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjjmr\" (UniqueName: \"kubernetes.io/projected/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-kube-api-access-qjjmr\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:51:56.501204 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.501165 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:51:56.655101 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.655059 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z"] Apr 17 16:51:56.892345 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:56.892253 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" podUID="23d2289e-b814-4824-89bf-48e716ed2e90" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 17 16:51:57.454091 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:57.454055 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" event={"ID":"d7367c63-fd0d-4d8f-8be7-5e3da99d702e","Type":"ContainerStarted","Data":"f343060bc7aa78dda643966e3bc649dc938e887687eff829e41132e3f2e724a9"} Apr 17 16:51:58.458513 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:58.458473 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" event={"ID":"d7367c63-fd0d-4d8f-8be7-5e3da99d702e","Type":"ContainerStarted","Data":"902faf483dca71f775ff0d0a50713aec77b0d2549b93f4473a7fe73f2670143d"} Apr 17 16:51:59.464948 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:59.464907 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" 
event={"ID":"d7367c63-fd0d-4d8f-8be7-5e3da99d702e","Type":"ContainerStarted","Data":"b42c92b79dadb9ad87cd78c376a47510b4340d6363bb73d3c4265410d2aa7065"} Apr 17 16:51:59.465402 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:51:59.465217 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:52:02.478701 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:52:02.478663 2576 generic.go:358] "Generic (PLEG): container finished" podID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerID="b42c92b79dadb9ad87cd78c376a47510b4340d6363bb73d3c4265410d2aa7065" exitCode=0 Apr 17 16:52:02.479176 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:52:02.478741 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" event={"ID":"d7367c63-fd0d-4d8f-8be7-5e3da99d702e","Type":"ContainerDied","Data":"b42c92b79dadb9ad87cd78c376a47510b4340d6363bb73d3c4265410d2aa7065"} Apr 17 16:52:03.484362 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:52:03.484326 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" event={"ID":"d7367c63-fd0d-4d8f-8be7-5e3da99d702e","Type":"ContainerStarted","Data":"e4d937acda83ff317a71e69fa6977d49d7f078c33ad98e8782afe381ec631e51"} Apr 17 16:52:03.512313 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:52:03.512261 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" podStartSLOduration=6.690850041 podStartE2EDuration="7.512246129s" podCreationTimestamp="2026-04-17 16:51:56 +0000 UTC" firstStartedPulling="2026-04-17 16:51:56.661643482 +0000 UTC m=+1236.126826333" lastFinishedPulling="2026-04-17 16:51:57.483039574 +0000 UTC m=+1236.948222421" observedRunningTime="2026-04-17 
16:52:03.509002154 +0000 UTC m=+1242.974185015" watchObservedRunningTime="2026-04-17 16:52:03.512246129 +0000 UTC m=+1242.977429027" Apr 17 16:52:06.502225 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:52:06.502138 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:52:06.502225 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:52:06.502185 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:52:06.503675 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:52:06.503645 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused" Apr 17 16:52:06.893225 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:52:06.893128 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" podUID="23d2289e-b814-4824-89bf-48e716ed2e90" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 17 16:52:16.502151 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:52:16.502090 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused" Apr 17 16:52:16.522777 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:52:16.522747 2576 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" Apr 17 16:52:16.893023 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:52:16.892922 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" podUID="23d2289e-b814-4824-89bf-48e716ed2e90" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 17 16:52:26.501660 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:52:26.501604 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused" Apr 17 16:52:26.893487 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:52:26.893386 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" podUID="23d2289e-b814-4824-89bf-48e716ed2e90" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 17 16:52:36.502598 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:52:36.502533 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused" Apr 17 16:52:36.893404 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:52:36.893298 2576 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" podUID="23d2289e-b814-4824-89bf-48e716ed2e90" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 17 16:52:46.502613 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:52:46.502559 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused" Apr 17 16:52:46.892613 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:52:46.892508 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" podUID="23d2289e-b814-4824-89bf-48e716ed2e90" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 17 16:52:56.502390 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:52:56.502334 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused" Apr 17 16:52:56.902891 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:52:56.902860 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:52:56.910754 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:52:56.910729 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:53:05.115388 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:05.115351 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz"] Apr 17 16:53:05.115876 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:05.115752 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" podUID="23d2289e-b814-4824-89bf-48e716ed2e90" containerName="main" containerID="cri-o://33df8cf5bbcaca697261effd527f2f8bf173990af82c539e04a6bc47f57cbbeb" gracePeriod=30 Apr 17 16:53:06.502131 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:06.502077 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused" Apr 17 16:53:16.502021 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:16.501970 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused" Apr 17 16:53:22.468146 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.468073 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 17 16:53:22.473311 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.473284 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:53:22.475874 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.475850 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 17 16:53:22.476721 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.476704 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-xmkgh\"" Apr 17 16:53:22.480927 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.480899 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 17 16:53:22.505241 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.505208 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:53:22.505398 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.505361 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:53:22.505398 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.505392 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:53:22.505470 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.505410 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:53:22.505470 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.505427 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:53:22.505470 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.505465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6xjr\" (UniqueName: \"kubernetes.io/projected/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-kube-api-access-n6xjr\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:53:22.606921 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.606886 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:53:22.607092 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.606931 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:53:22.607092 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.606955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:53:22.607092 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.606971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:53:22.607092 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.606993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6xjr\" (UniqueName: \"kubernetes.io/projected/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-kube-api-access-n6xjr\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: 
\"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:53:22.607092 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.607028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:53:22.607341 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.607315 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:53:22.607399 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.607362 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:53:22.607399 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.607385 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:53:22.609275 
ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.609249 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:53:22.609473 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.609457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:53:22.618337 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.618315 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6xjr\" (UniqueName: \"kubernetes.io/projected/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-kube-api-access-n6xjr\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:53:22.786390 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.786320 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:53:22.927172 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:22.927131 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 17 16:53:22.930259 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:53:22.930226 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e49fc76_4d4d_4c73_8c5f_16b5a169fefc.slice/crio-d25427a9aed32510dd642604e9023eb4c4ccbc687d0a99aa282896dc24b13a1c WatchSource:0}: Error finding container d25427a9aed32510dd642604e9023eb4c4ccbc687d0a99aa282896dc24b13a1c: Status 404 returned error can't find the container with id d25427a9aed32510dd642604e9023eb4c4ccbc687d0a99aa282896dc24b13a1c Apr 17 16:53:23.789689 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:23.789651 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc","Type":"ContainerStarted","Data":"47478a3f62a6fa5d05229e10a37759f2d9f46fb94db9eb18f314c1a0f8105eb6"} Apr 17 16:53:23.789689 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:23.789695 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc","Type":"ContainerStarted","Data":"d25427a9aed32510dd642604e9023eb4c4ccbc687d0a99aa282896dc24b13a1c"} Apr 17 16:53:26.502980 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:26.502573 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 
10.134.0.40:8001: connect: connection refused" Apr 17 16:53:27.807222 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:27.807177 2576 generic.go:358] "Generic (PLEG): container finished" podID="0e49fc76-4d4d-4c73-8c5f-16b5a169fefc" containerID="47478a3f62a6fa5d05229e10a37759f2d9f46fb94db9eb18f314c1a0f8105eb6" exitCode=0 Apr 17 16:53:27.807222 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:27.807228 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc","Type":"ContainerDied","Data":"47478a3f62a6fa5d05229e10a37759f2d9f46fb94db9eb18f314c1a0f8105eb6"} Apr 17 16:53:28.817278 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:28.817237 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc","Type":"ContainerStarted","Data":"e23ca0b2de9a7d2cad9d002a72bcc1ebf5c628f011be656ea70d15bc19e178c8"} Apr 17 16:53:28.837816 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:28.837757 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=6.8377401330000005 podStartE2EDuration="6.837740133s" podCreationTimestamp="2026-04-17 16:53:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:53:28.834697339 +0000 UTC m=+1328.299880227" watchObservedRunningTime="2026-04-17 16:53:28.837740133 +0000 UTC m=+1328.302922994" Apr 17 16:53:35.460604 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.460571 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz_23d2289e-b814-4824-89bf-48e716ed2e90/main/0.log" Apr 17 16:53:35.461034 
ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.461013 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" Apr 17 16:53:35.541121 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.541089 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-kserve-provision-location\") pod \"23d2289e-b814-4824-89bf-48e716ed2e90\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " Apr 17 16:53:35.541121 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.541129 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-dshm\") pod \"23d2289e-b814-4824-89bf-48e716ed2e90\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " Apr 17 16:53:35.541384 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.541295 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-home\") pod \"23d2289e-b814-4824-89bf-48e716ed2e90\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " Apr 17 16:53:35.541384 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.541323 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6ns8\" (UniqueName: \"kubernetes.io/projected/23d2289e-b814-4824-89bf-48e716ed2e90-kube-api-access-x6ns8\") pod \"23d2289e-b814-4824-89bf-48e716ed2e90\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " Apr 17 16:53:35.541384 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.541352 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-model-cache\") pod 
\"23d2289e-b814-4824-89bf-48e716ed2e90\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " Apr 17 16:53:35.541384 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.541378 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/23d2289e-b814-4824-89bf-48e716ed2e90-tls-certs\") pod \"23d2289e-b814-4824-89bf-48e716ed2e90\" (UID: \"23d2289e-b814-4824-89bf-48e716ed2e90\") " Apr 17 16:53:35.541878 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.541784 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-home" (OuterVolumeSpecName: "home") pod "23d2289e-b814-4824-89bf-48e716ed2e90" (UID: "23d2289e-b814-4824-89bf-48e716ed2e90"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:53:35.541878 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.541812 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-model-cache" (OuterVolumeSpecName: "model-cache") pod "23d2289e-b814-4824-89bf-48e716ed2e90" (UID: "23d2289e-b814-4824-89bf-48e716ed2e90"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:53:35.544020 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.543990 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-dshm" (OuterVolumeSpecName: "dshm") pod "23d2289e-b814-4824-89bf-48e716ed2e90" (UID: "23d2289e-b814-4824-89bf-48e716ed2e90"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:53:35.544128 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.544105 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23d2289e-b814-4824-89bf-48e716ed2e90-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "23d2289e-b814-4824-89bf-48e716ed2e90" (UID: "23d2289e-b814-4824-89bf-48e716ed2e90"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:53:35.544169 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.544122 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23d2289e-b814-4824-89bf-48e716ed2e90-kube-api-access-x6ns8" (OuterVolumeSpecName: "kube-api-access-x6ns8") pod "23d2289e-b814-4824-89bf-48e716ed2e90" (UID: "23d2289e-b814-4824-89bf-48e716ed2e90"). InnerVolumeSpecName "kube-api-access-x6ns8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:53:35.616483 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.616431 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "23d2289e-b814-4824-89bf-48e716ed2e90" (UID: "23d2289e-b814-4824-89bf-48e716ed2e90"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:53:35.642756 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.642717 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-home\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:53:35.642756 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.642750 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x6ns8\" (UniqueName: \"kubernetes.io/projected/23d2289e-b814-4824-89bf-48e716ed2e90-kube-api-access-x6ns8\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:53:35.642756 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.642765 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-model-cache\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:53:35.642989 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.642778 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/23d2289e-b814-4824-89bf-48e716ed2e90-tls-certs\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:53:35.642989 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.642791 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-kserve-provision-location\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:53:35.642989 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.642802 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/23d2289e-b814-4824-89bf-48e716ed2e90-dshm\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:53:35.846918 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.846832 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz_23d2289e-b814-4824-89bf-48e716ed2e90/main/0.log"
Apr 17 16:53:35.847267 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.847232 2576 generic.go:358] "Generic (PLEG): container finished" podID="23d2289e-b814-4824-89bf-48e716ed2e90" containerID="33df8cf5bbcaca697261effd527f2f8bf173990af82c539e04a6bc47f57cbbeb" exitCode=137
Apr 17 16:53:35.847406 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.847306 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" event={"ID":"23d2289e-b814-4824-89bf-48e716ed2e90","Type":"ContainerDied","Data":"33df8cf5bbcaca697261effd527f2f8bf173990af82c539e04a6bc47f57cbbeb"}
Apr 17 16:53:35.847406 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.847322 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz"
Apr 17 16:53:35.847406 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.847338 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz" event={"ID":"23d2289e-b814-4824-89bf-48e716ed2e90","Type":"ContainerDied","Data":"e9dfd442d7f70b5bbdd32d44a03ef8270edb70134ba4dc79d6620b49436e9c2a"}
Apr 17 16:53:35.847406 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.847362 2576 scope.go:117] "RemoveContainer" containerID="33df8cf5bbcaca697261effd527f2f8bf173990af82c539e04a6bc47f57cbbeb"
Apr 17 16:53:35.868751 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.868722 2576 scope.go:117] "RemoveContainer" containerID="b27687c96fb184750e8ce3a16031eff15709675b9ca4b4c8809e9d002507a62b"
Apr 17 16:53:35.875246 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.875211 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz"]
Apr 17 16:53:35.879546 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.879472 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-66458ffd79bfzlz"]
Apr 17 16:53:35.884397 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.884378 2576 scope.go:117] "RemoveContainer" containerID="33df8cf5bbcaca697261effd527f2f8bf173990af82c539e04a6bc47f57cbbeb"
Apr 17 16:53:35.884831 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:53:35.884804 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33df8cf5bbcaca697261effd527f2f8bf173990af82c539e04a6bc47f57cbbeb\": container with ID starting with 33df8cf5bbcaca697261effd527f2f8bf173990af82c539e04a6bc47f57cbbeb not found: ID does not exist" containerID="33df8cf5bbcaca697261effd527f2f8bf173990af82c539e04a6bc47f57cbbeb"
Apr 17 16:53:35.884913 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.884847 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33df8cf5bbcaca697261effd527f2f8bf173990af82c539e04a6bc47f57cbbeb"} err="failed to get container status \"33df8cf5bbcaca697261effd527f2f8bf173990af82c539e04a6bc47f57cbbeb\": rpc error: code = NotFound desc = could not find container \"33df8cf5bbcaca697261effd527f2f8bf173990af82c539e04a6bc47f57cbbeb\": container with ID starting with 33df8cf5bbcaca697261effd527f2f8bf173990af82c539e04a6bc47f57cbbeb not found: ID does not exist"
Apr 17 16:53:35.884913 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.884891 2576 scope.go:117] "RemoveContainer" containerID="b27687c96fb184750e8ce3a16031eff15709675b9ca4b4c8809e9d002507a62b"
Apr 17 16:53:35.885190 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:53:35.885170 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b27687c96fb184750e8ce3a16031eff15709675b9ca4b4c8809e9d002507a62b\": container with ID starting with b27687c96fb184750e8ce3a16031eff15709675b9ca4b4c8809e9d002507a62b not found: ID does not exist" containerID="b27687c96fb184750e8ce3a16031eff15709675b9ca4b4c8809e9d002507a62b"
Apr 17 16:53:35.885261 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:35.885199 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b27687c96fb184750e8ce3a16031eff15709675b9ca4b4c8809e9d002507a62b"} err="failed to get container status \"b27687c96fb184750e8ce3a16031eff15709675b9ca4b4c8809e9d002507a62b\": rpc error: code = NotFound desc = could not find container \"b27687c96fb184750e8ce3a16031eff15709675b9ca4b4c8809e9d002507a62b\": container with ID starting with b27687c96fb184750e8ce3a16031eff15709675b9ca4b4c8809e9d002507a62b not found: ID does not exist"
Apr 17 16:53:36.502592 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:36.502546 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused"
Apr 17 16:53:37.142992 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:37.142948 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23d2289e-b814-4824-89bf-48e716ed2e90" path="/var/lib/kubelet/pods/23d2289e-b814-4824-89bf-48e716ed2e90/volumes"
Apr 17 16:53:46.511225 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:46.511192 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z"
Apr 17 16:53:46.536949 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:46.536875 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z"
Apr 17 16:53:58.875754 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:58.875718 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z"]
Apr 17 16:53:58.877346 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:53:58.876828 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerName="main" containerID="cri-o://e4d937acda83ff317a71e69fa6977d49d7f078c33ad98e8782afe381ec631e51" gracePeriod=30
Apr 17 16:54:05.643181 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.643144 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"]
Apr 17 16:54:05.643761 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.643597 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23d2289e-b814-4824-89bf-48e716ed2e90" containerName="storage-initializer"
Apr 17 16:54:05.643761 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.643613 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d2289e-b814-4824-89bf-48e716ed2e90" containerName="storage-initializer"
Apr 17 16:54:05.643761 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.643640 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23d2289e-b814-4824-89bf-48e716ed2e90" containerName="main"
Apr 17 16:54:05.643761 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.643646 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d2289e-b814-4824-89bf-48e716ed2e90" containerName="main"
Apr 17 16:54:05.643761 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.643712 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="23d2289e-b814-4824-89bf-48e716ed2e90" containerName="main"
Apr 17 16:54:05.648543 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.648521 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:05.652241 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.652216 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\""
Apr 17 16:54:05.656471 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.656446 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"]
Apr 17 16:54:05.739122 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.739083 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:05.739309 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.739126 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c27rk\" (UniqueName: \"kubernetes.io/projected/5e2d9378-474e-490a-92bb-3329c5ab7825-kube-api-access-c27rk\") pod \"custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:05.739309 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.739156 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:05.739309 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.739227 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:05.739309 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.739291 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2d9378-474e-490a-92bb-3329c5ab7825-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:05.739483 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.739342 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:05.840246 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.840207 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2d9378-474e-490a-92bb-3329c5ab7825-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:05.840432 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.840297 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:05.840432 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.840351 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:05.840432 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.840379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c27rk\" (UniqueName: \"kubernetes.io/projected/5e2d9378-474e-490a-92bb-3329c5ab7825-kube-api-access-c27rk\") pod \"custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:05.840432 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.840410 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:05.840677 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.840477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:05.840768 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.840742 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:05.840853 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.840800 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:05.840905 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.840888 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:05.842883 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.842849 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2d9378-474e-490a-92bb-3329c5ab7825-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:05.843268 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.842921 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:05.859943 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.859919 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c27rk\" (UniqueName: \"kubernetes.io/projected/5e2d9378-474e-490a-92bb-3329c5ab7825-kube-api-access-c27rk\") pod \"custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:05.963383 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:05.963340 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:06.106464 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:06.106389 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"]
Apr 17 16:54:06.108928 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:54:06.108895 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e2d9378_474e_490a_92bb_3329c5ab7825.slice/crio-8d22ccaa43684291275ccf0f23dd02b420aef4b75a2df2537d70e82593a7ba44 WatchSource:0}: Error finding container 8d22ccaa43684291275ccf0f23dd02b420aef4b75a2df2537d70e82593a7ba44: Status 404 returned error can't find the container with id 8d22ccaa43684291275ccf0f23dd02b420aef4b75a2df2537d70e82593a7ba44
Apr 17 16:54:06.969507 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:06.969461 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" event={"ID":"5e2d9378-474e-490a-92bb-3329c5ab7825","Type":"ContainerStarted","Data":"a590326704ade5d715a3c0daa3e8ceb820741bc0f57239be325b1bb94d104744"}
Apr 17 16:54:06.969877 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:06.969522 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" event={"ID":"5e2d9378-474e-490a-92bb-3329c5ab7825","Type":"ContainerStarted","Data":"8d22ccaa43684291275ccf0f23dd02b420aef4b75a2df2537d70e82593a7ba44"}
Apr 17 16:54:10.986961 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:10.986926 2576 generic.go:358] "Generic (PLEG): container finished" podID="5e2d9378-474e-490a-92bb-3329c5ab7825" containerID="a590326704ade5d715a3c0daa3e8ceb820741bc0f57239be325b1bb94d104744" exitCode=0
Apr 17 16:54:10.987424 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:10.986976 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" event={"ID":"5e2d9378-474e-490a-92bb-3329c5ab7825","Type":"ContainerDied","Data":"a590326704ade5d715a3c0daa3e8ceb820741bc0f57239be325b1bb94d104744"}
Apr 17 16:54:11.992370 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:11.992330 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" event={"ID":"5e2d9378-474e-490a-92bb-3329c5ab7825","Type":"ContainerStarted","Data":"43f7431a486357239ea98d35607ba16d3eb3660d2ca47d4e936bfc0ec311563b"}
Apr 17 16:54:12.013096 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:12.013034 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" podStartSLOduration=7.013009835 podStartE2EDuration="7.013009835s" podCreationTimestamp="2026-04-17 16:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:54:12.011229363 +0000 UTC m=+1371.476412249" watchObservedRunningTime="2026-04-17 16:54:12.013009835 +0000 UTC m=+1371.478192703"
Apr 17 16:54:15.964029 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:15.963978 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:15.964408 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:15.964041 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"
Apr 17 16:54:15.965655 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:15.965627 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" podUID="5e2d9378-474e-490a-92bb-3329c5ab7825" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 17 16:54:25.964088 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:25.964036 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" podUID="5e2d9378-474e-490a-92bb-3329c5ab7825" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 17 16:54:28.877883 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:28.877821 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerName="llm-d-routing-sidecar" containerID="cri-o://902faf483dca71f775ff0d0a50713aec77b0d2549b93f4473a7fe73f2670143d" gracePeriod=2
Apr 17 16:54:29.071662 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.071634 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z_d7367c63-fd0d-4d8f-8be7-5e3da99d702e/main/0.log"
Apr 17 16:54:29.072428 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.072397 2576 generic.go:358] "Generic (PLEG): container finished" podID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerID="e4d937acda83ff317a71e69fa6977d49d7f078c33ad98e8782afe381ec631e51" exitCode=137
Apr 17 16:54:29.072428 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.072421 2576 generic.go:358] "Generic (PLEG): container finished" podID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerID="902faf483dca71f775ff0d0a50713aec77b0d2549b93f4473a7fe73f2670143d" exitCode=0
Apr 17 16:54:29.072665 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.072448 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" event={"ID":"d7367c63-fd0d-4d8f-8be7-5e3da99d702e","Type":"ContainerDied","Data":"e4d937acda83ff317a71e69fa6977d49d7f078c33ad98e8782afe381ec631e51"}
Apr 17 16:54:29.072665 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.072506 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" event={"ID":"d7367c63-fd0d-4d8f-8be7-5e3da99d702e","Type":"ContainerDied","Data":"902faf483dca71f775ff0d0a50713aec77b0d2549b93f4473a7fe73f2670143d"}
Apr 17 16:54:29.173022 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.172999 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z_d7367c63-fd0d-4d8f-8be7-5e3da99d702e/main/0.log"
Apr 17 16:54:29.173768 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.173742 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z"
Apr 17 16:54:29.269074 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.269039 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-tls-certs\") pod \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") "
Apr 17 16:54:29.269250 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.269091 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-model-cache\") pod \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") "
Apr 17 16:54:29.269250 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.269213 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-home\") pod \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") "
Apr 17 16:54:29.269371 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.269254 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-dshm\") pod \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") "
Apr 17 16:54:29.269614 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.269377 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-model-cache" (OuterVolumeSpecName: "model-cache") pod "d7367c63-fd0d-4d8f-8be7-5e3da99d702e" (UID: "d7367c63-fd0d-4d8f-8be7-5e3da99d702e"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:54:29.269614 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.269388 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjjmr\" (UniqueName: \"kubernetes.io/projected/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-kube-api-access-qjjmr\") pod \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") "
Apr 17 16:54:29.269614 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.269433 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-kserve-provision-location\") pod \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\" (UID: \"d7367c63-fd0d-4d8f-8be7-5e3da99d702e\") "
Apr 17 16:54:29.269792 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.269694 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-home" (OuterVolumeSpecName: "home") pod "d7367c63-fd0d-4d8f-8be7-5e3da99d702e" (UID: "d7367c63-fd0d-4d8f-8be7-5e3da99d702e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:54:29.269832 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.269801 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-model-cache\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:54:29.269832 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.269822 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-home\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:54:29.272045 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.272016 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-dshm" (OuterVolumeSpecName: "dshm") pod "d7367c63-fd0d-4d8f-8be7-5e3da99d702e" (UID: "d7367c63-fd0d-4d8f-8be7-5e3da99d702e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:54:29.272217 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.272194 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-kube-api-access-qjjmr" (OuterVolumeSpecName: "kube-api-access-qjjmr") pod "d7367c63-fd0d-4d8f-8be7-5e3da99d702e" (UID: "d7367c63-fd0d-4d8f-8be7-5e3da99d702e"). InnerVolumeSpecName "kube-api-access-qjjmr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:54:29.272416 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.272386 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d7367c63-fd0d-4d8f-8be7-5e3da99d702e" (UID: "d7367c63-fd0d-4d8f-8be7-5e3da99d702e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:54:29.323643 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.323592 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d7367c63-fd0d-4d8f-8be7-5e3da99d702e" (UID: "d7367c63-fd0d-4d8f-8be7-5e3da99d702e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:54:29.370438 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.370404 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-dshm\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:54:29.370438 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.370433 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qjjmr\" (UniqueName: \"kubernetes.io/projected/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-kube-api-access-qjjmr\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:54:29.370438 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.370445 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-kserve-provision-location\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:54:29.370676 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:29.370456 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7367c63-fd0d-4d8f-8be7-5e3da99d702e-tls-certs\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\""
Apr 17 16:54:30.077823 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:30.077793 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z_d7367c63-fd0d-4d8f-8be7-5e3da99d702e/main/0.log"
Apr 17 16:54:30.078566 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:30.078536 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z" event={"ID":"d7367c63-fd0d-4d8f-8be7-5e3da99d702e","Type":"ContainerDied","Data":"f343060bc7aa78dda643966e3bc649dc938e887687eff829e41132e3f2e724a9"}
Apr 17 16:54:30.078700 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:30.078585 2576 scope.go:117] "RemoveContainer" containerID="e4d937acda83ff317a71e69fa6977d49d7f078c33ad98e8782afe381ec631e51"
Apr 17 16:54:30.078700 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:30.078598 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z"
Apr 17 16:54:30.099619 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:30.099587 2576 scope.go:117] "RemoveContainer" containerID="b42c92b79dadb9ad87cd78c376a47510b4340d6363bb73d3c4265410d2aa7065"
Apr 17 16:54:30.105813 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:30.105779 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z"]
Apr 17 16:54:30.109917 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:30.109877 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7cbfdd565bncf4z"]
Apr 17 16:54:30.112574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:30.112547 2576 scope.go:117] "RemoveContainer" containerID="902faf483dca71f775ff0d0a50713aec77b0d2549b93f4473a7fe73f2670143d"
Apr 17 16:54:31.151921 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:31.151880 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" path="/var/lib/kubelet/pods/d7367c63-fd0d-4d8f-8be7-5e3da99d702e/volumes"
Apr 17 16:54:35.964515 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:35.964457 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" podUID="5e2d9378-474e-490a-92bb-3329c5ab7825" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 17 16:54:45.964372 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:45.964316 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" podUID="5e2d9378-474e-490a-92bb-3329c5ab7825" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 17 16:54:55.964809 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:54:55.964755 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" podUID="5e2d9378-474e-490a-92bb-3329c5ab7825" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 17 16:55:05.964115 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:05.964019 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" podUID="5e2d9378-474e-490a-92bb-3329c5ab7825" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 17 16:55:14.961699 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:14.961660 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 17 16:55:14.962197 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:14.961962 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="0e49fc76-4d4d-4c73-8c5f-16b5a169fefc" containerName="main" containerID="cri-o://e23ca0b2de9a7d2cad9d002a72bcc1ebf5c628f011be656ea70d15bc19e178c8" gracePeriod=30
Apr 17 16:55:15.964264 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:15.964222 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" podUID="5e2d9378-474e-490a-92bb-3329c5ab7825" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 17 16:55:16.107124 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.107100 2576 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:55:16.218941 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.218908 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-home\") pod \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " Apr 17 16:55:16.219123 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.218999 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6xjr\" (UniqueName: \"kubernetes.io/projected/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-kube-api-access-n6xjr\") pod \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " Apr 17 16:55:16.219123 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.219021 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-tls-certs\") pod \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " Apr 17 16:55:16.219123 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.219041 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-dshm\") pod \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " Apr 17 16:55:16.219123 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.219084 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-kserve-provision-location\") pod \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " Apr 17 16:55:16.219347 ip-10-0-134-142 
kubenswrapper[2576]: I0417 16:55:16.219146 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-model-cache\") pod \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\" (UID: \"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc\") " Apr 17 16:55:16.219347 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.219292 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-home" (OuterVolumeSpecName: "home") pod "0e49fc76-4d4d-4c73-8c5f-16b5a169fefc" (UID: "0e49fc76-4d4d-4c73-8c5f-16b5a169fefc"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:55:16.219490 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.219468 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-model-cache" (OuterVolumeSpecName: "model-cache") pod "0e49fc76-4d4d-4c73-8c5f-16b5a169fefc" (UID: "0e49fc76-4d4d-4c73-8c5f-16b5a169fefc"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:55:16.219638 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.219620 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-home\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:55:16.219707 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.219646 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-model-cache\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:55:16.221194 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.221161 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-dshm" (OuterVolumeSpecName: "dshm") pod "0e49fc76-4d4d-4c73-8c5f-16b5a169fefc" (UID: "0e49fc76-4d4d-4c73-8c5f-16b5a169fefc"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:55:16.221314 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.221252 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0e49fc76-4d4d-4c73-8c5f-16b5a169fefc" (UID: "0e49fc76-4d4d-4c73-8c5f-16b5a169fefc"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:55:16.221314 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.221260 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-kube-api-access-n6xjr" (OuterVolumeSpecName: "kube-api-access-n6xjr") pod "0e49fc76-4d4d-4c73-8c5f-16b5a169fefc" (UID: "0e49fc76-4d4d-4c73-8c5f-16b5a169fefc"). InnerVolumeSpecName "kube-api-access-n6xjr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:55:16.261365 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.261330 2576 generic.go:358] "Generic (PLEG): container finished" podID="0e49fc76-4d4d-4c73-8c5f-16b5a169fefc" containerID="e23ca0b2de9a7d2cad9d002a72bcc1ebf5c628f011be656ea70d15bc19e178c8" exitCode=0 Apr 17 16:55:16.261557 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.261402 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 16:55:16.261557 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.261411 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc","Type":"ContainerDied","Data":"e23ca0b2de9a7d2cad9d002a72bcc1ebf5c628f011be656ea70d15bc19e178c8"} Apr 17 16:55:16.261557 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.261453 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"0e49fc76-4d4d-4c73-8c5f-16b5a169fefc","Type":"ContainerDied","Data":"d25427a9aed32510dd642604e9023eb4c4ccbc687d0a99aa282896dc24b13a1c"} Apr 17 16:55:16.261557 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.261474 2576 scope.go:117] "RemoveContainer" containerID="e23ca0b2de9a7d2cad9d002a72bcc1ebf5c628f011be656ea70d15bc19e178c8" Apr 17 16:55:16.289196 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.289127 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0e49fc76-4d4d-4c73-8c5f-16b5a169fefc" (UID: "0e49fc76-4d4d-4c73-8c5f-16b5a169fefc"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:55:16.290361 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.290327 2576 scope.go:117] "RemoveContainer" containerID="47478a3f62a6fa5d05229e10a37759f2d9f46fb94db9eb18f314c1a0f8105eb6" Apr 17 16:55:16.320809 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.320777 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n6xjr\" (UniqueName: \"kubernetes.io/projected/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-kube-api-access-n6xjr\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:55:16.320809 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.320806 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-tls-certs\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:55:16.320809 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.320815 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-dshm\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:55:16.321020 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.320825 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc-kserve-provision-location\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:55:16.350352 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.350330 2576 scope.go:117] "RemoveContainer" containerID="e23ca0b2de9a7d2cad9d002a72bcc1ebf5c628f011be656ea70d15bc19e178c8" Apr 17 16:55:16.350715 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:55:16.350682 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e23ca0b2de9a7d2cad9d002a72bcc1ebf5c628f011be656ea70d15bc19e178c8\": container with ID 
starting with e23ca0b2de9a7d2cad9d002a72bcc1ebf5c628f011be656ea70d15bc19e178c8 not found: ID does not exist" containerID="e23ca0b2de9a7d2cad9d002a72bcc1ebf5c628f011be656ea70d15bc19e178c8" Apr 17 16:55:16.350804 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.350728 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e23ca0b2de9a7d2cad9d002a72bcc1ebf5c628f011be656ea70d15bc19e178c8"} err="failed to get container status \"e23ca0b2de9a7d2cad9d002a72bcc1ebf5c628f011be656ea70d15bc19e178c8\": rpc error: code = NotFound desc = could not find container \"e23ca0b2de9a7d2cad9d002a72bcc1ebf5c628f011be656ea70d15bc19e178c8\": container with ID starting with e23ca0b2de9a7d2cad9d002a72bcc1ebf5c628f011be656ea70d15bc19e178c8 not found: ID does not exist" Apr 17 16:55:16.350804 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.350757 2576 scope.go:117] "RemoveContainer" containerID="47478a3f62a6fa5d05229e10a37759f2d9f46fb94db9eb18f314c1a0f8105eb6" Apr 17 16:55:16.351042 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:55:16.351022 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47478a3f62a6fa5d05229e10a37759f2d9f46fb94db9eb18f314c1a0f8105eb6\": container with ID starting with 47478a3f62a6fa5d05229e10a37759f2d9f46fb94db9eb18f314c1a0f8105eb6 not found: ID does not exist" containerID="47478a3f62a6fa5d05229e10a37759f2d9f46fb94db9eb18f314c1a0f8105eb6" Apr 17 16:55:16.351120 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.351051 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47478a3f62a6fa5d05229e10a37759f2d9f46fb94db9eb18f314c1a0f8105eb6"} err="failed to get container status \"47478a3f62a6fa5d05229e10a37759f2d9f46fb94db9eb18f314c1a0f8105eb6\": rpc error: code = NotFound desc = could not find container \"47478a3f62a6fa5d05229e10a37759f2d9f46fb94db9eb18f314c1a0f8105eb6\": container with ID starting with 
47478a3f62a6fa5d05229e10a37759f2d9f46fb94db9eb18f314c1a0f8105eb6 not found: ID does not exist" Apr 17 16:55:16.585316 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.585287 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 17 16:55:16.587987 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:16.587956 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 17 16:55:17.143904 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:17.143874 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e49fc76-4d4d-4c73-8c5f-16b5a169fefc" path="/var/lib/kubelet/pods/0e49fc76-4d4d-4c73-8c5f-16b5a169fefc/volumes" Apr 17 16:55:24.272946 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.272901 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7"] Apr 17 16:55:24.273533 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.273483 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e49fc76-4d4d-4c73-8c5f-16b5a169fefc" containerName="storage-initializer" Apr 17 16:55:24.273533 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.273519 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e49fc76-4d4d-4c73-8c5f-16b5a169fefc" containerName="storage-initializer" Apr 17 16:55:24.273533 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.273534 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerName="main" Apr 17 16:55:24.273759 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.273542 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerName="main" Apr 17 16:55:24.273759 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.273553 2576 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerName="llm-d-routing-sidecar" Apr 17 16:55:24.273759 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.273561 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerName="llm-d-routing-sidecar" Apr 17 16:55:24.273759 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.273572 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerName="storage-initializer" Apr 17 16:55:24.273759 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.273580 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerName="storage-initializer" Apr 17 16:55:24.273759 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.273600 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e49fc76-4d4d-4c73-8c5f-16b5a169fefc" containerName="main" Apr 17 16:55:24.273759 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.273609 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e49fc76-4d4d-4c73-8c5f-16b5a169fefc" containerName="main" Apr 17 16:55:24.273759 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.273708 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerName="llm-d-routing-sidecar" Apr 17 16:55:24.273759 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.273723 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7367c63-fd0d-4d8f-8be7-5e3da99d702e" containerName="main" Apr 17 16:55:24.273759 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.273739 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e49fc76-4d4d-4c73-8c5f-16b5a169fefc" containerName="main" Apr 17 16:55:24.279390 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.279364 2576 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:24.284092 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.284066 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 17 16:55:24.291054 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.291027 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7"] Apr 17 16:55:24.389630 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.389585 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-dshm\") pod \"scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:24.389630 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.389637 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-home\") pod \"scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:24.389848 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.389658 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg9wj\" (UniqueName: \"kubernetes.io/projected/60bf0966-ac08-4072-90c7-4418d8d8917f-kube-api-access-wg9wj\") pod \"scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:24.389848 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.389738 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/60bf0966-ac08-4072-90c7-4418d8d8917f-tls-certs\") pod \"scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:24.389848 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.389763 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-model-cache\") pod \"scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:24.389848 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.389836 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:24.490590 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.490547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/60bf0966-ac08-4072-90c7-4418d8d8917f-tls-certs\") pod \"scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 
16:55:24.490792 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.490599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-model-cache\") pod \"scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:24.490792 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.490648 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:24.490904 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.490845 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-dshm\") pod \"scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:24.490904 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.490888 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-home\") pod \"scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:24.491011 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.490915 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wg9wj\" (UniqueName: \"kubernetes.io/projected/60bf0966-ac08-4072-90c7-4418d8d8917f-kube-api-access-wg9wj\") pod \"scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:24.491127 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.491106 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:24.491198 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.491158 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-model-cache\") pod \"scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:24.491332 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.491301 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-home\") pod \"scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:24.493004 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.492977 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-dshm\") pod \"scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7\" (UID: 
\"60bf0966-ac08-4072-90c7-4418d8d8917f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:24.493230 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.493212 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/60bf0966-ac08-4072-90c7-4418d8d8917f-tls-certs\") pod \"scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:24.499407 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.499384 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg9wj\" (UniqueName: \"kubernetes.io/projected/60bf0966-ac08-4072-90c7-4418d8d8917f-kube-api-access-wg9wj\") pod \"scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:24.592536 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.592413 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:24.760469 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:24.760437 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7"] Apr 17 16:55:24.764170 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:55:24.764134 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60bf0966_ac08_4072_90c7_4418d8d8917f.slice/crio-b009c7f5bfeefaa029b64725cc82e7b893adf27c86140d1eb6776ae1d887f210 WatchSource:0}: Error finding container b009c7f5bfeefaa029b64725cc82e7b893adf27c86140d1eb6776ae1d887f210: Status 404 returned error can't find the container with id b009c7f5bfeefaa029b64725cc82e7b893adf27c86140d1eb6776ae1d887f210 Apr 17 16:55:25.300427 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:25.300393 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" event={"ID":"60bf0966-ac08-4072-90c7-4418d8d8917f","Type":"ContainerStarted","Data":"68e8894b3954c6f137bce7b3237fc23b5e01e0dc1e7586f6c06c5a7abe584127"} Apr 17 16:55:25.300427 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:25.300433 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" event={"ID":"60bf0966-ac08-4072-90c7-4418d8d8917f","Type":"ContainerStarted","Data":"b009c7f5bfeefaa029b64725cc82e7b893adf27c86140d1eb6776ae1d887f210"} Apr 17 16:55:25.964570 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:25.964466 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" podUID="5e2d9378-474e-490a-92bb-3329c5ab7825" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: 
connect: connection refused" Apr 17 16:55:29.316412 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:29.316377 2576 generic.go:358] "Generic (PLEG): container finished" podID="60bf0966-ac08-4072-90c7-4418d8d8917f" containerID="68e8894b3954c6f137bce7b3237fc23b5e01e0dc1e7586f6c06c5a7abe584127" exitCode=0 Apr 17 16:55:29.316920 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:29.316424 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" event={"ID":"60bf0966-ac08-4072-90c7-4418d8d8917f","Type":"ContainerDied","Data":"68e8894b3954c6f137bce7b3237fc23b5e01e0dc1e7586f6c06c5a7abe584127"} Apr 17 16:55:30.321802 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:30.321763 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" event={"ID":"60bf0966-ac08-4072-90c7-4418d8d8917f","Type":"ContainerStarted","Data":"88c98acdc90635b20a83b0f1395dc9308e8a0c6bbec62900e767eafd60e71757"} Apr 17 16:55:30.339200 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:30.339148 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" podStartSLOduration=6.339130594 podStartE2EDuration="6.339130594s" podCreationTimestamp="2026-04-17 16:55:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:55:30.337810788 +0000 UTC m=+1449.802993649" watchObservedRunningTime="2026-04-17 16:55:30.339130594 +0000 UTC m=+1449.804313493" Apr 17 16:55:34.592770 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:34.592728 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:34.592770 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:34.592781 2576 kubelet.go:2658] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:34.605403 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:34.605379 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:35.356601 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:35.356567 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:35.964844 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:35.964801 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" podUID="5e2d9378-474e-490a-92bb-3329c5ab7825" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused" Apr 17 16:55:45.973681 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:45.973643 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" Apr 17 16:55:45.981522 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:45.981484 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" Apr 17 16:55:57.684486 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:57.684447 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7"] Apr 17 16:55:57.684965 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:57.684867 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" podUID="60bf0966-ac08-4072-90c7-4418d8d8917f" 
containerName="main" containerID="cri-o://88c98acdc90635b20a83b0f1395dc9308e8a0c6bbec62900e767eafd60e71757" gracePeriod=30 Apr 17 16:55:57.937575 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:57.937474 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:58.014879 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.014836 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-dshm\") pod \"60bf0966-ac08-4072-90c7-4418d8d8917f\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " Apr 17 16:55:58.014879 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.014874 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/60bf0966-ac08-4072-90c7-4418d8d8917f-tls-certs\") pod \"60bf0966-ac08-4072-90c7-4418d8d8917f\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " Apr 17 16:55:58.015143 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.014898 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg9wj\" (UniqueName: \"kubernetes.io/projected/60bf0966-ac08-4072-90c7-4418d8d8917f-kube-api-access-wg9wj\") pod \"60bf0966-ac08-4072-90c7-4418d8d8917f\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " Apr 17 16:55:58.015143 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.014950 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-kserve-provision-location\") pod \"60bf0966-ac08-4072-90c7-4418d8d8917f\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " Apr 17 16:55:58.015143 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.014988 2576 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-home\") pod \"60bf0966-ac08-4072-90c7-4418d8d8917f\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " Apr 17 16:55:58.015143 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.015032 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-model-cache\") pod \"60bf0966-ac08-4072-90c7-4418d8d8917f\" (UID: \"60bf0966-ac08-4072-90c7-4418d8d8917f\") " Apr 17 16:55:58.015512 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.015471 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-model-cache" (OuterVolumeSpecName: "model-cache") pod "60bf0966-ac08-4072-90c7-4418d8d8917f" (UID: "60bf0966-ac08-4072-90c7-4418d8d8917f"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:55:58.015617 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.015492 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-home" (OuterVolumeSpecName: "home") pod "60bf0966-ac08-4072-90c7-4418d8d8917f" (UID: "60bf0966-ac08-4072-90c7-4418d8d8917f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:55:58.017108 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.017070 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60bf0966-ac08-4072-90c7-4418d8d8917f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "60bf0966-ac08-4072-90c7-4418d8d8917f" (UID: "60bf0966-ac08-4072-90c7-4418d8d8917f"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:55:58.017506 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.017473 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-dshm" (OuterVolumeSpecName: "dshm") pod "60bf0966-ac08-4072-90c7-4418d8d8917f" (UID: "60bf0966-ac08-4072-90c7-4418d8d8917f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:55:58.017679 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.017655 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60bf0966-ac08-4072-90c7-4418d8d8917f-kube-api-access-wg9wj" (OuterVolumeSpecName: "kube-api-access-wg9wj") pod "60bf0966-ac08-4072-90c7-4418d8d8917f" (UID: "60bf0966-ac08-4072-90c7-4418d8d8917f"). InnerVolumeSpecName "kube-api-access-wg9wj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:55:58.074650 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.074596 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "60bf0966-ac08-4072-90c7-4418d8d8917f" (UID: "60bf0966-ac08-4072-90c7-4418d8d8917f"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:55:58.116561 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.116515 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-dshm\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:55:58.116561 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.116565 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/60bf0966-ac08-4072-90c7-4418d8d8917f-tls-certs\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:55:58.116761 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.116578 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wg9wj\" (UniqueName: \"kubernetes.io/projected/60bf0966-ac08-4072-90c7-4418d8d8917f-kube-api-access-wg9wj\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:55:58.116761 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.116588 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-kserve-provision-location\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:55:58.116761 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.116598 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-home\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:55:58.116761 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.116606 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/60bf0966-ac08-4072-90c7-4418d8d8917f-model-cache\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:55:58.438435 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.438396 2576 
generic.go:358] "Generic (PLEG): container finished" podID="60bf0966-ac08-4072-90c7-4418d8d8917f" containerID="88c98acdc90635b20a83b0f1395dc9308e8a0c6bbec62900e767eafd60e71757" exitCode=0 Apr 17 16:55:58.438628 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.438463 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" Apr 17 16:55:58.438628 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.438462 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" event={"ID":"60bf0966-ac08-4072-90c7-4418d8d8917f","Type":"ContainerDied","Data":"88c98acdc90635b20a83b0f1395dc9308e8a0c6bbec62900e767eafd60e71757"} Apr 17 16:55:58.438628 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.438536 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7" event={"ID":"60bf0966-ac08-4072-90c7-4418d8d8917f","Type":"ContainerDied","Data":"b009c7f5bfeefaa029b64725cc82e7b893adf27c86140d1eb6776ae1d887f210"} Apr 17 16:55:58.438628 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.438553 2576 scope.go:117] "RemoveContainer" containerID="88c98acdc90635b20a83b0f1395dc9308e8a0c6bbec62900e767eafd60e71757" Apr 17 16:55:58.447419 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.447401 2576 scope.go:117] "RemoveContainer" containerID="68e8894b3954c6f137bce7b3237fc23b5e01e0dc1e7586f6c06c5a7abe584127" Apr 17 16:55:58.459579 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.459473 2576 scope.go:117] "RemoveContainer" containerID="88c98acdc90635b20a83b0f1395dc9308e8a0c6bbec62900e767eafd60e71757" Apr 17 16:55:58.459900 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:55:58.459872 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"88c98acdc90635b20a83b0f1395dc9308e8a0c6bbec62900e767eafd60e71757\": container with ID starting with 88c98acdc90635b20a83b0f1395dc9308e8a0c6bbec62900e767eafd60e71757 not found: ID does not exist" containerID="88c98acdc90635b20a83b0f1395dc9308e8a0c6bbec62900e767eafd60e71757" Apr 17 16:55:58.459996 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.459912 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88c98acdc90635b20a83b0f1395dc9308e8a0c6bbec62900e767eafd60e71757"} err="failed to get container status \"88c98acdc90635b20a83b0f1395dc9308e8a0c6bbec62900e767eafd60e71757\": rpc error: code = NotFound desc = could not find container \"88c98acdc90635b20a83b0f1395dc9308e8a0c6bbec62900e767eafd60e71757\": container with ID starting with 88c98acdc90635b20a83b0f1395dc9308e8a0c6bbec62900e767eafd60e71757 not found: ID does not exist" Apr 17 16:55:58.459996 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.459937 2576 scope.go:117] "RemoveContainer" containerID="68e8894b3954c6f137bce7b3237fc23b5e01e0dc1e7586f6c06c5a7abe584127" Apr 17 16:55:58.460307 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:55:58.460286 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68e8894b3954c6f137bce7b3237fc23b5e01e0dc1e7586f6c06c5a7abe584127\": container with ID starting with 68e8894b3954c6f137bce7b3237fc23b5e01e0dc1e7586f6c06c5a7abe584127 not found: ID does not exist" containerID="68e8894b3954c6f137bce7b3237fc23b5e01e0dc1e7586f6c06c5a7abe584127" Apr 17 16:55:58.460368 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.460314 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68e8894b3954c6f137bce7b3237fc23b5e01e0dc1e7586f6c06c5a7abe584127"} err="failed to get container status \"68e8894b3954c6f137bce7b3237fc23b5e01e0dc1e7586f6c06c5a7abe584127\": rpc error: code = NotFound desc = could not find container 
\"68e8894b3954c6f137bce7b3237fc23b5e01e0dc1e7586f6c06c5a7abe584127\": container with ID starting with 68e8894b3954c6f137bce7b3237fc23b5e01e0dc1e7586f6c06c5a7abe584127 not found: ID does not exist" Apr 17 16:55:58.462123 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.462098 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7"] Apr 17 16:55:58.463731 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:58.463702 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6dbcf4cb54-kgnh7"] Apr 17 16:55:59.143628 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:55:59.143596 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60bf0966-ac08-4072-90c7-4418d8d8917f" path="/var/lib/kubelet/pods/60bf0966-ac08-4072-90c7-4418d8d8917f/volumes" Apr 17 16:56:07.144567 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.144528 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"] Apr 17 16:56:07.145104 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.144799 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" podUID="5e2d9378-474e-490a-92bb-3329c5ab7825" containerName="main" containerID="cri-o://43f7431a486357239ea98d35607ba16d3eb3660d2ca47d4e936bfc0ec311563b" gracePeriod=30 Apr 17 16:56:07.148884 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.148862 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc"] Apr 17 16:56:07.149218 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.149207 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="60bf0966-ac08-4072-90c7-4418d8d8917f" containerName="main" Apr 17 16:56:07.149260 
ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.149221 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="60bf0966-ac08-4072-90c7-4418d8d8917f" containerName="main" Apr 17 16:56:07.149260 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.149240 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="60bf0966-ac08-4072-90c7-4418d8d8917f" containerName="storage-initializer" Apr 17 16:56:07.149260 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.149246 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="60bf0966-ac08-4072-90c7-4418d8d8917f" containerName="storage-initializer" Apr 17 16:56:07.149357 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.149306 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="60bf0966-ac08-4072-90c7-4418d8d8917f" containerName="main" Apr 17 16:56:07.152711 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.152695 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.155386 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.155365 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-swnxm\"" Apr 17 16:56:07.164428 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.164402 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc"] Apr 17 16:56:07.304804 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.304767 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.304974 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.304876 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.304974 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.304918 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.304974 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.304961 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.305116 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.304998 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 
16:56:07.305116 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.305060 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.305116 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.305111 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.305263 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.305216 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.305263 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.305244 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59lbq\" (UniqueName: \"kubernetes.io/projected/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-kube-api-access-59lbq\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.406233 ip-10-0-134-142 kubenswrapper[2576]: I0417 
16:56:07.406186 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.406439 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.406259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.406439 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.406287 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.406439 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.406314 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.406439 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.406336 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.406439 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.406395 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.406439 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.406421 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.406811 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.406448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.406811 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.406478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59lbq\" (UniqueName: \"kubernetes.io/projected/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-kube-api-access-59lbq\") pod 
\"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.406811 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.406706 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.406811 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.406752 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.407022 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.406877 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.407141 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.407115 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.407336 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.407319 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.408714 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.408693 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.409204 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.409175 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.415948 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.415919 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.416226 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.416182 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-59lbq\" (UniqueName: \"kubernetes.io/projected/307ec0af-e668-4251-b9fd-8ffc80f9a0bd-kube-api-access-59lbq\") pod \"router-gateway-2-openshift-default-6866b85949-mvtwc\" (UID: \"307ec0af-e668-4251-b9fd-8ffc80f9a0bd\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.466205 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.466172 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:07.598342 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.598243 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc"] Apr 17 16:56:07.600750 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:56:07.600713 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod307ec0af_e668_4251_b9fd_8ffc80f9a0bd.slice/crio-aef5ba4210ad5883aaf59886f8d2733b018e436f73189f3b052d87858cd3106b WatchSource:0}: Error finding container aef5ba4210ad5883aaf59886f8d2733b018e436f73189f3b052d87858cd3106b: Status 404 returned error can't find the container with id aef5ba4210ad5883aaf59886f8d2733b018e436f73189f3b052d87858cd3106b Apr 17 16:56:07.602904 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.602875 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 16:56:07.602994 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.602938 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 16:56:07.602994 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:07.602965 
2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 16:56:08.475859 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:08.475824 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" event={"ID":"307ec0af-e668-4251-b9fd-8ffc80f9a0bd","Type":"ContainerStarted","Data":"1e290dbb6045751395664a095b4a05ec8d3f17bd9b5d99b5218ffb0f0853a84c"} Apr 17 16:56:08.476243 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:08.475864 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" event={"ID":"307ec0af-e668-4251-b9fd-8ffc80f9a0bd","Type":"ContainerStarted","Data":"aef5ba4210ad5883aaf59886f8d2733b018e436f73189f3b052d87858cd3106b"} Apr 17 16:56:08.500307 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:08.500256 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" podStartSLOduration=1.5002408109999998 podStartE2EDuration="1.500240811s" podCreationTimestamp="2026-04-17 16:56:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:56:08.498611751 +0000 UTC m=+1487.963794612" watchObservedRunningTime="2026-04-17 16:56:08.500240811 +0000 UTC m=+1487.965423745" Apr 17 16:56:09.466580 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:09.466543 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:10.466847 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:10.466796 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" 
podUID="307ec0af-e668-4251-b9fd-8ffc80f9a0bd" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.44:15021/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 17 16:56:10.535317 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:10.535283 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:11.486593 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:11.486555 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:11.487405 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:11.487383 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-mvtwc" Apr 17 16:56:15.537720 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.537689 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt"] Apr 17 16:56:15.541617 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.541589 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:15.544005 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.543983 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 17 16:56:15.553769 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.553738 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt"] Apr 17 16:56:15.581179 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.581148 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5krg\" (UniqueName: \"kubernetes.io/projected/b81d669e-69b2-4121-bfaf-bacf7bb56538-kube-api-access-h5krg\") pod \"router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:15.581360 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.581207 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:15.581360 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.581265 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:15.581444 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.581360 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b81d669e-69b2-4121-bfaf-bacf7bb56538-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:15.581444 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.581410 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:15.581535 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.581450 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-home\") pod \"router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:15.682653 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.682616 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-home\") pod \"router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:15.682851 
ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.682686 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5krg\" (UniqueName: \"kubernetes.io/projected/b81d669e-69b2-4121-bfaf-bacf7bb56538-kube-api-access-h5krg\") pod \"router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:15.682851 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.682720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:15.682851 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.682737 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:15.682851 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.682769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b81d669e-69b2-4121-bfaf-bacf7bb56538-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:15.682851 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.682821 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:15.683092 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.683054 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-home\") pod \"router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:15.683235 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.683212 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:15.683325 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.683299 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:15.685924 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.685895 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-dshm\") pod 
\"router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:15.685924 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.685915 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b81d669e-69b2-4121-bfaf-bacf7bb56538-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:15.691514 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.691480 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5krg\" (UniqueName: \"kubernetes.io/projected/b81d669e-69b2-4121-bfaf-bacf7bb56538-kube-api-access-h5krg\") pod \"router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:15.854956 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.854879 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:15.994726 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:15.994698 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt"] Apr 17 16:56:15.996570 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:56:15.996532 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb81d669e_69b2_4121_bfaf_bacf7bb56538.slice/crio-856e41d122f4451a0a03727eb228be74d89aaea975e9a68ab73e3e4ab7002f84 WatchSource:0}: Error finding container 856e41d122f4451a0a03727eb228be74d89aaea975e9a68ab73e3e4ab7002f84: Status 404 returned error can't find the container with id 856e41d122f4451a0a03727eb228be74d89aaea975e9a68ab73e3e4ab7002f84 Apr 17 16:56:16.505736 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:16.505694 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" event={"ID":"b81d669e-69b2-4121-bfaf-bacf7bb56538","Type":"ContainerStarted","Data":"f05f8edf9362bcd306f1bcabc667ba321ca41a3f1b824e0e4e1ed2ef4ba2a571"} Apr 17 16:56:16.505736 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:16.505740 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" event={"ID":"b81d669e-69b2-4121-bfaf-bacf7bb56538","Type":"ContainerStarted","Data":"856e41d122f4451a0a03727eb228be74d89aaea975e9a68ab73e3e4ab7002f84"} Apr 17 16:56:20.522254 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:20.522222 2576 generic.go:358] "Generic (PLEG): container finished" podID="b81d669e-69b2-4121-bfaf-bacf7bb56538" containerID="f05f8edf9362bcd306f1bcabc667ba321ca41a3f1b824e0e4e1ed2ef4ba2a571" exitCode=0 Apr 17 16:56:20.522639 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:20.522283 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" event={"ID":"b81d669e-69b2-4121-bfaf-bacf7bb56538","Type":"ContainerDied","Data":"f05f8edf9362bcd306f1bcabc667ba321ca41a3f1b824e0e4e1ed2ef4ba2a571"} Apr 17 16:56:20.523481 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:20.523464 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:56:21.527134 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:21.527051 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" event={"ID":"b81d669e-69b2-4121-bfaf-bacf7bb56538","Type":"ContainerStarted","Data":"b313c83a1dd1889b7a07f52f08f56c76b4d2b3f16297b85ac170880a487406c2"} Apr 17 16:56:21.548097 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:21.548042 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" podStartSLOduration=6.548026785 podStartE2EDuration="6.548026785s" podCreationTimestamp="2026-04-17 16:56:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:56:21.547145429 +0000 UTC m=+1501.012328301" watchObservedRunningTime="2026-04-17 16:56:21.548026785 +0000 UTC m=+1501.013209646" Apr 17 16:56:22.283981 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:22.283945 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/ovn-acl-logging/0.log" Apr 17 16:56:22.285031 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:22.285008 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/ovn-acl-logging/0.log" Apr 17 16:56:25.855838 ip-10-0-134-142 
kubenswrapper[2576]: I0417 16:56:25.855808 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:25.856369 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:25.855851 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:56:25.857304 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:25.857276 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" podUID="b81d669e-69b2-4121-bfaf-bacf7bb56538" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused" Apr 17 16:56:35.855276 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:35.855185 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" podUID="b81d669e-69b2-4121-bfaf-bacf7bb56538" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused" Apr 17 16:56:37.394272 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.394248 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" Apr 17 16:56:37.499909 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.499817 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-home\") pod \"5e2d9378-474e-490a-92bb-3329c5ab7825\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " Apr 17 16:56:37.499909 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.499906 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-dshm\") pod \"5e2d9378-474e-490a-92bb-3329c5ab7825\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " Apr 17 16:56:37.500150 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.499941 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c27rk\" (UniqueName: \"kubernetes.io/projected/5e2d9378-474e-490a-92bb-3329c5ab7825-kube-api-access-c27rk\") pod \"5e2d9378-474e-490a-92bb-3329c5ab7825\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " Apr 17 16:56:37.500150 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.500031 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2d9378-474e-490a-92bb-3329c5ab7825-tls-certs\") pod \"5e2d9378-474e-490a-92bb-3329c5ab7825\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " Apr 17 16:56:37.500150 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.500057 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-model-cache\") pod \"5e2d9378-474e-490a-92bb-3329c5ab7825\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " Apr 17 16:56:37.500150 ip-10-0-134-142 kubenswrapper[2576]: I0417 
16:56:37.500090 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-kserve-provision-location\") pod \"5e2d9378-474e-490a-92bb-3329c5ab7825\" (UID: \"5e2d9378-474e-490a-92bb-3329c5ab7825\") " Apr 17 16:56:37.500367 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.500337 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-model-cache" (OuterVolumeSpecName: "model-cache") pod "5e2d9378-474e-490a-92bb-3329c5ab7825" (UID: "5e2d9378-474e-490a-92bb-3329c5ab7825"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:56:37.500524 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.500458 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-model-cache\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:56:37.500524 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.500465 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-home" (OuterVolumeSpecName: "home") pod "5e2d9378-474e-490a-92bb-3329c5ab7825" (UID: "5e2d9378-474e-490a-92bb-3329c5ab7825"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:56:37.502101 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.502068 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e2d9378-474e-490a-92bb-3329c5ab7825-kube-api-access-c27rk" (OuterVolumeSpecName: "kube-api-access-c27rk") pod "5e2d9378-474e-490a-92bb-3329c5ab7825" (UID: "5e2d9378-474e-490a-92bb-3329c5ab7825"). InnerVolumeSpecName "kube-api-access-c27rk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:56:37.502551 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.502528 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-dshm" (OuterVolumeSpecName: "dshm") pod "5e2d9378-474e-490a-92bb-3329c5ab7825" (UID: "5e2d9378-474e-490a-92bb-3329c5ab7825"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:56:37.502628 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.502527 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e2d9378-474e-490a-92bb-3329c5ab7825-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5e2d9378-474e-490a-92bb-3329c5ab7825" (UID: "5e2d9378-474e-490a-92bb-3329c5ab7825"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:56:37.564879 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.564832 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5e2d9378-474e-490a-92bb-3329c5ab7825" (UID: "5e2d9378-474e-490a-92bb-3329c5ab7825"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:56:37.590645 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.590611 2576 generic.go:358] "Generic (PLEG): container finished" podID="5e2d9378-474e-490a-92bb-3329c5ab7825" containerID="43f7431a486357239ea98d35607ba16d3eb3660d2ca47d4e936bfc0ec311563b" exitCode=137 Apr 17 16:56:37.590819 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.590706 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" Apr 17 16:56:37.590819 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.590704 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" event={"ID":"5e2d9378-474e-490a-92bb-3329c5ab7825","Type":"ContainerDied","Data":"43f7431a486357239ea98d35607ba16d3eb3660d2ca47d4e936bfc0ec311563b"} Apr 17 16:56:37.590891 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.590817 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz" event={"ID":"5e2d9378-474e-490a-92bb-3329c5ab7825","Type":"ContainerDied","Data":"8d22ccaa43684291275ccf0f23dd02b420aef4b75a2df2537d70e82593a7ba44"} Apr 17 16:56:37.590891 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.590838 2576 scope.go:117] "RemoveContainer" containerID="43f7431a486357239ea98d35607ba16d3eb3660d2ca47d4e936bfc0ec311563b" Apr 17 16:56:37.601273 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.601143 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-dshm\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:56:37.601273 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.601181 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c27rk\" (UniqueName: \"kubernetes.io/projected/5e2d9378-474e-490a-92bb-3329c5ab7825-kube-api-access-c27rk\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:56:37.601273 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.601199 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2d9378-474e-490a-92bb-3329c5ab7825-tls-certs\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:56:37.601273 
ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.601211 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-kserve-provision-location\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:56:37.601273 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.601220 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5e2d9378-474e-490a-92bb-3329c5ab7825-home\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:56:37.610435 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.610414 2576 scope.go:117] "RemoveContainer" containerID="a590326704ade5d715a3c0daa3e8ceb820741bc0f57239be325b1bb94d104744" Apr 17 16:56:37.621144 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.621111 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"] Apr 17 16:56:37.621788 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.621765 2576 scope.go:117] "RemoveContainer" containerID="43f7431a486357239ea98d35607ba16d3eb3660d2ca47d4e936bfc0ec311563b" Apr 17 16:56:37.622105 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:56:37.622084 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43f7431a486357239ea98d35607ba16d3eb3660d2ca47d4e936bfc0ec311563b\": container with ID starting with 43f7431a486357239ea98d35607ba16d3eb3660d2ca47d4e936bfc0ec311563b not found: ID does not exist" containerID="43f7431a486357239ea98d35607ba16d3eb3660d2ca47d4e936bfc0ec311563b" Apr 17 16:56:37.622191 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.622115 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43f7431a486357239ea98d35607ba16d3eb3660d2ca47d4e936bfc0ec311563b"} err="failed to get container status 
\"43f7431a486357239ea98d35607ba16d3eb3660d2ca47d4e936bfc0ec311563b\": rpc error: code = NotFound desc = could not find container \"43f7431a486357239ea98d35607ba16d3eb3660d2ca47d4e936bfc0ec311563b\": container with ID starting with 43f7431a486357239ea98d35607ba16d3eb3660d2ca47d4e936bfc0ec311563b not found: ID does not exist" Apr 17 16:56:37.622191 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.622136 2576 scope.go:117] "RemoveContainer" containerID="a590326704ade5d715a3c0daa3e8ceb820741bc0f57239be325b1bb94d104744" Apr 17 16:56:37.622643 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:56:37.622573 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a590326704ade5d715a3c0daa3e8ceb820741bc0f57239be325b1bb94d104744\": container with ID starting with a590326704ade5d715a3c0daa3e8ceb820741bc0f57239be325b1bb94d104744 not found: ID does not exist" containerID="a590326704ade5d715a3c0daa3e8ceb820741bc0f57239be325b1bb94d104744" Apr 17 16:56:37.622643 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.622607 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a590326704ade5d715a3c0daa3e8ceb820741bc0f57239be325b1bb94d104744"} err="failed to get container status \"a590326704ade5d715a3c0daa3e8ceb820741bc0f57239be325b1bb94d104744\": rpc error: code = NotFound desc = could not find container \"a590326704ade5d715a3c0daa3e8ceb820741bc0f57239be325b1bb94d104744\": container with ID starting with a590326704ade5d715a3c0daa3e8ceb820741bc0f57239be325b1bb94d104744 not found: ID does not exist" Apr 17 16:56:37.625337 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:37.625314 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-87c596fbb-pllnz"] Apr 17 16:56:39.143490 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:39.143449 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5e2d9378-474e-490a-92bb-3329c5ab7825" path="/var/lib/kubelet/pods/5e2d9378-474e-490a-92bb-3329c5ab7825/volumes" Apr 17 16:56:45.855351 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:45.855307 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" podUID="b81d669e-69b2-4121-bfaf-bacf7bb56538" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused" Apr 17 16:56:55.855574 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:56:55.855533 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" podUID="b81d669e-69b2-4121-bfaf-bacf7bb56538" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused" Apr 17 16:57:05.855654 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:57:05.855606 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" podUID="b81d669e-69b2-4121-bfaf-bacf7bb56538" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused" Apr 17 16:57:15.855472 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:57:15.855426 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" podUID="b81d669e-69b2-4121-bfaf-bacf7bb56538" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused" Apr 17 16:57:25.855806 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:57:25.855762 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" 
podUID="b81d669e-69b2-4121-bfaf-bacf7bb56538" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused" Apr 17 16:57:35.855273 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:57:35.855232 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" podUID="b81d669e-69b2-4121-bfaf-bacf7bb56538" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused" Apr 17 16:57:45.855395 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:57:45.855346 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" podUID="b81d669e-69b2-4121-bfaf-bacf7bb56538" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused" Apr 17 16:57:55.865733 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:57:55.865700 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:57:55.873668 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:57:55.873641 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:58:07.213312 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:07.213235 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt"] Apr 17 16:58:07.213702 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:07.213531 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" 
podUID="b81d669e-69b2-4121-bfaf-bacf7bb56538" containerName="main" containerID="cri-o://b313c83a1dd1889b7a07f52f08f56c76b4d2b3f16297b85ac170880a487406c2" gracePeriod=30 Apr 17 16:58:22.369705 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:22.369679 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-lp2s8_faf927d0-b1ed-4cf1-87fe-cefd8c511440/istio-proxy/0.log" Apr 17 16:58:22.400378 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:22.400338 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-mvtwc_307ec0af-e668-4251-b9fd-8ffc80f9a0bd/istio-proxy/0.log" Apr 17 16:58:22.503918 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:22.503892 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/main/0.log" Apr 17 16:58:22.516987 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:22.516962 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/storage-initializer/0.log" Apr 17 16:58:23.440010 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:23.439980 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-lp2s8_faf927d0-b1ed-4cf1-87fe-cefd8c511440/istio-proxy/0.log" Apr 17 16:58:23.458767 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:23.458734 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-mvtwc_307ec0af-e668-4251-b9fd-8ffc80f9a0bd/istio-proxy/0.log" Apr 17 16:58:23.527627 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:23.527601 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/main/0.log" Apr 17 16:58:23.536882 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:23.536855 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/storage-initializer/0.log" Apr 17 16:58:24.496566 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:24.496529 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-lp2s8_faf927d0-b1ed-4cf1-87fe-cefd8c511440/istio-proxy/0.log" Apr 17 16:58:24.513371 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:24.513342 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-mvtwc_307ec0af-e668-4251-b9fd-8ffc80f9a0bd/istio-proxy/0.log" Apr 17 16:58:24.575709 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:24.575679 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/main/0.log" Apr 17 16:58:24.587387 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:24.587361 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/storage-initializer/0.log" Apr 17 16:58:25.545788 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:25.545762 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-lp2s8_faf927d0-b1ed-4cf1-87fe-cefd8c511440/istio-proxy/0.log" Apr 17 16:58:25.560758 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:25.560682 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-mvtwc_307ec0af-e668-4251-b9fd-8ffc80f9a0bd/istio-proxy/0.log" Apr 17 16:58:25.627081 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:25.627023 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/main/0.log" Apr 17 16:58:25.635841 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:25.635804 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/storage-initializer/0.log" Apr 17 16:58:26.552414 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:26.552385 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-lp2s8_faf927d0-b1ed-4cf1-87fe-cefd8c511440/istio-proxy/0.log" Apr 17 16:58:26.568281 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:26.568253 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-mvtwc_307ec0af-e668-4251-b9fd-8ffc80f9a0bd/istio-proxy/0.log" Apr 17 16:58:26.629411 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:26.629380 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/main/0.log" Apr 17 16:58:26.637731 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:26.637705 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/storage-initializer/0.log" Apr 17 16:58:27.595463 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:27.595430 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-lp2s8_faf927d0-b1ed-4cf1-87fe-cefd8c511440/istio-proxy/0.log" Apr 17 16:58:27.612993 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:27.612964 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-mvtwc_307ec0af-e668-4251-b9fd-8ffc80f9a0bd/istio-proxy/0.log" Apr 17 16:58:27.675515 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:27.675481 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/main/0.log" Apr 17 16:58:27.684817 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:27.684795 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/storage-initializer/0.log" Apr 17 16:58:28.618583 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:28.618516 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-lp2s8_faf927d0-b1ed-4cf1-87fe-cefd8c511440/istio-proxy/0.log" Apr 17 16:58:28.635298 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:28.635271 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-mvtwc_307ec0af-e668-4251-b9fd-8ffc80f9a0bd/istio-proxy/0.log" Apr 17 16:58:28.700455 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:28.700425 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/main/0.log" Apr 17 16:58:28.710539 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:28.710518 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/storage-initializer/0.log" Apr 17 16:58:29.628616 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:29.628582 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-lp2s8_faf927d0-b1ed-4cf1-87fe-cefd8c511440/istio-proxy/0.log" Apr 17 16:58:29.644470 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:29.644442 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-mvtwc_307ec0af-e668-4251-b9fd-8ffc80f9a0bd/istio-proxy/0.log" Apr 17 16:58:29.707608 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:29.707581 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/main/0.log" Apr 17 16:58:29.723620 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:29.723597 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/storage-initializer/0.log" Apr 17 16:58:30.641583 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:30.641551 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-lp2s8_faf927d0-b1ed-4cf1-87fe-cefd8c511440/istio-proxy/0.log" Apr 17 16:58:30.657908 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:30.657876 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-mvtwc_307ec0af-e668-4251-b9fd-8ffc80f9a0bd/istio-proxy/0.log" Apr 17 16:58:30.722906 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:30.722877 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/main/0.log" Apr 17 16:58:30.739299 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:30.739272 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/storage-initializer/0.log" Apr 17 16:58:31.674326 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:31.674289 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-lp2s8_faf927d0-b1ed-4cf1-87fe-cefd8c511440/istio-proxy/0.log" Apr 17 16:58:31.690098 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:31.690072 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-mvtwc_307ec0af-e668-4251-b9fd-8ffc80f9a0bd/istio-proxy/0.log" Apr 17 16:58:31.759776 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:31.759740 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/main/0.log" Apr 17 16:58:31.770969 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:31.770942 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/storage-initializer/0.log" Apr 17 16:58:32.708626 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:32.708594 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-lp2s8_faf927d0-b1ed-4cf1-87fe-cefd8c511440/istio-proxy/0.log" Apr 17 16:58:32.726476 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:32.726433 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-mvtwc_307ec0af-e668-4251-b9fd-8ffc80f9a0bd/istio-proxy/0.log" Apr 17 16:58:32.806390 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:32.806363 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/main/0.log" Apr 17 16:58:32.817060 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:32.817030 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/storage-initializer/0.log" Apr 17 16:58:33.818372 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:33.818339 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-lp2s8_faf927d0-b1ed-4cf1-87fe-cefd8c511440/istio-proxy/0.log" Apr 17 16:58:33.838062 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:33.838024 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-mvtwc_307ec0af-e668-4251-b9fd-8ffc80f9a0bd/istio-proxy/0.log" Apr 17 16:58:33.951922 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:33.951877 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/main/0.log" Apr 17 16:58:34.001779 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:34.001731 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/storage-initializer/0.log" Apr 17 16:58:34.977654 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:34.977624 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-lp2s8_faf927d0-b1ed-4cf1-87fe-cefd8c511440/istio-proxy/0.log" Apr 17 16:58:34.996089 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:34.996059 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-mvtwc_307ec0af-e668-4251-b9fd-8ffc80f9a0bd/istio-proxy/0.log" Apr 17 16:58:35.067445 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:35.067418 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/main/0.log" Apr 17 16:58:35.077179 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:35.077159 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/storage-initializer/0.log" Apr 17 16:58:36.014771 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:36.014741 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-lp2s8_faf927d0-b1ed-4cf1-87fe-cefd8c511440/istio-proxy/0.log" Apr 17 16:58:36.029654 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:36.029629 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-mvtwc_307ec0af-e668-4251-b9fd-8ffc80f9a0bd/istio-proxy/0.log" Apr 17 16:58:36.096042 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:36.096012 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/main/0.log" Apr 17 16:58:36.106234 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:36.106210 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt_b81d669e-69b2-4121-bfaf-bacf7bb56538/storage-initializer/0.log" Apr 17 16:58:37.099392 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.099360 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-rzkt7_0f143604-a24b-451b-8dff-83c446e9941f/istio-proxy/0.log" Apr 17 16:58:37.120513 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.120473 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7bf9cdf774-5ntqx_0e0df564-3d93-4f67-a3bd-4157590025f5/router/0.log" Apr 17 16:58:37.484708 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.484686 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:58:37.575974 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.575937 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5krg\" (UniqueName: \"kubernetes.io/projected/b81d669e-69b2-4121-bfaf-bacf7bb56538-kube-api-access-h5krg\") pod \"b81d669e-69b2-4121-bfaf-bacf7bb56538\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " Apr 17 16:58:37.575974 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.575977 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b81d669e-69b2-4121-bfaf-bacf7bb56538-tls-certs\") pod \"b81d669e-69b2-4121-bfaf-bacf7bb56538\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " Apr 17 16:58:37.576226 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.576005 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-dshm\") pod \"b81d669e-69b2-4121-bfaf-bacf7bb56538\" (UID: 
\"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " Apr 17 16:58:37.576226 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.576055 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-model-cache\") pod \"b81d669e-69b2-4121-bfaf-bacf7bb56538\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " Apr 17 16:58:37.576226 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.576085 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-kserve-provision-location\") pod \"b81d669e-69b2-4121-bfaf-bacf7bb56538\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " Apr 17 16:58:37.576226 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.576121 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-home\") pod \"b81d669e-69b2-4121-bfaf-bacf7bb56538\" (UID: \"b81d669e-69b2-4121-bfaf-bacf7bb56538\") " Apr 17 16:58:37.576756 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.576703 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-home" (OuterVolumeSpecName: "home") pod "b81d669e-69b2-4121-bfaf-bacf7bb56538" (UID: "b81d669e-69b2-4121-bfaf-bacf7bb56538"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:58:37.576756 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.576717 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-model-cache" (OuterVolumeSpecName: "model-cache") pod "b81d669e-69b2-4121-bfaf-bacf7bb56538" (UID: "b81d669e-69b2-4121-bfaf-bacf7bb56538"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:58:37.578220 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.578188 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-dshm" (OuterVolumeSpecName: "dshm") pod "b81d669e-69b2-4121-bfaf-bacf7bb56538" (UID: "b81d669e-69b2-4121-bfaf-bacf7bb56538"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:58:37.578329 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.578239 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b81d669e-69b2-4121-bfaf-bacf7bb56538-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b81d669e-69b2-4121-bfaf-bacf7bb56538" (UID: "b81d669e-69b2-4121-bfaf-bacf7bb56538"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:58:37.578329 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.578240 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b81d669e-69b2-4121-bfaf-bacf7bb56538-kube-api-access-h5krg" (OuterVolumeSpecName: "kube-api-access-h5krg") pod "b81d669e-69b2-4121-bfaf-bacf7bb56538" (UID: "b81d669e-69b2-4121-bfaf-bacf7bb56538"). InnerVolumeSpecName "kube-api-access-h5krg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:58:37.642393 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.642328 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b81d669e-69b2-4121-bfaf-bacf7bb56538" (UID: "b81d669e-69b2-4121-bfaf-bacf7bb56538"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:58:37.676955 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.676904 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-home\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:58:37.676955 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.676947 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h5krg\" (UniqueName: \"kubernetes.io/projected/b81d669e-69b2-4121-bfaf-bacf7bb56538-kube-api-access-h5krg\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:58:37.676955 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.676963 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b81d669e-69b2-4121-bfaf-bacf7bb56538-tls-certs\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:58:37.677211 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.676974 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-dshm\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:58:37.677211 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.676986 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-model-cache\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:58:37.677211 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.676997 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b81d669e-69b2-4121-bfaf-bacf7bb56538-kserve-provision-location\") on node \"ip-10-0-134-142.ec2.internal\" DevicePath \"\"" Apr 17 16:58:37.946738 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.946653 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-rzkt7_0f143604-a24b-451b-8dff-83c446e9941f/istio-proxy/0.log" Apr 17 16:58:37.963198 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:37.963162 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7bf9cdf774-5ntqx_0e0df564-3d93-4f67-a3bd-4157590025f5/router/0.log" Apr 17 16:58:38.012971 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:38.012935 2576 generic.go:358] "Generic (PLEG): container finished" podID="b81d669e-69b2-4121-bfaf-bacf7bb56538" containerID="b313c83a1dd1889b7a07f52f08f56c76b4d2b3f16297b85ac170880a487406c2" exitCode=137 Apr 17 16:58:38.013139 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:38.013011 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" Apr 17 16:58:38.013139 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:38.013020 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" event={"ID":"b81d669e-69b2-4121-bfaf-bacf7bb56538","Type":"ContainerDied","Data":"b313c83a1dd1889b7a07f52f08f56c76b4d2b3f16297b85ac170880a487406c2"} Apr 17 16:58:38.013139 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:38.013061 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt" event={"ID":"b81d669e-69b2-4121-bfaf-bacf7bb56538","Type":"ContainerDied","Data":"856e41d122f4451a0a03727eb228be74d89aaea975e9a68ab73e3e4ab7002f84"} Apr 17 16:58:38.013139 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:38.013082 2576 scope.go:117] "RemoveContainer" containerID="b313c83a1dd1889b7a07f52f08f56c76b4d2b3f16297b85ac170880a487406c2" Apr 17 16:58:38.033914 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:38.033894 2576 scope.go:117] 
"RemoveContainer" containerID="f05f8edf9362bcd306f1bcabc667ba321ca41a3f1b824e0e4e1ed2ef4ba2a571" Apr 17 16:58:38.042025 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:38.042000 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt"] Apr 17 16:58:38.044547 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:38.044531 2576 scope.go:117] "RemoveContainer" containerID="b313c83a1dd1889b7a07f52f08f56c76b4d2b3f16297b85ac170880a487406c2" Apr 17 16:58:38.044863 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:58:38.044839 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b313c83a1dd1889b7a07f52f08f56c76b4d2b3f16297b85ac170880a487406c2\": container with ID starting with b313c83a1dd1889b7a07f52f08f56c76b4d2b3f16297b85ac170880a487406c2 not found: ID does not exist" containerID="b313c83a1dd1889b7a07f52f08f56c76b4d2b3f16297b85ac170880a487406c2" Apr 17 16:58:38.044939 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:38.044877 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b313c83a1dd1889b7a07f52f08f56c76b4d2b3f16297b85ac170880a487406c2"} err="failed to get container status \"b313c83a1dd1889b7a07f52f08f56c76b4d2b3f16297b85ac170880a487406c2\": rpc error: code = NotFound desc = could not find container \"b313c83a1dd1889b7a07f52f08f56c76b4d2b3f16297b85ac170880a487406c2\": container with ID starting with b313c83a1dd1889b7a07f52f08f56c76b4d2b3f16297b85ac170880a487406c2 not found: ID does not exist" Apr 17 16:58:38.044939 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:38.044900 2576 scope.go:117] "RemoveContainer" containerID="f05f8edf9362bcd306f1bcabc667ba321ca41a3f1b824e0e4e1ed2ef4ba2a571" Apr 17 16:58:38.045168 ip-10-0-134-142 kubenswrapper[2576]: E0417 16:58:38.045152 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"f05f8edf9362bcd306f1bcabc667ba321ca41a3f1b824e0e4e1ed2ef4ba2a571\": container with ID starting with f05f8edf9362bcd306f1bcabc667ba321ca41a3f1b824e0e4e1ed2ef4ba2a571 not found: ID does not exist" containerID="f05f8edf9362bcd306f1bcabc667ba321ca41a3f1b824e0e4e1ed2ef4ba2a571" Apr 17 16:58:38.045222 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:38.045176 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f05f8edf9362bcd306f1bcabc667ba321ca41a3f1b824e0e4e1ed2ef4ba2a571"} err="failed to get container status \"f05f8edf9362bcd306f1bcabc667ba321ca41a3f1b824e0e4e1ed2ef4ba2a571\": rpc error: code = NotFound desc = could not find container \"f05f8edf9362bcd306f1bcabc667ba321ca41a3f1b824e0e4e1ed2ef4ba2a571\": container with ID starting with f05f8edf9362bcd306f1bcabc667ba321ca41a3f1b824e0e4e1ed2ef4ba2a571 not found: ID does not exist" Apr 17 16:58:38.051942 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:38.051914 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-84b99cbd86-khbzt"] Apr 17 16:58:38.836909 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:38.836865 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-wzmd5_54d5dc22-bd5b-4ebf-a65b-b25604ea654b/manager/0.log" Apr 17 16:58:39.143271 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:39.143177 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b81d669e-69b2-4121-bfaf-bacf7bb56538" path="/var/lib/kubelet/pods/b81d669e-69b2-4121-bfaf-bacf7bb56538/volumes" Apr 17 16:58:43.948266 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:43.948229 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-snx5l_ecab24eb-d60d-4495-8c68-2f114d0a30f9/global-pull-secret-syncer/0.log" Apr 17 16:58:44.053080 ip-10-0-134-142 kubenswrapper[2576]: I0417 
16:58:44.053047 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-rqqd6_72c7f3b9-a592-4131-9cee-a95f18f9eaa3/konnectivity-agent/0.log" Apr 17 16:58:44.131538 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:44.131489 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-142.ec2.internal_0ee3f40a240ec9a107cf332537877284/haproxy/0.log" Apr 17 16:58:48.205172 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:48.205136 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-wzmd5_54d5dc22-bd5b-4ebf-a65b-b25604ea654b/manager/0.log" Apr 17 16:58:49.512961 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:49.512930 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tkskl_bb8e36fe-6c83-40c6-8e4f-7a439d8fee12/kube-state-metrics/0.log" Apr 17 16:58:49.535775 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:49.535743 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tkskl_bb8e36fe-6c83-40c6-8e4f-7a439d8fee12/kube-rbac-proxy-main/0.log" Apr 17 16:58:49.563553 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:49.563479 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tkskl_bb8e36fe-6c83-40c6-8e4f-7a439d8fee12/kube-rbac-proxy-self/0.log" Apr 17 16:58:49.610150 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:49.610113 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7c9859c58c-vspmm_d7174470-3052-4a86-bf2e-7b684aca84fe/metrics-server/0.log" Apr 17 16:58:49.644911 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:49.644886 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-bj7s7_ba63d4bf-a505-45d2-8a8b-b6f74ad23868/monitoring-plugin/0.log"
Apr 17 16:58:49.842001 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:49.841931 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-t7l69_3021cc06-14c0-4d00-b02c-90d9eaecb08f/node-exporter/0.log"
Apr 17 16:58:49.866096 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:49.866070 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-t7l69_3021cc06-14c0-4d00-b02c-90d9eaecb08f/kube-rbac-proxy/0.log"
Apr 17 16:58:49.888313 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:49.888284 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-t7l69_3021cc06-14c0-4d00-b02c-90d9eaecb08f/init-textfile/0.log"
Apr 17 16:58:49.925247 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:49.925217 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-ztfmg_1c0c6bd4-7556-4400-b3cb-97a101645604/kube-rbac-proxy-main/0.log"
Apr 17 16:58:49.958518 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:49.958470 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-ztfmg_1c0c6bd4-7556-4400-b3cb-97a101645604/kube-rbac-proxy-self/0.log"
Apr 17 16:58:49.989392 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:49.989365 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-ztfmg_1c0c6bd4-7556-4400-b3cb-97a101645604/openshift-state-metrics/0.log"
Apr 17 16:58:50.032435 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:50.032401 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a6bdf597-226a-42e5-a554-67152d8c8f03/prometheus/0.log"
Apr 17 16:58:50.051935 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:50.051907 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a6bdf597-226a-42e5-a554-67152d8c8f03/config-reloader/0.log"
Apr 17 16:58:50.077090 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:50.077060 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a6bdf597-226a-42e5-a554-67152d8c8f03/thanos-sidecar/0.log"
Apr 17 16:58:50.096994 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:50.096927 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a6bdf597-226a-42e5-a554-67152d8c8f03/kube-rbac-proxy-web/0.log"
Apr 17 16:58:50.127145 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:50.127117 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a6bdf597-226a-42e5-a554-67152d8c8f03/kube-rbac-proxy/0.log"
Apr 17 16:58:50.149669 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:50.149642 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a6bdf597-226a-42e5-a554-67152d8c8f03/kube-rbac-proxy-thanos/0.log"
Apr 17 16:58:50.178868 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:50.178844 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a6bdf597-226a-42e5-a554-67152d8c8f03/init-config-reloader/0.log"
Apr 17 16:58:50.292118 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:50.292089 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-84kts_7087f126-5e4b-4d2e-846b-cdbfab7f37b6/prometheus-operator-admission-webhook/0.log"
Apr 17 16:58:50.430543 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:50.430520 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7d7d8b6597-4hx86_75373c0d-030c-4f0d-beeb-fc38b539c454/thanos-query/0.log"
Apr 17 16:58:50.461023 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:50.460986 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7d7d8b6597-4hx86_75373c0d-030c-4f0d-beeb-fc38b539c454/kube-rbac-proxy-web/0.log"
Apr 17 16:58:50.484733 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:50.484700 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7d7d8b6597-4hx86_75373c0d-030c-4f0d-beeb-fc38b539c454/kube-rbac-proxy/0.log"
Apr 17 16:58:50.508460 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:50.508430 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7d7d8b6597-4hx86_75373c0d-030c-4f0d-beeb-fc38b539c454/prom-label-proxy/0.log"
Apr 17 16:58:50.559604 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:50.559574 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7d7d8b6597-4hx86_75373c0d-030c-4f0d-beeb-fc38b539c454/kube-rbac-proxy-rules/0.log"
Apr 17 16:58:50.584712 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:50.584684 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7d7d8b6597-4hx86_75373c0d-030c-4f0d-beeb-fc38b539c454/kube-rbac-proxy-metrics/0.log"
Apr 17 16:58:51.741370 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:51.741329 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-s2pgl_a1196930-a0cd-4cbf-9129-12a1bd8f7941/networking-console-plugin/0.log"
Apr 17 16:58:52.680051 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.680014 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"]
Apr 17 16:58:52.680427 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.680415 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e2d9378-474e-490a-92bb-3329c5ab7825" containerName="storage-initializer"
Apr 17 16:58:52.680473 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.680429 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e2d9378-474e-490a-92bb-3329c5ab7825" containerName="storage-initializer"
Apr 17 16:58:52.680473 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.680445 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b81d669e-69b2-4121-bfaf-bacf7bb56538" containerName="main"
Apr 17 16:58:52.680473 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.680452 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81d669e-69b2-4121-bfaf-bacf7bb56538" containerName="main"
Apr 17 16:58:52.680473 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.680460 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b81d669e-69b2-4121-bfaf-bacf7bb56538" containerName="storage-initializer"
Apr 17 16:58:52.680473 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.680465 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81d669e-69b2-4121-bfaf-bacf7bb56538" containerName="storage-initializer"
Apr 17 16:58:52.680473 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.680472 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e2d9378-474e-490a-92bb-3329c5ab7825" containerName="main"
Apr 17 16:58:52.680693 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.680477 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e2d9378-474e-490a-92bb-3329c5ab7825" containerName="main"
Apr 17 16:58:52.680693 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.680545 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b81d669e-69b2-4121-bfaf-bacf7bb56538" containerName="main"
Apr 17 16:58:52.680693 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.680555 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e2d9378-474e-490a-92bb-3329c5ab7825" containerName="main"
Apr 17 16:58:52.683823 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.683796 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"
Apr 17 16:58:52.686238 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.686214 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2whr7\"/\"default-dockercfg-8vtfj\""
Apr 17 16:58:52.686394 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.686377 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2whr7\"/\"openshift-service-ca.crt\""
Apr 17 16:58:52.687214 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.687200 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2whr7\"/\"kube-root-ca.crt\""
Apr 17 16:58:52.692417 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.692393 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"]
Apr 17 16:58:52.715696 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.715665 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f36041c9-58ea-47c8-aac7-10bdd63e6e47-podres\") pod \"perf-node-gather-daemonset-tsm22\" (UID: \"f36041c9-58ea-47c8-aac7-10bdd63e6e47\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"
Apr 17 16:58:52.715866 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.715704 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f36041c9-58ea-47c8-aac7-10bdd63e6e47-lib-modules\") pod \"perf-node-gather-daemonset-tsm22\" (UID: \"f36041c9-58ea-47c8-aac7-10bdd63e6e47\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"
Apr 17 16:58:52.715866 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.715838 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f36041c9-58ea-47c8-aac7-10bdd63e6e47-proc\") pod \"perf-node-gather-daemonset-tsm22\" (UID: \"f36041c9-58ea-47c8-aac7-10bdd63e6e47\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"
Apr 17 16:58:52.716009 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.715907 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f36041c9-58ea-47c8-aac7-10bdd63e6e47-sys\") pod \"perf-node-gather-daemonset-tsm22\" (UID: \"f36041c9-58ea-47c8-aac7-10bdd63e6e47\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"
Apr 17 16:58:52.716009 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.715962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c4mt\" (UniqueName: \"kubernetes.io/projected/f36041c9-58ea-47c8-aac7-10bdd63e6e47-kube-api-access-6c4mt\") pod \"perf-node-gather-daemonset-tsm22\" (UID: \"f36041c9-58ea-47c8-aac7-10bdd63e6e47\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"
Apr 17 16:58:52.816744 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.816703 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6c4mt\" (UniqueName: \"kubernetes.io/projected/f36041c9-58ea-47c8-aac7-10bdd63e6e47-kube-api-access-6c4mt\") pod \"perf-node-gather-daemonset-tsm22\" (UID: \"f36041c9-58ea-47c8-aac7-10bdd63e6e47\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"
Apr 17 16:58:52.817195 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.816758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f36041c9-58ea-47c8-aac7-10bdd63e6e47-podres\") pod \"perf-node-gather-daemonset-tsm22\" (UID: \"f36041c9-58ea-47c8-aac7-10bdd63e6e47\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"
Apr 17 16:58:52.817195 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.816785 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f36041c9-58ea-47c8-aac7-10bdd63e6e47-lib-modules\") pod \"perf-node-gather-daemonset-tsm22\" (UID: \"f36041c9-58ea-47c8-aac7-10bdd63e6e47\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"
Apr 17 16:58:52.817195 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.816832 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f36041c9-58ea-47c8-aac7-10bdd63e6e47-proc\") pod \"perf-node-gather-daemonset-tsm22\" (UID: \"f36041c9-58ea-47c8-aac7-10bdd63e6e47\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"
Apr 17 16:58:52.817195 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.816862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f36041c9-58ea-47c8-aac7-10bdd63e6e47-sys\") pod \"perf-node-gather-daemonset-tsm22\" (UID: \"f36041c9-58ea-47c8-aac7-10bdd63e6e47\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"
Apr 17 16:58:52.817195 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.816893 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f36041c9-58ea-47c8-aac7-10bdd63e6e47-podres\") pod \"perf-node-gather-daemonset-tsm22\" (UID: \"f36041c9-58ea-47c8-aac7-10bdd63e6e47\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"
Apr 17 16:58:52.817195 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.816922 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f36041c9-58ea-47c8-aac7-10bdd63e6e47-sys\") pod \"perf-node-gather-daemonset-tsm22\" (UID: \"f36041c9-58ea-47c8-aac7-10bdd63e6e47\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"
Apr 17 16:58:52.817195 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.816942 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f36041c9-58ea-47c8-aac7-10bdd63e6e47-proc\") pod \"perf-node-gather-daemonset-tsm22\" (UID: \"f36041c9-58ea-47c8-aac7-10bdd63e6e47\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"
Apr 17 16:58:52.817195 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.816975 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f36041c9-58ea-47c8-aac7-10bdd63e6e47-lib-modules\") pod \"perf-node-gather-daemonset-tsm22\" (UID: \"f36041c9-58ea-47c8-aac7-10bdd63e6e47\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"
Apr 17 16:58:52.826082 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.826051 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c4mt\" (UniqueName: \"kubernetes.io/projected/f36041c9-58ea-47c8-aac7-10bdd63e6e47-kube-api-access-6c4mt\") pod \"perf-node-gather-daemonset-tsm22\" (UID: \"f36041c9-58ea-47c8-aac7-10bdd63e6e47\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"
Apr 17 16:58:52.995702 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:52.995599 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"
Apr 17 16:58:53.123188 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:53.123134 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"]
Apr 17 16:58:53.126027 ip-10-0-134-142 kubenswrapper[2576]: W0417 16:58:53.125993 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf36041c9_58ea_47c8_aac7_10bdd63e6e47.slice/crio-aa4e2388f343faf115c04e65bcb2d0ffb4a062f29e0762d2021abaa125090810 WatchSource:0}: Error finding container aa4e2388f343faf115c04e65bcb2d0ffb4a062f29e0762d2021abaa125090810: Status 404 returned error can't find the container with id aa4e2388f343faf115c04e65bcb2d0ffb4a062f29e0762d2021abaa125090810
Apr 17 16:58:54.068354 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:54.068315 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22" event={"ID":"f36041c9-58ea-47c8-aac7-10bdd63e6e47","Type":"ContainerStarted","Data":"01ea70bfc26105d8dd953d01b68ae891ef03e66f3885e9465aec32a162db391c"}
Apr 17 16:58:54.068354 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:54.068357 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22" event={"ID":"f36041c9-58ea-47c8-aac7-10bdd63e6e47","Type":"ContainerStarted","Data":"aa4e2388f343faf115c04e65bcb2d0ffb4a062f29e0762d2021abaa125090810"}
Apr 17 16:58:54.068857 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:54.068388 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"
Apr 17 16:58:54.086339 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:54.086309 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9vg62_13a6aa7e-beeb-4695-9196-0c33a34de454/dns/0.log"
Apr 17 16:58:54.087197 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:54.087157 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22" podStartSLOduration=2.087144717 podStartE2EDuration="2.087144717s" podCreationTimestamp="2026-04-17 16:58:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:58:54.084963936 +0000 UTC m=+1653.550146799" watchObservedRunningTime="2026-04-17 16:58:54.087144717 +0000 UTC m=+1653.552327578"
Apr 17 16:58:54.118163 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:54.118133 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9vg62_13a6aa7e-beeb-4695-9196-0c33a34de454/kube-rbac-proxy/0.log"
Apr 17 16:58:54.270815 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:54.270791 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2wn98_03ed97e0-9a98-4b06-9c36-752ba631bdc9/dns-node-resolver/0.log"
Apr 17 16:58:54.810277 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:54.810243 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-d78769859-djn2h_d1f4421a-5f1c-468c-9392-a80b17d865af/registry/0.log"
Apr 17 16:58:54.831300 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:54.831273 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8mrzt_4f000c2d-d989-4f8a-8182-b5104c3a3de6/node-ca/0.log"
Apr 17 16:58:55.763286 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:55.763257 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-rzkt7_0f143604-a24b-451b-8dff-83c446e9941f/istio-proxy/0.log"
Apr 17 16:58:55.789276 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:55.789250 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7bf9cdf774-5ntqx_0e0df564-3d93-4f67-a3bd-4157590025f5/router/0.log"
Apr 17 16:58:56.312330 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:56.312303 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-k5nlz_deee814d-1106-4662-bd60-3fc176424f4c/serve-healthcheck-canary/0.log"
Apr 17 16:58:56.928941 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:56.928912 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qqp2w_3890b08f-1a41-43ca-8a22-786c4ea642f2/kube-rbac-proxy/0.log"
Apr 17 16:58:56.950966 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:56.950939 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qqp2w_3890b08f-1a41-43ca-8a22-786c4ea642f2/exporter/0.log"
Apr 17 16:58:56.975525 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:56.975483 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qqp2w_3890b08f-1a41-43ca-8a22-786c4ea642f2/extractor/0.log"
Apr 17 16:58:59.551768 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:58:59.551739 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5f68f6fcb9-kvwss_fd251306-7990-42ff-a8aa-3d288f473b17/manager/0.log"
Apr 17 16:59:00.082328 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:00.082299 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-tsm22"
Apr 17 16:59:00.194042 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:00.194010 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-85bb65f8c4-pxv2p_1e07626f-ef69-4fac-934a-f33712143ec4/manager/0.log"
Apr 17 16:59:00.458321 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:00.458289 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-jgrx2_f9d8c8af-6615-4eb8-9536-0e74a266f1f1/manager/0.log"
Apr 17 16:59:00.508374 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:00.508339 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-fgpbh_1b018a95-ab05-4f83-af69-a729e521004f/seaweedfs/0.log"
Apr 17 16:59:05.640569 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:05.640527 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-ctq4m_90c4edf8-ec15-4ea5-b1c6-ede355520321/kube-storage-version-migrator-operator/1.log"
Apr 17 16:59:05.641429 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:05.641414 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-ctq4m_90c4edf8-ec15-4ea5-b1c6-ede355520321/kube-storage-version-migrator-operator/0.log"
Apr 17 16:59:06.740642 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:06.740602 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-52zsq_bd3229a9-ef02-4e0e-aa80-b65af9672fee/kube-multus/0.log"
Apr 17 16:59:06.943602 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:06.943573 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jkkg4_72324db7-6720-4fe7-a0fc-23dee216601e/kube-multus-additional-cni-plugins/0.log"
Apr 17 16:59:06.966693 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:06.966666 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jkkg4_72324db7-6720-4fe7-a0fc-23dee216601e/egress-router-binary-copy/0.log"
Apr 17 16:59:06.990122 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:06.990094 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jkkg4_72324db7-6720-4fe7-a0fc-23dee216601e/cni-plugins/0.log"
Apr 17 16:59:07.012998 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:07.012919 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jkkg4_72324db7-6720-4fe7-a0fc-23dee216601e/bond-cni-plugin/0.log"
Apr 17 16:59:07.038919 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:07.038891 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jkkg4_72324db7-6720-4fe7-a0fc-23dee216601e/routeoverride-cni/0.log"
Apr 17 16:59:07.062557 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:07.062491 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jkkg4_72324db7-6720-4fe7-a0fc-23dee216601e/whereabouts-cni-bincopy/0.log"
Apr 17 16:59:07.085479 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:07.085451 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jkkg4_72324db7-6720-4fe7-a0fc-23dee216601e/whereabouts-cni/0.log"
Apr 17 16:59:07.362874 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:07.362795 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jn9vc_381620fe-22c1-43ed-99de-f61c1b4886ca/network-metrics-daemon/0.log"
Apr 17 16:59:07.397180 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:07.397154 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jn9vc_381620fe-22c1-43ed-99de-f61c1b4886ca/kube-rbac-proxy/0.log"
Apr 17 16:59:08.573170 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:08.573143 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/ovn-controller/0.log"
Apr 17 16:59:08.590992 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:08.590968 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/ovn-acl-logging/0.log"
Apr 17 16:59:08.598510 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:08.598480 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/ovn-acl-logging/1.log"
Apr 17 16:59:08.618686 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:08.618659 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/kube-rbac-proxy-node/0.log"
Apr 17 16:59:08.646766 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:08.646740 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 16:59:08.668973 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:08.668946 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/northd/0.log"
Apr 17 16:59:08.691491 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:08.691467 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/nbdb/0.log"
Apr 17 16:59:08.717683 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:08.717658 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/sbdb/0.log"
Apr 17 16:59:08.819015 ip-10-0-134-142 kubenswrapper[2576]: I0417 16:59:08.818986 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-twl5m_21d6e76d-ca83-45a0-b330-7eced4280d5f/ovnkube-controller/0.log"