Apr 23 16:35:16.796976 ip-10-0-136-27 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 16:35:17.246508 ip-10-0-136-27 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:17.246508 ip-10-0-136-27 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 16:35:17.246508 ip-10-0-136-27 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:17.246508 ip-10-0-136-27 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 16:35:17.246508 ip-10-0-136-27 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:17.248933 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.248851    2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 16:35:17.253409 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253394    2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:17.253409 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253410    2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:17.253484 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253414    2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:17.253484 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253417    2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:17.253484 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253420    2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:17.253484 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253424    2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:17.253484 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253427    2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:17.253484 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253430    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:17.253484 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253433    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:17.253484 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253436    2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:17.253484 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253439    2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:17.253484 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253442    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:17.253484 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253445    2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:17.253484 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253448    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:17.253484 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253452    2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:17.253484 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253463    2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:17.253484 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253466    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:17.253484 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253469    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:17.253484 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253472    2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:17.253484 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253475    2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:17.253484 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253477    2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:17.253484 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253480    2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:17.253969 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253483    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:17.253969 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253486    2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:17.253969 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253489    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:17.253969 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253491    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:17.253969 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253494    2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:17.253969 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253497    2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:17.253969 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253499    2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:17.253969 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253502    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:17.253969 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253505    2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:17.253969 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253507    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:17.253969 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253510    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:17.253969 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253512    2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:17.253969 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253515    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:17.253969 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253517    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:17.253969 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253520    2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:17.253969 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253522    2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:17.253969 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253524    2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:17.253969 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253527    2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:17.253969 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253529    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:17.253969 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253532    2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:17.254463 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253534    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:17.254463 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253536    2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:17.254463 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253539    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:17.254463 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253542    2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:17.254463 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253545    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:17.254463 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253548    2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:17.254463 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253550    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:17.254463 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253553    2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:17.254463 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253556    2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:17.254463 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253559    2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:17.254463 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253562    2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:17.254463 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253565    2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:17.254463 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253567    2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:17.254463 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253570    2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:17.254463 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253573    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:17.254463 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253575    2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:17.254463 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253578    2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:17.254463 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253581    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:17.254463 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253583    2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:17.254463 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253585    2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:17.255045 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253588    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:17.255045 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253591    2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:17.255045 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253593    2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:17.255045 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253597    2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:17.255045 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253601    2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:17.255045 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253604    2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:17.255045 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253606    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:17.255045 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253609    2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:17.255045 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253611    2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:17.255045 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253614    2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:17.255045 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253616    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:17.255045 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253619    2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:17.255045 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253621    2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:17.255045 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253624    2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:17.255045 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253627    2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:17.255045 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253629    2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:17.255045 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253633    2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:17.255045 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253635    2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:17.255045 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253638    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:17.255517 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253640    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:17.255517 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253643    2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:17.255517 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253646    2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:17.255517 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253648    2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:17.255517 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.253651    2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:17.255517 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254235    2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:17.255517 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254242    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:17.255517 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254245    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:17.255517 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254248    2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:17.255517 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254251    2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:17.255517 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254254    2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:17.255517 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254268    2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:17.255517 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254271    2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:17.255517 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254274    2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:17.255517 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254276    2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:17.255517 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254279    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:17.255517 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254281    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:17.255517 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254284    2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:17.255517 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254287    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:17.255517 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254289    2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:17.256018 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254292    2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:17.256018 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254294    2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:17.256018 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254297    2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:17.256018 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254299    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:17.256018 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254302    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:17.256018 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254305    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:17.256018 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254308    2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:17.256018 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254310    2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:17.256018 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254313    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:17.256018 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254315    2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:17.256018 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254318    2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:17.256018 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254320    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:17.256018 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254323    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:17.256018 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254326    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:17.256018 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254329    2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:17.256018 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254332    2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:17.256018 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254334    2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:17.256018 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254337    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:17.256018 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254340    2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:17.256520 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254343    2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:17.256520 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254346    2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:17.256520 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254348    2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:17.256520 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254351    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:17.256520 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254353    2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:17.256520 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254356    2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:17.256520 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254358    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:17.256520 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254360    2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:17.256520 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254364    2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:17.256520 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254367    2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:17.256520 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254370    2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:17.256520 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254372    2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:17.256520 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254375    2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:17.256520 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254377    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:17.256520 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254379    2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:17.256520 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254382    2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:17.256520 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254384    2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:17.256520 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254386    2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:17.256520 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254389    2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:17.256520 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254392    2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:17.257030 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254395    2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:17.257030 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254399    2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:17.257030 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254403    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:17.257030 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254406    2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:17.257030 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254409    2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:17.257030 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254412    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:17.257030 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254415    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:17.257030 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254417    2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:17.257030 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254420    2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:17.257030 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254423    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:17.257030 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254425    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:17.257030 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254428    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:17.257030 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254430    2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:17.257030 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254433    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:17.257030 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254435    2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:17.257030 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254438    2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:17.257030 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254440    2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:17.257030 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254443    2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:17.257030 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254445    2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254447    2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254450    2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254452    2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254455    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254458    2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254460    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254463    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254465    2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254468    2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254471    2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254473    2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254476    2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.254478    2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255168    2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255185    2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255191    2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255196    2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255201    2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255204    2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255209    2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255214    2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 16:35:17.257499 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255217    2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255220    2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255224    2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255227    2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255230    2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255233    2571 flags.go:64] FLAG: --cgroup-root=""
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255236    2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255239    2571 flags.go:64] FLAG: --client-ca-file=""
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255242    2571 flags.go:64] FLAG: --cloud-config=""
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255245    2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255248    2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255253    2571 flags.go:64] FLAG: --cluster-domain=""
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255256    2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255259    2571 flags.go:64] FLAG: --config-dir=""
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255262    2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255265    2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255269    2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255272    2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255275    2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255278    2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255281    2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255284    2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255287    2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255290    2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255293    2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 16:35:17.258046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255298    2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255301    2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255304    2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255306    2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255309    2571 flags.go:64] FLAG: --enable-server="true"
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255312    2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255317    2571 flags.go:64] FLAG: --event-burst="100"
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255320    2571 flags.go:64] FLAG: --event-qps="50"
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255323    2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255326    2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255329    2571 flags.go:64] FLAG: --eviction-hard=""
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255332    2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255335    2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255338    2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255341    2571 flags.go:64] FLAG: --eviction-soft=""
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255344    2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255347    2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255350    2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255353    2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255356    2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255359    2571 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255361    2571 flags.go:64] FLAG: --feature-gates=""
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255365    2571 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255368    2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255371    2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 16:35:17.258646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255374    2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255377    2571 flags.go:64] FLAG: --healthz-port="10248"
Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255380    2571 flags.go:64] FLAG: --help="false"
Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423
16:35:17.255383 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-136-27.ec2.internal" Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255386 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255391 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255394 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255398 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255401 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255404 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255407 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255410 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255413 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255416 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255419 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255422 2571 flags.go:64] FLAG: --kube-reserved="" Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255425 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 
16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255427 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255430 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255433 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255436 2571 flags.go:64] FLAG: --lock-file="" Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255439 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255441 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255444 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 16:35:17.259261 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255450 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255453 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255456 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255459 2571 flags.go:64] FLAG: --logging-format="text" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255461 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255464 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255467 2571 flags.go:64] FLAG: --manifest-url="" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255470 2571 
flags.go:64] FLAG: --manifest-url-header="" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255475 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255478 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255482 2571 flags.go:64] FLAG: --max-pods="110" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255485 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255488 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255492 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255495 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255498 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255501 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255504 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255513 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255516 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255521 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255524 2571 flags.go:64] FLAG: --pod-cidr="" Apr 23 16:35:17.259875 
ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255527 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 16:35:17.259875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255532 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255535 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255538 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255541 2571 flags.go:64] FLAG: --port="10250" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255544 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255547 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0000d1e386513f0e7" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255550 2571 flags.go:64] FLAG: --qos-reserved="" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255552 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255555 2571 flags.go:64] FLAG: --register-node="true" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255558 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255561 2571 flags.go:64] FLAG: --register-with-taints="" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255564 2571 flags.go:64] FLAG: --registry-burst="10" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255567 2571 flags.go:64] FLAG: --registry-qps="5" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 
16:35:17.255570 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255572 2571 flags.go:64] FLAG: --reserved-memory="" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255576 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255579 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255582 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255585 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255588 2571 flags.go:64] FLAG: --runonce="false" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255591 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255594 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255601 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255604 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255607 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255610 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 16:35:17.260440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255613 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255616 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 
16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255619 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255622 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255626 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255629 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255632 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255635 2571 flags.go:64] FLAG: --system-cgroups="" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255637 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255643 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255645 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255648 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255653 2571 flags.go:64] FLAG: --tls-min-version="" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255655 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255658 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255661 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255664 2571 flags.go:64] FLAG: 
--topology-manager-scope="container" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255667 2571 flags.go:64] FLAG: --v="2" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255671 2571 flags.go:64] FLAG: --version="false" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255675 2571 flags.go:64] FLAG: --vmodule="" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255679 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.255683 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255802 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255806 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 16:35:17.261102 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255808 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 16:35:17.261746 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255811 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 16:35:17.261746 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255814 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 16:35:17.261746 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255817 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 16:35:17.261746 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255821 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 16:35:17.261746 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255824 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 16:35:17.261746 ip-10-0-136-27 kubenswrapper[2571]: 
W0423 16:35:17.255827 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 16:35:17.261746 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255829 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 16:35:17.261746 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255832 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 16:35:17.261746 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255834 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 16:35:17.261746 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255841 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 16:35:17.261746 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255843 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 16:35:17.261746 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255848 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 16:35:17.261746 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255850 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 16:35:17.261746 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255853 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 16:35:17.261746 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255855 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 16:35:17.261746 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255858 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 16:35:17.261746 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255860 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 16:35:17.261746 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255862 2571 feature_gate.go:328] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Apr 23 16:35:17.261746 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255865 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 16:35:17.261746 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255867 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 16:35:17.262306 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255870 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 16:35:17.262306 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255872 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 16:35:17.262306 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255874 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 16:35:17.262306 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255877 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 16:35:17.262306 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255879 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 16:35:17.262306 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255881 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 16:35:17.262306 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255884 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 16:35:17.262306 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255887 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 16:35:17.262306 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255889 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 16:35:17.262306 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255891 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 16:35:17.262306 ip-10-0-136-27 
kubenswrapper[2571]: W0423 16:35:17.255894 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 16:35:17.262306 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255896 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 16:35:17.262306 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255899 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 16:35:17.262306 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255901 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 16:35:17.262306 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255903 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 16:35:17.262306 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255906 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 16:35:17.262306 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255909 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 16:35:17.262306 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255911 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 16:35:17.262306 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255914 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 16:35:17.262306 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255916 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 16:35:17.262887 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255919 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 16:35:17.262887 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255921 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 16:35:17.262887 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255929 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 16:35:17.262887 
ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255933 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 16:35:17.262887 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255935 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 16:35:17.262887 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255938 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 16:35:17.262887 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255941 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 16:35:17.262887 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255944 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 16:35:17.262887 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255946 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 16:35:17.262887 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255949 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 16:35:17.262887 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255952 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 16:35:17.262887 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255954 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 16:35:17.262887 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255956 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 16:35:17.262887 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255959 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 16:35:17.262887 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255961 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 16:35:17.262887 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255964 2571 feature_gate.go:328] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Apr 23 16:35:17.262887 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255966 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 16:35:17.262887 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255968 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 16:35:17.262887 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255971 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 16:35:17.262887 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255973 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 16:35:17.263431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255976 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 23 16:35:17.263431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255978 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 16:35:17.263431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255981 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 16:35:17.263431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255984 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 16:35:17.263431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255986 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 16:35:17.263431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255990 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 16:35:17.263431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255993 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 16:35:17.263431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255997 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 16:35:17.263431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.255999 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 16:35:17.263431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.256002 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 16:35:17.263431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.256004 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 16:35:17.263431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.256006 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 16:35:17.263431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.256014 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 16:35:17.263431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.256017 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 16:35:17.263431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.256019 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 16:35:17.263431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.256028 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 16:35:17.263431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.256031 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 16:35:17.263431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.256034 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 16:35:17.263431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.256038 2571 feature_gate.go:351] Setting GA feature gate 
ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 16:35:17.263979 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.256041 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 16:35:17.263979 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.256044 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 16:35:17.263979 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.256047 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 16:35:17.263979 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.256050 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 16:35:17.263979 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.256785 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 16:35:17.263979 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.263216 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 16:35:17.263979 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.263313 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 16:35:17.263979 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263361 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 16:35:17.263979 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263367 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 16:35:17.263979 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263370 2571 
feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 16:35:17.263979 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263373 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 16:35:17.263979 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263376 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 16:35:17.263979 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263379 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 16:35:17.263979 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263382 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 16:35:17.263979 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263385 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 16:35:17.263979 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263387 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 16:35:17.264431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263390 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 16:35:17.264431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263393 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 16:35:17.264431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263396 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 16:35:17.264431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263398 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 16:35:17.264431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263400 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 16:35:17.264431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263403 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 16:35:17.264431 ip-10-0-136-27 
kubenswrapper[2571]: W0423 16:35:17.263406 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 16:35:17.264431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263409 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 16:35:17.264431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263411 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 16:35:17.264431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263414 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 16:35:17.264431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263416 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 16:35:17.264431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263419 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 16:35:17.264431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263422 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 16:35:17.264431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263424 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 16:35:17.264431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263427 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 16:35:17.264431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263430 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 16:35:17.264431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263432 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 16:35:17.264431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263435 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 16:35:17.264431 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263437 2571 feature_gate.go:328] unrecognized feature gate: 
VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 16:35:17.264968 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263440 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 16:35:17.264968 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263442 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 16:35:17.264968 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263445 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 16:35:17.264968 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263450 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 16:35:17.264968 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263453 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 16:35:17.264968 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263455 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 16:35:17.264968 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263458 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 16:35:17.264968 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263461 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 16:35:17.264968 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263464 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 16:35:17.264968 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263466 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 16:35:17.264968 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263469 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 16:35:17.264968 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263471 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 16:35:17.264968 ip-10-0-136-27 kubenswrapper[2571]: 
W0423 16:35:17.263473 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 16:35:17.264968 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263477 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 16:35:17.264968 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263482 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 16:35:17.264968 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263487 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 16:35:17.264968 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263490 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 16:35:17.264968 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263493 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 16:35:17.264968 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263496 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 16:35:17.265512 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263499 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 16:35:17.265512 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263502 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 16:35:17.265512 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263504 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 16:35:17.265512 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263507 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 16:35:17.265512 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263509 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 16:35:17.265512 ip-10-0-136-27 
kubenswrapper[2571]: W0423 16:35:17.263512 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 16:35:17.265512 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263514 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 16:35:17.265512 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263517 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 16:35:17.265512 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263519 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 16:35:17.265512 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263522 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 16:35:17.265512 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263524 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 16:35:17.265512 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263527 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 16:35:17.265512 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263529 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 16:35:17.265512 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263532 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 16:35:17.265512 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263535 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 16:35:17.265512 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263537 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 16:35:17.265512 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263540 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 16:35:17.265512 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263544 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 
16:35:17.265512 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263547 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 16:35:17.265512 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263549 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 16:35:17.266037 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263552 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 16:35:17.266037 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263555 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 16:35:17.266037 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263557 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 16:35:17.266037 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263560 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 16:35:17.266037 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263562 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 16:35:17.266037 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263565 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 16:35:17.266037 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263568 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 16:35:17.266037 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263570 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 16:35:17.266037 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263573 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 16:35:17.266037 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263575 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 16:35:17.266037 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263578 2571 feature_gate.go:328] 
unrecognized feature gate: NutanixMultiSubnets Apr 23 16:35:17.266037 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263580 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 16:35:17.266037 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263583 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 16:35:17.266037 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263585 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 16:35:17.266037 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263588 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 16:35:17.266037 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263590 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 23 16:35:17.266037 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263593 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 16:35:17.266037 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263595 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 16:35:17.266037 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263597 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 16:35:17.266525 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.263602 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 16:35:17.266525 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263723 2571 feature_gate.go:328] unrecognized feature gate: 
VSphereMixedNodeEnv Apr 23 16:35:17.266525 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263729 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 16:35:17.266525 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263732 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 16:35:17.266525 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263735 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 16:35:17.266525 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263738 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 16:35:17.266525 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263741 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 16:35:17.266525 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263743 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 16:35:17.266525 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263746 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 16:35:17.266525 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263748 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 16:35:17.266525 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263752 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 16:35:17.266525 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263755 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 16:35:17.266525 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263758 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 16:35:17.266525 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263760 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 16:35:17.266525 ip-10-0-136-27 kubenswrapper[2571]: W0423 
16:35:17.263763 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 16:35:17.266912 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263765 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 16:35:17.266912 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263768 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 16:35:17.266912 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263770 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 16:35:17.266912 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263773 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 16:35:17.266912 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263775 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 16:35:17.266912 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263778 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 16:35:17.266912 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263780 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 16:35:17.266912 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263783 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 16:35:17.266912 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263785 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 16:35:17.266912 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263788 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 16:35:17.266912 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263790 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 16:35:17.266912 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263794 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 16:35:17.266912 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263797 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 16:35:17.266912 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263800 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 16:35:17.266912 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263803 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 16:35:17.266912 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263805 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 16:35:17.266912 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263807 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 16:35:17.266912 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263810 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 16:35:17.266912 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263812 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 16:35:17.267388 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263814 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 16:35:17.267388 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263817 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 16:35:17.267388 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263819 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 16:35:17.267388 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263822 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 16:35:17.267388 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263824 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 16:35:17.267388 ip-10-0-136-27 kubenswrapper[2571]: W0423 
16:35:17.263826 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 16:35:17.267388 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263829 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 16:35:17.267388 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263831 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 16:35:17.267388 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263834 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 16:35:17.267388 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263837 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 16:35:17.267388 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263840 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 16:35:17.267388 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263842 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 16:35:17.267388 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263845 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 16:35:17.267388 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263847 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 16:35:17.267388 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263850 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 16:35:17.267388 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263852 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 16:35:17.267388 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263854 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 16:35:17.267388 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263857 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 
16:35:17.267388 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263859 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 16:35:17.267862 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263861 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 16:35:17.267862 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263863 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 16:35:17.267862 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263866 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 16:35:17.267862 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263868 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 16:35:17.267862 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263871 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 16:35:17.267862 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263873 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 16:35:17.267862 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263875 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 16:35:17.267862 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263878 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 16:35:17.267862 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263880 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 16:35:17.267862 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263883 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 16:35:17.267862 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263885 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 16:35:17.267862 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263887 2571 feature_gate.go:328] 
unrecognized feature gate: NewOLM Apr 23 16:35:17.267862 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263890 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 16:35:17.267862 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263892 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 16:35:17.267862 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263895 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 16:35:17.267862 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263897 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 16:35:17.267862 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263899 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 16:35:17.267862 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263902 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 16:35:17.267862 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263904 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 23 16:35:17.267862 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263906 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 16:35:17.268355 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263909 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 16:35:17.268355 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263912 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 16:35:17.268355 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263914 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 16:35:17.268355 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263917 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 16:35:17.268355 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263919 2571 feature_gate.go:328] 
unrecognized feature gate: HighlyAvailableArbiter Apr 23 16:35:17.268355 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263922 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 16:35:17.268355 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263924 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 16:35:17.268355 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263926 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 16:35:17.268355 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263929 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 16:35:17.268355 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263931 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 16:35:17.268355 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263933 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 16:35:17.268355 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263936 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 16:35:17.268355 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263940 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 16:35:17.268355 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:17.263943 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 16:35:17.268355 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.263947 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 16:35:17.268782 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.264742 2571 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 16:35:17.268782 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.266826 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 16:35:17.268782 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.268516 2571 server.go:1019] "Starting client certificate rotation" Apr 23 16:35:17.268782 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.268610 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 16:35:17.268782 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.268655 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 16:35:17.294014 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.293995 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 16:35:17.300914 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.300891 2571 dynamic_cafile_content.go:161] "Starting controller" 
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 16:35:17.318617 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.318596 2571 log.go:25] "Validated CRI v1 runtime API" Apr 23 16:35:17.324570 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.324552 2571 log.go:25] "Validated CRI v1 image API" Apr 23 16:35:17.325710 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.325677 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 23 16:35:17.326005 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.325988 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 16:35:17.328155 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.328138 2571 fs.go:135] Filesystem UUIDs: map[49ee40fc-dd4c-456c-ba41-a75ec777aeef:/dev/nvme0n1p4 4bbd6340-eb1d-4378-96c6-d3d924d8ee6f:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 23 16:35:17.328207 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.328156 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 23 16:35:17.333407 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.333293 2571 manager.go:217] Machine: {Timestamp:2026-04-23 16:35:17.332166076 +0000 UTC m=+0.413608637 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3095904 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} 
HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20ce9e91df689f242cf3f045cffbb9 SystemUUID:ec20ce9e-91df-689f-242c-f3f045cffbb9 BootID:2cd11868-b685-48c0-9bf7-dc2945d00667 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:93:70:bb:e9:09 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:93:70:bb:e9:09 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:c6:d7:b6:21:3e:01 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 16:35:17.333407 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.333403 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 16:35:17.333523 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.333511 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 16:35:17.335747 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.335721 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 16:35:17.335886 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.335750 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-27.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 16:35:17.336617 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.336608 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 16:35:17.336651 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.336620 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 16:35:17.336651 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.336633 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 16:35:17.336651 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.336647 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 16:35:17.337548 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.337538 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 16:35:17.337658 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.337650 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 16:35:17.342658 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.342648 2571 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 16:35:17.342720 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.342668 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 16:35:17.342720 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.342682 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 16:35:17.342720 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.342703 2571 kubelet.go:397] "Adding apiserver pod source"
Apr 23 16:35:17.342720 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.342717 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 16:35:17.344465 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.344451 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 16:35:17.344510 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.344478 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 16:35:17.347823 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.347805 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 16:35:17.352428 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.352405 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 16:35:17.354854 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.354839 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 16:35:17.354907 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.354860 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 16:35:17.354907 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.354866 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 16:35:17.354907 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.354872 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 16:35:17.354907 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.354878 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 16:35:17.354907 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.354884 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 16:35:17.354907 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.354889 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 16:35:17.354907 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.354894 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 16:35:17.354907 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.354901 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 16:35:17.354907 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.354907 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 16:35:17.355188 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.354923 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 16:35:17.355188 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.354932 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 16:35:17.356666 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:17.356641 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-27.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 16:35:17.356740 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.356667 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 16:35:17.356740 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.356679 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 16:35:17.356795 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:17.356749 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 16:35:17.360534 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.360520 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 16:35:17.360615 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.360555 2571 server.go:1295] "Started kubelet"
Apr 23 16:35:17.360711 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.360649 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 16:35:17.360777 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.360675 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 16:35:17.360777 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.360742 2571 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 16:35:17.361397 ip-10-0-136-27 systemd[1]: Started Kubernetes Kubelet.
Apr 23 16:35:17.362525 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.362509 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 16:35:17.363202 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.363168 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-q427l"
Apr 23 16:35:17.364504 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.364489 2571 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 16:35:17.367641 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.367624 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-27.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 16:35:17.368016 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.368000 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 16:35:17.368520 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:17.367539 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-27.ec2.internal.18a909a18899bbbf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-27.ec2.internal,UID:ip-10-0-136-27.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-27.ec2.internal,},FirstTimestamp:2026-04-23 16:35:17.360532415 +0000 UTC m=+0.441974975,LastTimestamp:2026-04-23 16:35:17.360532415 +0000 UTC m=+0.441974975,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-27.ec2.internal,}"
Apr 23 16:35:17.368632 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.368586 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 16:35:17.369476 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:17.369453 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found"
Apr 23 16:35:17.369568 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.369539 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 16:35:17.372796 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.371323 2571 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 16:35:17.372796 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.371344 2571 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 16:35:17.372796 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.371346 2571 factory.go:55] Registering systemd factory
Apr 23 16:35:17.372796 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.371367 2571 factory.go:223] Registration of the systemd container factory successfully
Apr 23 16:35:17.372796 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.371497 2571 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 16:35:17.372796 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.371506 2571 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 16:35:17.372796 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:17.371500 2571 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 16:35:17.372796 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.372075 2571 factory.go:153] Registering CRI-O factory
Apr 23 16:35:17.372796 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.372092 2571 factory.go:223] Registration of the crio container factory successfully
Apr 23 16:35:17.372796 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.372143 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 16:35:17.372796 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.372173 2571 factory.go:103] Registering Raw factory
Apr 23 16:35:17.372796 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.372198 2571 manager.go:1196] Started watching for new ooms in manager
Apr 23 16:35:17.372796 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.372603 2571 manager.go:319] Starting recovery of all containers
Apr 23 16:35:17.373709 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.373670 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-q427l"
Apr 23 16:35:17.375951 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:17.375926 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-27.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 23 16:35:17.376076 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:17.376055 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 16:35:17.382825 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.382806 2571 manager.go:324] Recovery completed
Apr 23 16:35:17.387685 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.387672 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:17.390118 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.390101 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:17.390218 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.390133 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:17.390218 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.390147 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:17.390741 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.390724 2571 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 16:35:17.390741 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.390740 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 16:35:17.390832 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.390759 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 16:35:17.393521 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.393509 2571 policy_none.go:49] "None policy: Start"
Apr 23 16:35:17.393578 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.393524 2571 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 16:35:17.393578 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.393534 2571 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 16:35:17.442253 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.442239 2571 manager.go:341] "Starting Device Plugin manager"
Apr 23 16:35:17.442384 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:17.442295 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 16:35:17.442384 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.442306 2571 server.go:85] "Starting device plugin registration server"
Apr 23 16:35:17.442539 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.442527 2571 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 16:35:17.442589 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.442542 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 16:35:17.442657 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.442641 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 16:35:17.442752 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.442741 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 16:35:17.442802 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.442753 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 16:35:17.443383 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:17.443367 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 16:35:17.443448 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:17.443404 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-27.ec2.internal\" not found"
Apr 23 16:35:17.504758 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.504668 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 16:35:17.506003 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.505986 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 16:35:17.506089 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.506017 2571 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 16:35:17.506089 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.506041 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 16:35:17.506089 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.506052 2571 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 16:35:17.506233 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:17.506133 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 16:35:17.508645 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.508622 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:17.542989 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.542961 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:17.544569 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.544554 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:17.544659 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.544599 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:17.544659 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.544615 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:17.544659 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.544644 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-27.ec2.internal"
Apr 23 16:35:17.553416 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.553399 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-27.ec2.internal"
Apr 23 16:35:17.553483 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:17.553420 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-27.ec2.internal\": node \"ip-10-0-136-27.ec2.internal\" not found"
Apr 23 16:35:17.570482 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:17.570463 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found"
Apr 23 16:35:17.606489 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.606459 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal"]
Apr 23 16:35:17.606559 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.606530 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:17.607897 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.607884 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:17.607943 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.607912 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:17.607943 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.607922 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:17.609042 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.609030 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:17.609184 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.609169 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal"
Apr 23 16:35:17.609269 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.609198 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:17.609676 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.609650 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:17.609676 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.609674 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:17.609820 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.609683 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:17.609820 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.609650 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:17.609820 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.609734 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:17.609820 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.609745 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:17.610934 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.610920 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal"
Apr 23 16:35:17.610985 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.610945 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:17.611587 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.611573 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:17.611682 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.611598 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:17.611682 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.611612 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:17.641281 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:17.641259 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-27.ec2.internal\" not found" node="ip-10-0-136-27.ec2.internal"
Apr 23 16:35:17.645679 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:17.645662 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-27.ec2.internal\" not found" node="ip-10-0-136-27.ec2.internal"
Apr 23 16:35:17.670594 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:17.670573 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found"
Apr 23 16:35:17.673005 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.672990 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6402b5e4dc46963653aa05278c9bac43-config\") pod \"kube-apiserver-proxy-ip-10-0-136-27.ec2.internal\" (UID: \"6402b5e4dc46963653aa05278c9bac43\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal"
Apr 23 16:35:17.673051 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.673014 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a7b645763bd4d0264284f3e94b95b589-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal\" (UID: \"a7b645763bd4d0264284f3e94b95b589\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal"
Apr 23 16:35:17.673051 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.673031 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7b645763bd4d0264284f3e94b95b589-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal\" (UID: \"a7b645763bd4d0264284f3e94b95b589\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal"
Apr 23 16:35:17.771425 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:17.771369 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found"
Apr 23 16:35:17.773563 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.773546 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7b645763bd4d0264284f3e94b95b589-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal\" (UID: \"a7b645763bd4d0264284f3e94b95b589\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal"
Apr 23 16:35:17.773612 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.773573 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6402b5e4dc46963653aa05278c9bac43-config\") pod \"kube-apiserver-proxy-ip-10-0-136-27.ec2.internal\" (UID: \"6402b5e4dc46963653aa05278c9bac43\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal"
Apr 23 16:35:17.773612 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.773591 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a7b645763bd4d0264284f3e94b95b589-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal\" (UID: \"a7b645763bd4d0264284f3e94b95b589\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal"
Apr 23 16:35:17.773685 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.773620 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a7b645763bd4d0264284f3e94b95b589-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal\" (UID: \"a7b645763bd4d0264284f3e94b95b589\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal"
Apr 23 16:35:17.773685 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.773634 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7b645763bd4d0264284f3e94b95b589-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal\" (UID: \"a7b645763bd4d0264284f3e94b95b589\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal"
Apr 23 16:35:17.773685 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.773664 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6402b5e4dc46963653aa05278c9bac43-config\") pod \"kube-apiserver-proxy-ip-10-0-136-27.ec2.internal\" (UID: \"6402b5e4dc46963653aa05278c9bac43\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal"
Apr 23 16:35:17.871950 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:17.871910 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found"
Apr 23 16:35:17.943262 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.943235 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal"
Apr 23 16:35:17.948953 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:17.948934 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal"
Apr 23 16:35:17.972662 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:17.972629 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found"
Apr 23 16:35:18.073162 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:18.073080 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found"
Apr 23 16:35:18.173677 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:18.173646 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found"
Apr 23 16:35:18.224231 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:18.224204 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:18.239000 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:18.238970 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:18.268151 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:18.268121 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 16:35:18.268809 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:18.268274 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 16:35:18.268809 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:18.268307 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 16:35:18.268809 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:18.268307 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 16:35:18.274418 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:18.274399 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found"
Apr 23 16:35:18.368214 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:18.368148 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 16:35:18.375452 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:18.375427 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found"
Apr 23 16:35:18.376519 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:18.376493 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 16:30:17 +0000 UTC" deadline="2027-10-26 23:04:23.640415088 +0000 UTC"
Apr 23 16:35:18.376571 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:18.376519 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13230h29m5.263898777s"
Apr 23 16:35:18.385511 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:18.385495 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 16:35:18.411957 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:18.411933 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8hq9j"
Apr 23 16:35:18.420982 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:18.420959 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8hq9j"
Apr 23 16:35:18.476171 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:18.476022 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found"
Apr 23 16:35:18.542999 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:18.542638 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7b645763bd4d0264284f3e94b95b589.slice/crio-aeebb200a544ddbccce8a86cf9cf845f6cf205105264f2d556eb245b0c8a8074 WatchSource:0}: Error finding container aeebb200a544ddbccce8a86cf9cf845f6cf205105264f2d556eb245b0c8a8074: Status 404 returned error can't find the container with id aeebb200a544ddbccce8a86cf9cf845f6cf205105264f2d556eb245b0c8a8074
Apr 23 16:35:18.542999 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:18.542971 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6402b5e4dc46963653aa05278c9bac43.slice/crio-9898b62cebd3544e8b9326e25e1ff0c475071efbf49964a91902d5fa0144ec47 WatchSource:0}: Error finding container 9898b62cebd3544e8b9326e25e1ff0c475071efbf49964a91902d5fa0144ec47: Status 404 returned error can't find the container with id 9898b62cebd3544e8b9326e25e1ff0c475071efbf49964a91902d5fa0144ec47
Apr 23 16:35:18.547954 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:18.547937 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 16:35:18.576541 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:18.576516 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-27.ec2.internal\" not found"
Apr 23 16:35:18.584498 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:18.584476 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:18.669930 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:18.669851 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal"
Apr 23 16:35:18.684534 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:18.684514 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 16:35:18.685375 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:18.685363 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal"
Apr 23 16:35:18.694439 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:18.694421 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 16:35:19.344628 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.344595 2571 apiserver.go:52] "Watching apiserver"
Apr 23 16:35:19.354079 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.354056 2571 reflector.go:430] "Caches populated" 
type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 16:35:19.356483 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.356456 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jbv88","kube-system/konnectivity-agent-jdgl4","kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal","openshift-cluster-node-tuning-operator/tuned-4kpgn","openshift-dns/node-resolver-6stz5","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal","openshift-multus/network-metrics-daemon-hsxbc","openshift-network-diagnostics/network-check-target-vrntx","openshift-network-operator/iptables-alerter-tft7d","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t","openshift-image-registry/node-ca-d2k5z","openshift-multus/multus-additional-cni-plugins-q7tsv","openshift-multus/multus-qgbqb"] Apr 23 16:35:19.358842 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.358820 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:35:19.358936 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:19.358905 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hsxbc" podUID="9f25a094-e342-4690-8028-f1a3ddd77829" Apr 23 16:35:19.359971 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.359952 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jdgl4" Apr 23 16:35:19.361453 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.361433 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.362770 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.362751 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6stz5" Apr 23 16:35:19.364267 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.364249 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.366809 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.366789 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:35:19.366912 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:19.366853 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrntx" podUID="7c952840-e4f8-4b49-9f90-0d7aa2618091" Apr 23 16:35:19.368780 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.368759 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-tft7d" Apr 23 16:35:19.369317 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.369293 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 16:35:19.369418 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.369399 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 16:35:19.369475 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.369428 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zqfld\"" Apr 23 16:35:19.369542 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.369524 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 16:35:19.370492 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.370473 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 16:35:19.370641 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.370625 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 16:35:19.370781 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.370760 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 16:35:19.370875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.370818 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 16:35:19.370875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.370825 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-t6kth\"" Apr 23 16:35:19.370875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.370829 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 16:35:19.371111 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.371088 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7vv6n\"" Apr 23 16:35:19.371778 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.371762 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 16:35:19.371874 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.371800 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-zwkbr\"" Apr 23 16:35:19.371967 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.371950 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.372129 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.372113 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-d2k5z" Apr 23 16:35:19.373025 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.373004 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 16:35:19.373310 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.373294 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-9kgt9\"" Apr 23 16:35:19.373390 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.373310 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 16:35:19.373390 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.373324 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 16:35:19.373390 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.373361 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 16:35:19.373390 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.373310 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 16:35:19.373647 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.373631 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.374385 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.374364 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 16:35:19.375109 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.375092 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.377343 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.377327 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 16:35:19.377476 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.377448 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 16:35:19.377550 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.377498 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-bjgcj\"" Apr 23 16:35:19.378733 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.378715 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 16:35:19.378836 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.378775 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2tgsr\"" Apr 23 16:35:19.378836 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.378722 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 16:35:19.378940 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.378780 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 16:35:19.378940 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.378717 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 16:35:19.381440 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381424 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-run-openvswitch\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.381535 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381447 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-log-socket\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.381535 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381462 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-cni-netd\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.381535 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381476 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8fbb1f0d-a155-467a-a35d-78efabc77a02-tmp\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.381535 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381496 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlxdj\" (UniqueName: \"kubernetes.io/projected/9f25a094-e342-4690-8028-f1a3ddd77829-kube-api-access-dlxdj\") pod \"network-metrics-daemon-hsxbc\" (UID: \"9f25a094-e342-4690-8028-f1a3ddd77829\") " pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 
16:35:19.381535 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381517 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-kubelet\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.381816 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381559 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.381816 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381592 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c4113f8-445a-41a3-afe0-4d920d77c9c9-ovn-node-metrics-cert\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.381816 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381616 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/00650519-0327-483d-83cf-59b7e20fd1f5-agent-certs\") pod \"konnectivity-agent-jdgl4\" (UID: \"00650519-0327-483d-83cf-59b7e20fd1f5\") " pod="kube-system/konnectivity-agent-jdgl4" Apr 23 16:35:19.381816 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381639 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-sysctl-conf\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.381816 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381660 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/33e0f7f2-93d8-459f-9c61-240a8cdad803-tmp-dir\") pod \"node-resolver-6stz5\" (UID: \"33e0f7f2-93d8-459f-9c61-240a8cdad803\") " pod="openshift-dns/node-resolver-6stz5" Apr 23 16:35:19.381816 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381690 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thqjv\" (UniqueName: \"kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv\") pod \"network-check-target-vrntx\" (UID: \"7c952840-e4f8-4b49-9f90-0d7aa2618091\") " pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:35:19.381816 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381736 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-run-netns\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.381816 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381776 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c4113f8-445a-41a3-afe0-4d920d77c9c9-ovnkube-script-lib\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.381816 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381792 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/35c9d122-d374-4664-b013-c9bdfd5b8759-registration-dir\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: \"35c9d122-d374-4664-b013-c9bdfd5b8759\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.381816 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381807 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-kubernetes\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.381816 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381819 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-sys\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.382252 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381834 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-lib-modules\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.382252 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381849 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59w65\" (UniqueName: \"kubernetes.io/projected/8fbb1f0d-a155-467a-a35d-78efabc77a02-kube-api-access-59w65\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " 
pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.382252 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381876 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/35c9d122-d374-4664-b013-c9bdfd5b8759-etc-selinux\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: \"35c9d122-d374-4664-b013-c9bdfd5b8759\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.382384 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.381914 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/35c9d122-d374-4664-b013-c9bdfd5b8759-sys-fs\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: \"35c9d122-d374-4664-b013-c9bdfd5b8759\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.382417 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382380 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-sysctl-d\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.382417 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382406 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-tuned\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.382490 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382424 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/33e0f7f2-93d8-459f-9c61-240a8cdad803-hosts-file\") pod \"node-resolver-6stz5\" (UID: \"33e0f7f2-93d8-459f-9c61-240a8cdad803\") " pod="openshift-dns/node-resolver-6stz5" Apr 23 16:35:19.382490 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382439 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6xw7\" (UniqueName: \"kubernetes.io/projected/33e0f7f2-93d8-459f-9c61-240a8cdad803-kube-api-access-t6xw7\") pod \"node-resolver-6stz5\" (UID: \"33e0f7f2-93d8-459f-9c61-240a8cdad803\") " pod="openshift-dns/node-resolver-6stz5" Apr 23 16:35:19.382490 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382455 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c4113f8-445a-41a3-afe0-4d920d77c9c9-env-overrides\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.382490 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382468 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/35c9d122-d374-4664-b013-c9bdfd5b8759-socket-dir\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: \"35c9d122-d374-4664-b013-c9bdfd5b8759\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.382490 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382483 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs\") pod \"network-metrics-daemon-hsxbc\" (UID: \"9f25a094-e342-4690-8028-f1a3ddd77829\") " pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:35:19.382632 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382497 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs5hl\" (UniqueName: \"kubernetes.io/projected/35c9d122-d374-4664-b013-c9bdfd5b8759-kube-api-access-hs5hl\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: \"35c9d122-d374-4664-b013-c9bdfd5b8759\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.382632 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382512 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-modprobe-d\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.382632 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382525 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-systemd\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.382632 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382538 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-slash\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.382632 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382551 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-run-systemd\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.382632 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382566 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-etc-openvswitch\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.382632 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382584 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-run-ovn\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.382632 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382616 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35c9d122-d374-4664-b013-c9bdfd5b8759-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: \"35c9d122-d374-4664-b013-c9bdfd5b8759\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.382632 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382633 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-run\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.383105 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382646 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-var-lib-openvswitch\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.383105 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382660 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-node-log\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.383105 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382681 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-cni-bin\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.383105 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382711 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c4113f8-445a-41a3-afe0-4d920d77c9c9-ovnkube-config\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.383105 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382726 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5jp4\" (UniqueName: \"kubernetes.io/projected/9c4113f8-445a-41a3-afe0-4d920d77c9c9-kube-api-access-k5jp4\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.383105 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382739 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/00650519-0327-483d-83cf-59b7e20fd1f5-konnectivity-ca\") pod \"konnectivity-agent-jdgl4\" (UID: \"00650519-0327-483d-83cf-59b7e20fd1f5\") " pod="kube-system/konnectivity-agent-jdgl4" Apr 23 16:35:19.383105 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382751 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-sysconfig\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.383105 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382764 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9c427934-9049-45a2-bbd8-6cb89f9149e2-iptables-alerter-script\") pod \"iptables-alerter-tft7d\" (UID: \"9c427934-9049-45a2-bbd8-6cb89f9149e2\") " pod="openshift-network-operator/iptables-alerter-tft7d" Apr 23 16:35:19.383105 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382786 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-run-ovn-kubernetes\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.383105 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382811 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/35c9d122-d374-4664-b013-c9bdfd5b8759-device-dir\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: 
\"35c9d122-d374-4664-b013-c9bdfd5b8759\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.383105 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382828 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-var-lib-kubelet\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.383105 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382841 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-host\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.383105 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382853 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c427934-9049-45a2-bbd8-6cb89f9149e2-host-slash\") pod \"iptables-alerter-tft7d\" (UID: \"9c427934-9049-45a2-bbd8-6cb89f9149e2\") " pod="openshift-network-operator/iptables-alerter-tft7d" Apr 23 16:35:19.383105 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382874 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82pkr\" (UniqueName: \"kubernetes.io/projected/9c427934-9049-45a2-bbd8-6cb89f9149e2-kube-api-access-82pkr\") pod \"iptables-alerter-tft7d\" (UID: \"9c427934-9049-45a2-bbd8-6cb89f9149e2\") " pod="openshift-network-operator/iptables-alerter-tft7d" Apr 23 16:35:19.383105 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.382892 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-systemd-units\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.383806 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.383246 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 16:35:19.383806 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.383301 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 16:35:19.383806 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.383453 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 16:35:19.383806 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.383495 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 16:35:19.388251 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.386095 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 16:35:19.388251 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.386476 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 16:35:19.388251 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.386915 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-l87hb\"" Apr 23 16:35:19.388251 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.387170 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mz2r4\"" Apr 23 16:35:19.422865 ip-10-0-136-27 kubenswrapper[2571]: 
I0423 16:35:19.422836 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:30:18 +0000 UTC" deadline="2027-10-03 23:11:07.942512843 +0000 UTC" Apr 23 16:35:19.422865 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.422865 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12678h35m48.519651606s" Apr 23 16:35:19.470839 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.470811 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 16:35:19.483836 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.483806 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-systemd-units\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.483970 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.483846 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-cni-netd\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.483970 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.483874 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8fbb1f0d-a155-467a-a35d-78efabc77a02-tmp\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.484257 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484239 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serviceca\" (UniqueName: \"kubernetes.io/configmap/7174271c-a85b-4c6d-872b-f2b384da443b-serviceca\") pod \"node-ca-d2k5z\" (UID: \"7174271c-a85b-4c6d-872b-f2b384da443b\") " pod="openshift-image-registry/node-ca-d2k5z" Apr 23 16:35:19.484378 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484262 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31d4c6c2-69cd-4240-b617-6cc884b17481-cni-binary-copy\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.484378 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484282 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-host-run-multus-certs\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.484378 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.483937 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-systemd-units\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.484378 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484307 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-kubelet\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.484378 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.483938 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-cni-netd\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.484378 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484185 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 16:35:19.484378 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484354 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.484378 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484367 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-kubelet\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.484674 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484422 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.484674 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484425 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/00650519-0327-483d-83cf-59b7e20fd1f5-agent-certs\") pod \"konnectivity-agent-jdgl4\" (UID: \"00650519-0327-483d-83cf-59b7e20fd1f5\") " pod="kube-system/konnectivity-agent-jdgl4" Apr 23 16:35:19.484674 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484474 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-sysctl-conf\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.484674 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484556 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31d4c6c2-69cd-4240-b617-6cc884b17481-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.484674 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484588 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-cni-binary-copy\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.484674 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484624 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-kubernetes\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.484674 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484648 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59w65\" (UniqueName: \"kubernetes.io/projected/8fbb1f0d-a155-467a-a35d-78efabc77a02-kube-api-access-59w65\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.484674 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484664 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31d4c6c2-69cd-4240-b617-6cc884b17481-system-cni-dir\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.485089 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484682 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-host-var-lib-cni-bin\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.485089 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484771 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-kubernetes\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.485089 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484776 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/35c9d122-d374-4664-b013-c9bdfd5b8759-etc-selinux\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: \"35c9d122-d374-4664-b013-c9bdfd5b8759\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.485089 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484810 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/35c9d122-d374-4664-b013-c9bdfd5b8759-sys-fs\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: \"35c9d122-d374-4664-b013-c9bdfd5b8759\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.485089 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484851 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-sysctl-d\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.485089 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484858 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-sysctl-conf\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.485089 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484873 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/35c9d122-d374-4664-b013-c9bdfd5b8759-etc-selinux\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: \"35c9d122-d374-4664-b013-c9bdfd5b8759\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.485089 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484913 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/35c9d122-d374-4664-b013-c9bdfd5b8759-sys-fs\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: 
\"35c9d122-d374-4664-b013-c9bdfd5b8759\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.485089 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484943 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6xw7\" (UniqueName: \"kubernetes.io/projected/33e0f7f2-93d8-459f-9c61-240a8cdad803-kube-api-access-t6xw7\") pod \"node-resolver-6stz5\" (UID: \"33e0f7f2-93d8-459f-9c61-240a8cdad803\") " pod="openshift-dns/node-resolver-6stz5" Apr 23 16:35:19.485089 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.484987 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-multus-cni-dir\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.485089 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485012 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-multus-socket-dir-parent\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.485089 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485017 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-sysctl-d\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.485089 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485035 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-host-var-lib-cni-multus\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.485089 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485057 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-multus-daemon-config\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.485089 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485081 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs\") pod \"network-metrics-daemon-hsxbc\" (UID: \"9f25a094-e342-4690-8028-f1a3ddd77829\") " pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:35:19.485662 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485103 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-systemd\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.485662 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485125 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-slash\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.485662 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485153 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/7174271c-a85b-4c6d-872b-f2b384da443b-host\") pod \"node-ca-d2k5z\" (UID: \"7174271c-a85b-4c6d-872b-f2b384da443b\") " pod="openshift-image-registry/node-ca-d2k5z" Apr 23 16:35:19.485662 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485176 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31d4c6c2-69cd-4240-b617-6cc884b17481-os-release\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.485662 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485179 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-systemd\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.485662 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:19.485201 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:19.485662 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485201 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/31d4c6c2-69cd-4240-b617-6cc884b17481-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.485662 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485247 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-os-release\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.485662 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485275 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-var-lib-openvswitch\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.485662 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485304 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-slash\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.485662 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485341 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-node-log\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.485662 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485368 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-cni-bin\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.485662 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485404 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-cni-bin\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.485662 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485436 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-var-lib-openvswitch\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.485662 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:19.485476 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs podName:9f25a094-e342-4690-8028-f1a3ddd77829 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:19.985372423 +0000 UTC m=+3.066814984 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs") pod "network-metrics-daemon-hsxbc" (UID: "9f25a094-e342-4690-8028-f1a3ddd77829") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:19.485662 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485476 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-node-log\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.485662 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485531 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5jp4\" (UniqueName: \"kubernetes.io/projected/9c4113f8-445a-41a3-afe0-4d920d77c9c9-kube-api-access-k5jp4\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.486157 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485558 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/00650519-0327-483d-83cf-59b7e20fd1f5-konnectivity-ca\") pod \"konnectivity-agent-jdgl4\" (UID: \"00650519-0327-483d-83cf-59b7e20fd1f5\") " pod="kube-system/konnectivity-agent-jdgl4" Apr 23 16:35:19.486157 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485580 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-sysconfig\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.486157 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485602 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9c427934-9049-45a2-bbd8-6cb89f9149e2-iptables-alerter-script\") pod \"iptables-alerter-tft7d\" (UID: \"9c427934-9049-45a2-bbd8-6cb89f9149e2\") " pod="openshift-network-operator/iptables-alerter-tft7d" Apr 23 16:35:19.486157 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485627 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffqsl\" (UniqueName: \"kubernetes.io/projected/31d4c6c2-69cd-4240-b617-6cc884b17481-kube-api-access-ffqsl\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.486157 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485651 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-run-ovn-kubernetes\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.486157 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485675 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/35c9d122-d374-4664-b013-c9bdfd5b8759-device-dir\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: \"35c9d122-d374-4664-b013-c9bdfd5b8759\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.486157 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485735 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-var-lib-kubelet\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " 
pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.486157 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485761 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-host\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.486157 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485786 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c427934-9049-45a2-bbd8-6cb89f9149e2-host-slash\") pod \"iptables-alerter-tft7d\" (UID: \"9c427934-9049-45a2-bbd8-6cb89f9149e2\") " pod="openshift-network-operator/iptables-alerter-tft7d" Apr 23 16:35:19.486157 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485811 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-host-run-netns\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.486157 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485838 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-run-openvswitch\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.486157 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485863 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-log-socket\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.486157 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485887 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/33e0f7f2-93d8-459f-9c61-240a8cdad803-tmp-dir\") pod \"node-resolver-6stz5\" (UID: \"33e0f7f2-93d8-459f-9c61-240a8cdad803\") " pod="openshift-dns/node-resolver-6stz5" Apr 23 16:35:19.486157 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485913 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7r8d\" (UniqueName: \"kubernetes.io/projected/7174271c-a85b-4c6d-872b-f2b384da443b-kube-api-access-z7r8d\") pod \"node-ca-d2k5z\" (UID: \"7174271c-a85b-4c6d-872b-f2b384da443b\") " pod="openshift-image-registry/node-ca-d2k5z" Apr 23 16:35:19.486157 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485939 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31d4c6c2-69cd-4240-b617-6cc884b17481-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.486157 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485968 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dlxdj\" (UniqueName: \"kubernetes.io/projected/9f25a094-e342-4690-8028-f1a3ddd77829-kube-api-access-dlxdj\") pod \"network-metrics-daemon-hsxbc\" (UID: \"9f25a094-e342-4690-8028-f1a3ddd77829\") " pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:35:19.486157 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.485997 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/9c4113f8-445a-41a3-afe0-4d920d77c9c9-ovn-node-metrics-cert\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.486618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486022 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thqjv\" (UniqueName: \"kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv\") pod \"network-check-target-vrntx\" (UID: \"7c952840-e4f8-4b49-9f90-0d7aa2618091\") " pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:35:19.486618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486069 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-system-cni-dir\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.486618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486081 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9c427934-9049-45a2-bbd8-6cb89f9149e2-iptables-alerter-script\") pod \"iptables-alerter-tft7d\" (UID: \"9c427934-9049-45a2-bbd8-6cb89f9149e2\") " pod="openshift-network-operator/iptables-alerter-tft7d" Apr 23 16:35:19.486618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486099 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-host-var-lib-kubelet\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.486618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486131 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-multus-conf-dir\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.486618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486155 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-run-netns\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.486618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486179 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c4113f8-445a-41a3-afe0-4d920d77c9c9-ovnkube-script-lib\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.486618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486206 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/35c9d122-d374-4664-b013-c9bdfd5b8759-registration-dir\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: \"35c9d122-d374-4664-b013-c9bdfd5b8759\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.486618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486232 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-sys\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.486618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486252 
2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-lib-modules\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.486618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486274 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-hostroot\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.486618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486297 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97dst\" (UniqueName: \"kubernetes.io/projected/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-kube-api-access-97dst\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.486618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486321 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-tuned\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.486618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486350 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/33e0f7f2-93d8-459f-9c61-240a8cdad803-hosts-file\") pod \"node-resolver-6stz5\" (UID: \"33e0f7f2-93d8-459f-9c61-240a8cdad803\") " pod="openshift-dns/node-resolver-6stz5" Apr 23 16:35:19.486618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486392 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-host-run-k8s-cni-cncf-io\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.486618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486423 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c4113f8-445a-41a3-afe0-4d920d77c9c9-env-overrides\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.486618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486447 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/35c9d122-d374-4664-b013-c9bdfd5b8759-socket-dir\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: \"35c9d122-d374-4664-b013-c9bdfd5b8759\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.487118 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486451 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/00650519-0327-483d-83cf-59b7e20fd1f5-konnectivity-ca\") pod \"konnectivity-agent-jdgl4\" (UID: \"00650519-0327-483d-83cf-59b7e20fd1f5\") " pod="kube-system/konnectivity-agent-jdgl4" Apr 23 16:35:19.487118 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486482 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs5hl\" (UniqueName: \"kubernetes.io/projected/35c9d122-d374-4664-b013-c9bdfd5b8759-kube-api-access-hs5hl\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: \"35c9d122-d374-4664-b013-c9bdfd5b8759\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 
23 16:35:19.487118 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486485 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-sysconfig\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.487118 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486515 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-modprobe-d\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.487118 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486569 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/35c9d122-d374-4664-b013-c9bdfd5b8759-socket-dir\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: \"35c9d122-d374-4664-b013-c9bdfd5b8759\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.487118 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486576 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-modprobe-d\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.487118 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486652 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-run-netns\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 
16:35:19.487118 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486712 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-cnibin\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.487118 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486749 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-run-systemd\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.487118 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486774 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-etc-openvswitch\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.487118 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486778 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-host-run-ovn-kubernetes\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.487118 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486798 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-run-ovn\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 
16:35:19.487118 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486820 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/35c9d122-d374-4664-b013-c9bdfd5b8759-device-dir\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: \"35c9d122-d374-4664-b013-c9bdfd5b8759\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.487118 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486824 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35c9d122-d374-4664-b013-c9bdfd5b8759-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: \"35c9d122-d374-4664-b013-c9bdfd5b8759\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.487118 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486854 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-var-lib-kubelet\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.487118 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486852 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-run\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.487118 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486882 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31d4c6c2-69cd-4240-b617-6cc884b17481-cnibin\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " 
pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.487619 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486904 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-etc-kubernetes\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.487619 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486936 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-etc-openvswitch\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.487619 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486934 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c4113f8-445a-41a3-afe0-4d920d77c9c9-ovnkube-config\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.487619 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486957 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82pkr\" (UniqueName: \"kubernetes.io/projected/9c427934-9049-45a2-bbd8-6cb89f9149e2-kube-api-access-82pkr\") pod \"iptables-alerter-tft7d\" (UID: \"9c427934-9049-45a2-bbd8-6cb89f9149e2\") " pod="openshift-network-operator/iptables-alerter-tft7d" Apr 23 16:35:19.487619 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.486903 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-run\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " 
pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.487619 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.487028 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-run-openvswitch\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.487619 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.487075 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-log-socket\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.487619 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.487253 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c4113f8-445a-41a3-afe0-4d920d77c9c9-ovnkube-script-lib\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.487619 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.487353 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c4113f8-445a-41a3-afe0-4d920d77c9c9-ovnkube-config\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.487619 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.487358 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/33e0f7f2-93d8-459f-9c61-240a8cdad803-tmp-dir\") pod \"node-resolver-6stz5\" (UID: \"33e0f7f2-93d8-459f-9c61-240a8cdad803\") " 
pod="openshift-dns/node-resolver-6stz5" Apr 23 16:35:19.487619 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.487387 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-run-ovn\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.487619 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.487398 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/35c9d122-d374-4664-b013-c9bdfd5b8759-registration-dir\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: \"35c9d122-d374-4664-b013-c9bdfd5b8759\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.487619 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.487431 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-host\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.487619 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.487448 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-sys\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.487619 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.487510 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8fbb1f0d-a155-467a-a35d-78efabc77a02-lib-modules\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 
16:35:19.487619 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.487514 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35c9d122-d374-4664-b013-c9bdfd5b8759-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: \"35c9d122-d374-4664-b013-c9bdfd5b8759\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.488308 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.487761 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c427934-9049-45a2-bbd8-6cb89f9149e2-host-slash\") pod \"iptables-alerter-tft7d\" (UID: \"9c427934-9049-45a2-bbd8-6cb89f9149e2\") " pod="openshift-network-operator/iptables-alerter-tft7d" Apr 23 16:35:19.488308 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.487841 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c4113f8-445a-41a3-afe0-4d920d77c9c9-run-systemd\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.488308 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.487908 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/33e0f7f2-93d8-459f-9c61-240a8cdad803-hosts-file\") pod \"node-resolver-6stz5\" (UID: \"33e0f7f2-93d8-459f-9c61-240a8cdad803\") " pod="openshift-dns/node-resolver-6stz5" Apr 23 16:35:19.488308 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.488199 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8fbb1f0d-a155-467a-a35d-78efabc77a02-tmp\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.488526 ip-10-0-136-27 kubenswrapper[2571]: 
I0423 16:35:19.488384 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/00650519-0327-483d-83cf-59b7e20fd1f5-agent-certs\") pod \"konnectivity-agent-jdgl4\" (UID: \"00650519-0327-483d-83cf-59b7e20fd1f5\") " pod="kube-system/konnectivity-agent-jdgl4" Apr 23 16:35:19.489404 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.489364 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c4113f8-445a-41a3-afe0-4d920d77c9c9-env-overrides\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.489901 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.489862 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8fbb1f0d-a155-467a-a35d-78efabc77a02-etc-tuned\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.490198 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.490178 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c4113f8-445a-41a3-afe0-4d920d77c9c9-ovn-node-metrics-cert\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.494774 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:19.494493 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:19.494774 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:19.494514 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:19.494774 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:19.494527 2571 projected.go:194] Error preparing data for projected volume kube-api-access-thqjv for pod openshift-network-diagnostics/network-check-target-vrntx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:19.494774 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:19.494605 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv podName:7c952840-e4f8-4b49-9f90-0d7aa2618091 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:19.994587815 +0000 UTC m=+3.076030391 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-thqjv" (UniqueName: "kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv") pod "network-check-target-vrntx" (UID: "7c952840-e4f8-4b49-9f90-0d7aa2618091") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:19.495370 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.495349 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-59w65\" (UniqueName: \"kubernetes.io/projected/8fbb1f0d-a155-467a-a35d-78efabc77a02-kube-api-access-59w65\") pod \"tuned-4kpgn\" (UID: \"8fbb1f0d-a155-467a-a35d-78efabc77a02\") " pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.496230 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.495795 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6xw7\" (UniqueName: \"kubernetes.io/projected/33e0f7f2-93d8-459f-9c61-240a8cdad803-kube-api-access-t6xw7\") pod \"node-resolver-6stz5\" (UID: 
\"33e0f7f2-93d8-459f-9c61-240a8cdad803\") " pod="openshift-dns/node-resolver-6stz5" Apr 23 16:35:19.496825 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.496358 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5jp4\" (UniqueName: \"kubernetes.io/projected/9c4113f8-445a-41a3-afe0-4d920d77c9c9-kube-api-access-k5jp4\") pod \"ovnkube-node-jbv88\" (UID: \"9c4113f8-445a-41a3-afe0-4d920d77c9c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.497531 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.497507 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs5hl\" (UniqueName: \"kubernetes.io/projected/35c9d122-d374-4664-b013-c9bdfd5b8759-kube-api-access-hs5hl\") pod \"aws-ebs-csi-driver-node-vv85t\" (UID: \"35c9d122-d374-4664-b013-c9bdfd5b8759\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.497635 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.497610 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlxdj\" (UniqueName: \"kubernetes.io/projected/9f25a094-e342-4690-8028-f1a3ddd77829-kube-api-access-dlxdj\") pod \"network-metrics-daemon-hsxbc\" (UID: \"9f25a094-e342-4690-8028-f1a3ddd77829\") " pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:35:19.498615 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.498594 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82pkr\" (UniqueName: \"kubernetes.io/projected/9c427934-9049-45a2-bbd8-6cb89f9149e2-kube-api-access-82pkr\") pod \"iptables-alerter-tft7d\" (UID: \"9c427934-9049-45a2-bbd8-6cb89f9149e2\") " pod="openshift-network-operator/iptables-alerter-tft7d" Apr 23 16:35:19.510386 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.510346 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal" 
event={"ID":"a7b645763bd4d0264284f3e94b95b589","Type":"ContainerStarted","Data":"aeebb200a544ddbccce8a86cf9cf845f6cf205105264f2d556eb245b0c8a8074"} Apr 23 16:35:19.511335 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.511312 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal" event={"ID":"6402b5e4dc46963653aa05278c9bac43","Type":"ContainerStarted","Data":"9898b62cebd3544e8b9326e25e1ff0c475071efbf49964a91902d5fa0144ec47"} Apr 23 16:35:19.588781 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.588715 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffqsl\" (UniqueName: \"kubernetes.io/projected/31d4c6c2-69cd-4240-b617-6cc884b17481-kube-api-access-ffqsl\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.588954 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.588798 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-host-run-netns\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.588954 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.588828 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7r8d\" (UniqueName: \"kubernetes.io/projected/7174271c-a85b-4c6d-872b-f2b384da443b-kube-api-access-z7r8d\") pod \"node-ca-d2k5z\" (UID: \"7174271c-a85b-4c6d-872b-f2b384da443b\") " pod="openshift-image-registry/node-ca-d2k5z" Apr 23 16:35:19.588954 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.588854 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/31d4c6c2-69cd-4240-b617-6cc884b17481-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.588954 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.588891 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-system-cni-dir\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.588954 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.588914 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-host-var-lib-kubelet\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.588954 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.588919 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-host-run-netns\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.588954 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.588939 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-multus-conf-dir\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.589298 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.588978 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-host-var-lib-kubelet\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.589298 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.588989 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-hostroot\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.589298 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.588979 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-system-cni-dir\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.589298 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589013 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97dst\" (UniqueName: \"kubernetes.io/projected/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-kube-api-access-97dst\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.589298 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589022 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-hostroot\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.589298 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589032 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-host-run-k8s-cni-cncf-io\") pod 
\"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.589298 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589078 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-cnibin\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.589298 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589086 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-host-run-k8s-cni-cncf-io\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.589298 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589147 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-multus-conf-dir\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.589298 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589108 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31d4c6c2-69cd-4240-b617-6cc884b17481-cnibin\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.589298 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589163 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-cnibin\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " 
pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.589298 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589182 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-etc-kubernetes\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.589298 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589208 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7174271c-a85b-4c6d-872b-f2b384da443b-serviceca\") pod \"node-ca-d2k5z\" (UID: \"7174271c-a85b-4c6d-872b-f2b384da443b\") " pod="openshift-image-registry/node-ca-d2k5z" Apr 23 16:35:19.589298 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589217 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31d4c6c2-69cd-4240-b617-6cc884b17481-cnibin\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.589298 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589235 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31d4c6c2-69cd-4240-b617-6cc884b17481-cni-binary-copy\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.589298 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589252 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-host-run-multus-certs\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " 
pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.589298 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589255 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-etc-kubernetes\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.589298 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589269 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31d4c6c2-69cd-4240-b617-6cc884b17481-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.590091 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589309 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-host-run-multus-certs\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.590091 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589345 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-cni-binary-copy\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.590091 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589378 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31d4c6c2-69cd-4240-b617-6cc884b17481-system-cni-dir\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " 
pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.590091 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589390 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31d4c6c2-69cd-4240-b617-6cc884b17481-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.590091 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589419 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31d4c6c2-69cd-4240-b617-6cc884b17481-system-cni-dir\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.590091 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589444 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-host-var-lib-cni-bin\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.590091 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589474 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-multus-cni-dir\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.590091 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589492 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-multus-socket-dir-parent\") pod \"multus-qgbqb\" (UID: 
\"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.590091 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589503 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-host-var-lib-cni-bin\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.590091 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589558 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-multus-socket-dir-parent\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.590091 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589565 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-multus-cni-dir\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.590091 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589599 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-host-var-lib-cni-multus\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.590091 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589617 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-multus-daemon-config\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " 
pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.590091 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589616 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7174271c-a85b-4c6d-872b-f2b384da443b-serviceca\") pod \"node-ca-d2k5z\" (UID: \"7174271c-a85b-4c6d-872b-f2b384da443b\") " pod="openshift-image-registry/node-ca-d2k5z" Apr 23 16:35:19.590091 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589620 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-host-var-lib-cni-multus\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.590091 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589663 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7174271c-a85b-4c6d-872b-f2b384da443b-host\") pod \"node-ca-d2k5z\" (UID: \"7174271c-a85b-4c6d-872b-f2b384da443b\") " pod="openshift-image-registry/node-ca-d2k5z" Apr 23 16:35:19.590091 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589685 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31d4c6c2-69cd-4240-b617-6cc884b17481-os-release\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.590793 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589732 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/31d4c6c2-69cd-4240-b617-6cc884b17481-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " 
pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.590793 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589758 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-os-release\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.590793 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589761 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7174271c-a85b-4c6d-872b-f2b384da443b-host\") pod \"node-ca-d2k5z\" (UID: \"7174271c-a85b-4c6d-872b-f2b384da443b\") " pod="openshift-image-registry/node-ca-d2k5z" Apr 23 16:35:19.590793 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589840 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31d4c6c2-69cd-4240-b617-6cc884b17481-os-release\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.590793 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589896 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-os-release\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.590793 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.589915 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-cni-binary-copy\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.590793 ip-10-0-136-27 
kubenswrapper[2571]: I0423 16:35:19.589988 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31d4c6c2-69cd-4240-b617-6cc884b17481-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.590793 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.590044 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31d4c6c2-69cd-4240-b617-6cc884b17481-cni-binary-copy\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.590793 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.590616 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-multus-daemon-config\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.590793 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.590665 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/31d4c6c2-69cd-4240-b617-6cc884b17481-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.611771 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.611668 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7r8d\" (UniqueName: \"kubernetes.io/projected/7174271c-a85b-4c6d-872b-f2b384da443b-kube-api-access-z7r8d\") pod \"node-ca-d2k5z\" (UID: 
\"7174271c-a85b-4c6d-872b-f2b384da443b\") " pod="openshift-image-registry/node-ca-d2k5z" Apr 23 16:35:19.613932 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.613911 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97dst\" (UniqueName: \"kubernetes.io/projected/9fbf01c4-8974-4a69-881e-b57e55f7b1f1-kube-api-access-97dst\") pod \"multus-qgbqb\" (UID: \"9fbf01c4-8974-4a69-881e-b57e55f7b1f1\") " pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.614050 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.613987 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffqsl\" (UniqueName: \"kubernetes.io/projected/31d4c6c2-69cd-4240-b617-6cc884b17481-kube-api-access-ffqsl\") pod \"multus-additional-cni-plugins-q7tsv\" (UID: \"31d4c6c2-69cd-4240-b617-6cc884b17481\") " pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.671945 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.671915 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jdgl4" Apr 23 16:35:19.679665 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.679639 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" Apr 23 16:35:19.691413 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.691389 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6stz5" Apr 23 16:35:19.696938 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.696913 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:35:19.703522 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.703506 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-tft7d" Apr 23 16:35:19.710126 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.710103 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" Apr 23 16:35:19.716668 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.716647 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-d2k5z" Apr 23 16:35:19.724152 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.724134 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q7tsv" Apr 23 16:35:19.727745 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.727726 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qgbqb" Apr 23 16:35:19.860460 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.860432 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:35:19.993268 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:19.993179 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs\") pod \"network-metrics-daemon-hsxbc\" (UID: \"9f25a094-e342-4690-8028-f1a3ddd77829\") " pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:35:19.993420 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:19.993350 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:19.993462 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:19.993423 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs 
podName:9f25a094-e342-4690-8028-f1a3ddd77829 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:20.993408216 +0000 UTC m=+4.074850767 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs") pod "network-metrics-daemon-hsxbc" (UID: "9f25a094-e342-4690-8028-f1a3ddd77829") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:20.082027 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:20.081937 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fbb1f0d_a155_467a_a35d_78efabc77a02.slice/crio-e14cb8584e5d1112ef2b1da2bd5520f1748c4defc0406ae8a5c8544246331482 WatchSource:0}: Error finding container e14cb8584e5d1112ef2b1da2bd5520f1748c4defc0406ae8a5c8544246331482: Status 404 returned error can't find the container with id e14cb8584e5d1112ef2b1da2bd5520f1748c4defc0406ae8a5c8544246331482 Apr 23 16:35:20.083808 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:20.083781 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35c9d122_d374_4664_b013_c9bdfd5b8759.slice/crio-6c375e0cdf23a8d8f8168257319126e4013dfebf44620b6d2b347f6be3ed3adb WatchSource:0}: Error finding container 6c375e0cdf23a8d8f8168257319126e4013dfebf44620b6d2b347f6be3ed3adb: Status 404 returned error can't find the container with id 6c375e0cdf23a8d8f8168257319126e4013dfebf44620b6d2b347f6be3ed3adb Apr 23 16:35:20.087076 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:20.087050 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c4113f8_445a_41a3_afe0_4d920d77c9c9.slice/crio-696085aa958fc3f0462834c675bf7bf37a2ac36358ce3ec875db2bfdc67ecd16 WatchSource:0}: Error finding container 
696085aa958fc3f0462834c675bf7bf37a2ac36358ce3ec875db2bfdc67ecd16: Status 404 returned error can't find the container with id 696085aa958fc3f0462834c675bf7bf37a2ac36358ce3ec875db2bfdc67ecd16 Apr 23 16:35:20.089056 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:20.088982 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00650519_0327_483d_83cf_59b7e20fd1f5.slice/crio-7228a4122e9c5ef7f8e1abf91e279b7a7e8ca5c10e0b2732c0bb62d800713b23 WatchSource:0}: Error finding container 7228a4122e9c5ef7f8e1abf91e279b7a7e8ca5c10e0b2732c0bb62d800713b23: Status 404 returned error can't find the container with id 7228a4122e9c5ef7f8e1abf91e279b7a7e8ca5c10e0b2732c0bb62d800713b23 Apr 23 16:35:20.090191 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:20.090147 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c427934_9049_45a2_bbd8_6cb89f9149e2.slice/crio-c79e367ee8732bf644a00371d34e64dc7caa8e69eecaada4bcaddfa8c7df74b9 WatchSource:0}: Error finding container c79e367ee8732bf644a00371d34e64dc7caa8e69eecaada4bcaddfa8c7df74b9: Status 404 returned error can't find the container with id c79e367ee8732bf644a00371d34e64dc7caa8e69eecaada4bcaddfa8c7df74b9 Apr 23 16:35:20.091405 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:20.091382 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fbf01c4_8974_4a69_881e_b57e55f7b1f1.slice/crio-a33474281e1df26b7053e3b2b40fad257e4953986d2de2c438e86de7ac1592cc WatchSource:0}: Error finding container a33474281e1df26b7053e3b2b40fad257e4953986d2de2c438e86de7ac1592cc: Status 404 returned error can't find the container with id a33474281e1df26b7053e3b2b40fad257e4953986d2de2c438e86de7ac1592cc Apr 23 16:35:20.092592 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:35:20.092468 2571 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7174271c_a85b_4c6d_872b_f2b384da443b.slice/crio-6e08895350cea7dd2bbad4646e0b44a6803ce9ca75f4930cb507ac4b9bea671a WatchSource:0}: Error finding container 6e08895350cea7dd2bbad4646e0b44a6803ce9ca75f4930cb507ac4b9bea671a: Status 404 returned error can't find the container with id 6e08895350cea7dd2bbad4646e0b44a6803ce9ca75f4930cb507ac4b9bea671a Apr 23 16:35:20.094192 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:20.093552 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thqjv\" (UniqueName: \"kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv\") pod \"network-check-target-vrntx\" (UID: \"7c952840-e4f8-4b49-9f90-0d7aa2618091\") " pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:35:20.094192 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:20.093719 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:20.094192 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:20.093751 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:20.094192 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:20.093766 2571 projected.go:194] Error preparing data for projected volume kube-api-access-thqjv for pod openshift-network-diagnostics/network-check-target-vrntx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:20.094192 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:20.093818 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv 
podName:7c952840-e4f8-4b49-9f90-0d7aa2618091 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:21.093798382 +0000 UTC m=+4.175240931 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-thqjv" (UniqueName: "kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv") pod "network-check-target-vrntx" (UID: "7c952840-e4f8-4b49-9f90-0d7aa2618091") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:20.423255 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:20.423174 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:30:18 +0000 UTC" deadline="2027-09-19 04:07:29.833447461 +0000 UTC" Apr 23 16:35:20.423255 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:20.423214 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12323h32m9.4102357s" Apr 23 16:35:20.521084 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:20.521043 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d2k5z" event={"ID":"7174271c-a85b-4c6d-872b-f2b384da443b","Type":"ContainerStarted","Data":"6e08895350cea7dd2bbad4646e0b44a6803ce9ca75f4930cb507ac4b9bea671a"} Apr 23 16:35:20.524346 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:20.524302 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qgbqb" event={"ID":"9fbf01c4-8974-4a69-881e-b57e55f7b1f1","Type":"ContainerStarted","Data":"a33474281e1df26b7053e3b2b40fad257e4953986d2de2c438e86de7ac1592cc"} Apr 23 16:35:20.530270 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:20.530238 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jdgl4" 
event={"ID":"00650519-0327-483d-83cf-59b7e20fd1f5","Type":"ContainerStarted","Data":"7228a4122e9c5ef7f8e1abf91e279b7a7e8ca5c10e0b2732c0bb62d800713b23"} Apr 23 16:35:20.537252 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:20.537197 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" event={"ID":"35c9d122-d374-4664-b013-c9bdfd5b8759","Type":"ContainerStarted","Data":"6c375e0cdf23a8d8f8168257319126e4013dfebf44620b6d2b347f6be3ed3adb"} Apr 23 16:35:20.543924 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:20.543112 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal" event={"ID":"6402b5e4dc46963653aa05278c9bac43","Type":"ContainerStarted","Data":"5d6582ed6295c30496ff15e7e06ec7e6449bbc2fc9df544385ac11bafea8db06"} Apr 23 16:35:20.553554 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:20.553527 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tft7d" event={"ID":"9c427934-9049-45a2-bbd8-6cb89f9149e2","Type":"ContainerStarted","Data":"c79e367ee8732bf644a00371d34e64dc7caa8e69eecaada4bcaddfa8c7df74b9"} Apr 23 16:35:20.565128 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:20.564187 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7tsv" event={"ID":"31d4c6c2-69cd-4240-b617-6cc884b17481","Type":"ContainerStarted","Data":"52e9964711da55a2f5dfa2f7f4b61ab0dd1590b16ee7be1734769e046b3b1667"} Apr 23 16:35:20.567519 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:20.567467 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" event={"ID":"9c4113f8-445a-41a3-afe0-4d920d77c9c9","Type":"ContainerStarted","Data":"696085aa958fc3f0462834c675bf7bf37a2ac36358ce3ec875db2bfdc67ecd16"} Apr 23 16:35:20.569613 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:20.569590 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" event={"ID":"8fbb1f0d-a155-467a-a35d-78efabc77a02","Type":"ContainerStarted","Data":"e14cb8584e5d1112ef2b1da2bd5520f1748c4defc0406ae8a5c8544246331482"} Apr 23 16:35:20.570903 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:20.570881 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6stz5" event={"ID":"33e0f7f2-93d8-459f-9c61-240a8cdad803","Type":"ContainerStarted","Data":"2c3af2d7a778116be180bfa59803fcf3d2f9ef99d174b13afc500c56c015533d"} Apr 23 16:35:21.000159 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:21.000123 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs\") pod \"network-metrics-daemon-hsxbc\" (UID: \"9f25a094-e342-4690-8028-f1a3ddd77829\") " pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:35:21.000321 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:21.000290 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:21.000378 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:21.000350 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs podName:9f25a094-e342-4690-8028-f1a3ddd77829 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:23.000330543 +0000 UTC m=+6.081773120 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs") pod "network-metrics-daemon-hsxbc" (UID: "9f25a094-e342-4690-8028-f1a3ddd77829") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:21.002884 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:21.002844 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:35:21.100876 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:21.100833 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thqjv\" (UniqueName: \"kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv\") pod \"network-check-target-vrntx\" (UID: \"7c952840-e4f8-4b49-9f90-0d7aa2618091\") " pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:35:21.101041 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:21.101025 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:21.101106 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:21.101049 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:21.101106 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:21.101062 2571 projected.go:194] Error preparing data for projected volume kube-api-access-thqjv for pod openshift-network-diagnostics/network-check-target-vrntx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:21.101208 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:21.101122 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv podName:7c952840-e4f8-4b49-9f90-0d7aa2618091 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:23.101101315 +0000 UTC m=+6.182543865 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-thqjv" (UniqueName: "kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv") pod "network-check-target-vrntx" (UID: "7c952840-e4f8-4b49-9f90-0d7aa2618091") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:21.194912 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:21.194884 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:35:21.509467 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:21.509418 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:35:21.509849 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:21.509536 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrntx" podUID="7c952840-e4f8-4b49-9f90-0d7aa2618091" Apr 23 16:35:21.510252 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:21.510192 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:35:21.510346 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:21.510309 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hsxbc" podUID="9f25a094-e342-4690-8028-f1a3ddd77829" Apr 23 16:35:21.589487 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:21.589403 2571 generic.go:358] "Generic (PLEG): container finished" podID="a7b645763bd4d0264284f3e94b95b589" containerID="7c3255466173db36f30b7dc8554f68f8b92c5b3cb69b782cb276c5368997e9d8" exitCode=0 Apr 23 16:35:21.589643 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:21.589579 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal" event={"ID":"a7b645763bd4d0264284f3e94b95b589","Type":"ContainerDied","Data":"7c3255466173db36f30b7dc8554f68f8b92c5b3cb69b782cb276c5368997e9d8"} Apr 23 16:35:21.605289 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:21.605241 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-27.ec2.internal" podStartSLOduration=3.605222111 podStartE2EDuration="3.605222111s" podCreationTimestamp="2026-04-23 16:35:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:20.563794279 +0000 UTC m=+3.645236840" watchObservedRunningTime="2026-04-23 16:35:21.605222111 +0000 UTC m=+4.686664684" Apr 23 16:35:22.594413 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:22.593766 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal" event={"ID":"a7b645763bd4d0264284f3e94b95b589","Type":"ContainerStarted","Data":"333d7650e5856fc8dc98edc60f1149266a414c6b9009c5c7db00677e9ab05d5b"} Apr 23 16:35:23.016102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:23.015832 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs\") pod \"network-metrics-daemon-hsxbc\" (UID: \"9f25a094-e342-4690-8028-f1a3ddd77829\") " pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:35:23.016102 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:23.016061 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:23.016320 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:23.016147 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs podName:9f25a094-e342-4690-8028-f1a3ddd77829 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:27.016125853 +0000 UTC m=+10.097568418 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs") pod "network-metrics-daemon-hsxbc" (UID: "9f25a094-e342-4690-8028-f1a3ddd77829") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:23.116805 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:23.116754 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thqjv\" (UniqueName: \"kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv\") pod \"network-check-target-vrntx\" (UID: \"7c952840-e4f8-4b49-9f90-0d7aa2618091\") " pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:35:23.116943 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:23.116905 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:23.116943 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:23.116924 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:23.116943 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:23.116936 2571 projected.go:194] Error preparing data for projected volume kube-api-access-thqjv for pod openshift-network-diagnostics/network-check-target-vrntx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:23.117075 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:23.116996 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv podName:7c952840-e4f8-4b49-9f90-0d7aa2618091 nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:27.116976965 +0000 UTC m=+10.198419513 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-thqjv" (UniqueName: "kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv") pod "network-check-target-vrntx" (UID: "7c952840-e4f8-4b49-9f90-0d7aa2618091") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:23.507544 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:23.507461 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:35:23.507711 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:23.507565 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrntx" podUID="7c952840-e4f8-4b49-9f90-0d7aa2618091" Apr 23 16:35:23.507886 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:23.507867 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:35:23.508030 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:23.508008 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hsxbc" podUID="9f25a094-e342-4690-8028-f1a3ddd77829" Apr 23 16:35:25.508913 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:25.508877 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:35:25.509345 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:25.509003 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrntx" podUID="7c952840-e4f8-4b49-9f90-0d7aa2618091" Apr 23 16:35:25.509345 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:25.509152 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:35:25.509345 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:25.509238 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hsxbc" podUID="9f25a094-e342-4690-8028-f1a3ddd77829" Apr 23 16:35:27.048185 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:27.047934 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs\") pod \"network-metrics-daemon-hsxbc\" (UID: \"9f25a094-e342-4690-8028-f1a3ddd77829\") " pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:35:27.048185 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:27.048078 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:27.048185 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:27.048143 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs podName:9f25a094-e342-4690-8028-f1a3ddd77829 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:35.048123908 +0000 UTC m=+18.129566479 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs") pod "network-metrics-daemon-hsxbc" (UID: "9f25a094-e342-4690-8028-f1a3ddd77829") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:27.148722 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:27.148456 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thqjv\" (UniqueName: \"kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv\") pod \"network-check-target-vrntx\" (UID: \"7c952840-e4f8-4b49-9f90-0d7aa2618091\") " pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:35:27.148722 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:27.148590 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:27.148722 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:27.148613 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:27.148722 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:27.148626 2571 projected.go:194] Error preparing data for projected volume kube-api-access-thqjv for pod openshift-network-diagnostics/network-check-target-vrntx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:27.148722 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:27.148686 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv podName:7c952840-e4f8-4b49-9f90-0d7aa2618091 nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:35.148666863 +0000 UTC m=+18.230109414 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-thqjv" (UniqueName: "kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv") pod "network-check-target-vrntx" (UID: "7c952840-e4f8-4b49-9f90-0d7aa2618091") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:27.508039 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:27.507962 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:35:27.508212 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:27.508091 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hsxbc" podUID="9f25a094-e342-4690-8028-f1a3ddd77829" Apr 23 16:35:27.508212 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:27.508129 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:35:27.508327 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:27.508208 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vrntx" podUID="7c952840-e4f8-4b49-9f90-0d7aa2618091" Apr 23 16:35:29.507098 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:29.507065 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:35:29.507098 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:29.507087 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:35:29.507584 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:29.507187 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrntx" podUID="7c952840-e4f8-4b49-9f90-0d7aa2618091" Apr 23 16:35:29.507584 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:29.507327 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hsxbc" podUID="9f25a094-e342-4690-8028-f1a3ddd77829" Apr 23 16:35:31.507213 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:31.507039 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:35:31.507213 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:31.507099 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:35:31.507213 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:31.507209 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hsxbc" podUID="9f25a094-e342-4690-8028-f1a3ddd77829" Apr 23 16:35:31.507835 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:31.507352 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrntx" podUID="7c952840-e4f8-4b49-9f90-0d7aa2618091" Apr 23 16:35:33.506532 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:33.506501 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:35:33.507018 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:33.506504 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:35:33.507018 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:33.506630 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vrntx" podUID="7c952840-e4f8-4b49-9f90-0d7aa2618091" Apr 23 16:35:33.507018 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:33.506732 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hsxbc" podUID="9f25a094-e342-4690-8028-f1a3ddd77829" Apr 23 16:35:35.106751 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:35.106690 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs\") pod \"network-metrics-daemon-hsxbc\" (UID: \"9f25a094-e342-4690-8028-f1a3ddd77829\") " pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:35:35.107177 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:35.106846 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:35.107177 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:35.106920 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs podName:9f25a094-e342-4690-8028-f1a3ddd77829 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:51.106899855 +0000 UTC m=+34.188342414 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs") pod "network-metrics-daemon-hsxbc" (UID: "9f25a094-e342-4690-8028-f1a3ddd77829") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:35.207360 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:35.207320 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thqjv\" (UniqueName: \"kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv\") pod \"network-check-target-vrntx\" (UID: \"7c952840-e4f8-4b49-9f90-0d7aa2618091\") " pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:35:35.207518 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:35.207466 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:35.207518 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:35.207491 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:35.207518 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:35.207505 2571 projected.go:194] Error preparing data for projected volume kube-api-access-thqjv for pod openshift-network-diagnostics/network-check-target-vrntx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:35.207635 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:35.207572 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv podName:7c952840-e4f8-4b49-9f90-0d7aa2618091 nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:51.20755343 +0000 UTC m=+34.288995980 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-thqjv" (UniqueName: "kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv") pod "network-check-target-vrntx" (UID: "7c952840-e4f8-4b49-9f90-0d7aa2618091") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:35.506542 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:35.506447 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:35:35.506710 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:35.506464 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:35:35.506710 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:35.506590 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrntx" podUID="7c952840-e4f8-4b49-9f90-0d7aa2618091" Apr 23 16:35:35.506710 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:35.506680 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hsxbc" podUID="9f25a094-e342-4690-8028-f1a3ddd77829" Apr 23 16:35:37.506906 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:37.506547 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:35:37.506906 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:37.506663 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrntx" podUID="7c952840-e4f8-4b49-9f90-0d7aa2618091" Apr 23 16:35:37.506906 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:37.506776 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:35:37.508615 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:37.507848 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hsxbc" podUID="9f25a094-e342-4690-8028-f1a3ddd77829" Apr 23 16:35:37.629674 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:37.629592 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" event={"ID":"8fbb1f0d-a155-467a-a35d-78efabc77a02","Type":"ContainerStarted","Data":"7aab263d3ef02316325da773d58ab2d24584990960a160a9ea772012a44a381b"} Apr 23 16:35:37.631076 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:37.631040 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6stz5" event={"ID":"33e0f7f2-93d8-459f-9c61-240a8cdad803","Type":"ContainerStarted","Data":"2e79b89f41142b497ea5db097599a28afd8b7936220aa582ce3d482c5fb347a9"} Apr 23 16:35:37.632372 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:37.632350 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jdgl4" event={"ID":"00650519-0327-483d-83cf-59b7e20fd1f5","Type":"ContainerStarted","Data":"0994e67a3ffb8d39aa3c23e9a1f570dba1c7874d5ac70dbaaa2fae85dd6f5066"} Apr 23 16:35:37.633607 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:37.633578 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" event={"ID":"35c9d122-d374-4664-b013-c9bdfd5b8759","Type":"ContainerStarted","Data":"c2b08144263242f86887feec83f4c5c06b894e652aee76212621019fa1748ec4"} Apr 23 16:35:37.659884 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:37.659833 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-4kpgn" podStartSLOduration=3.3853633260000002 podStartE2EDuration="20.659821052s" podCreationTimestamp="2026-04-23 16:35:17 +0000 UTC" firstStartedPulling="2026-04-23 16:35:20.084832027 +0000 UTC m=+3.166274573" lastFinishedPulling="2026-04-23 16:35:37.359289747 +0000 UTC m=+20.440732299" observedRunningTime="2026-04-23 
16:35:37.659423305 +0000 UTC m=+20.740865884" watchObservedRunningTime="2026-04-23 16:35:37.659821052 +0000 UTC m=+20.741263621"
Apr 23 16:35:37.660052 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:37.660024 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-27.ec2.internal" podStartSLOduration=19.660016725 podStartE2EDuration="19.660016725s" podCreationTimestamp="2026-04-23 16:35:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:22.609092632 +0000 UTC m=+5.690535202" watchObservedRunningTime="2026-04-23 16:35:37.660016725 +0000 UTC m=+20.741459293"
Apr 23 16:35:37.692406 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:37.692351 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6stz5" podStartSLOduration=3.429281913 podStartE2EDuration="20.69233398s" podCreationTimestamp="2026-04-23 16:35:17 +0000 UTC" firstStartedPulling="2026-04-23 16:35:20.096537503 +0000 UTC m=+3.177980051" lastFinishedPulling="2026-04-23 16:35:37.359589571 +0000 UTC m=+20.441032118" observedRunningTime="2026-04-23 16:35:37.692023294 +0000 UTC m=+20.773465896" watchObservedRunningTime="2026-04-23 16:35:37.69233398 +0000 UTC m=+20.773776550"
Apr 23 16:35:37.693180 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:37.693142 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-jdgl4" podStartSLOduration=3.425701745 podStartE2EDuration="20.693111087s" podCreationTimestamp="2026-04-23 16:35:17 +0000 UTC" firstStartedPulling="2026-04-23 16:35:20.091874942 +0000 UTC m=+3.173317493" lastFinishedPulling="2026-04-23 16:35:37.359284274 +0000 UTC m=+20.440726835" observedRunningTime="2026-04-23 16:35:37.675868476 +0000 UTC m=+20.757311056" watchObservedRunningTime="2026-04-23 16:35:37.693111087 +0000 UTC m=+20.774553657"
Apr 23 16:35:38.636959 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:38.636726 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d2k5z" event={"ID":"7174271c-a85b-4c6d-872b-f2b384da443b","Type":"ContainerStarted","Data":"4a8adb155ea0e0ecb9327080907bc8cf5fc42e13a7a6a5724c2e444993c1e028"}
Apr 23 16:35:38.638422 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:38.638396 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qgbqb" event={"ID":"9fbf01c4-8974-4a69-881e-b57e55f7b1f1","Type":"ContainerStarted","Data":"562841a3498c3877ed2132e8b2acb3e2aa375b2127aa1c21209e29798d34565d"}
Apr 23 16:35:38.639552 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:38.639529 2571 generic.go:358] "Generic (PLEG): container finished" podID="31d4c6c2-69cd-4240-b617-6cc884b17481" containerID="dfc1c9085aa9629155b043111b98cb8232b75756ea6a862bb25dd7b7a5d95d01" exitCode=0
Apr 23 16:35:38.639718 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:38.639589 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7tsv" event={"ID":"31d4c6c2-69cd-4240-b617-6cc884b17481","Type":"ContainerDied","Data":"dfc1c9085aa9629155b043111b98cb8232b75756ea6a862bb25dd7b7a5d95d01"}
Apr 23 16:35:38.641993 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:38.641969 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" event={"ID":"9c4113f8-445a-41a3-afe0-4d920d77c9c9","Type":"ContainerStarted","Data":"07c51e3c9f5c11b758694c2f1cdb73ef7a54c71e74662b06902fca4d47e53053"}
Apr 23 16:35:38.642079 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:38.641998 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" event={"ID":"9c4113f8-445a-41a3-afe0-4d920d77c9c9","Type":"ContainerStarted","Data":"7df301cf47ca4e22f2c611ca99bbe7ce143eae758dd729673e85ed7adcf41e4c"}
Apr 23 16:35:38.642079 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:38.642008 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" event={"ID":"9c4113f8-445a-41a3-afe0-4d920d77c9c9","Type":"ContainerStarted","Data":"f447bd978c1fb9c99edb50202878f1125d2485689ca41c0f8a30c786178aee48"}
Apr 23 16:35:38.642079 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:38.642015 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" event={"ID":"9c4113f8-445a-41a3-afe0-4d920d77c9c9","Type":"ContainerStarted","Data":"109170c64f3a35e077028b94bfe64e2cf9c55acc152a3a2fc9ab2260fbb3ca6d"}
Apr 23 16:35:38.642079 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:38.642038 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" event={"ID":"9c4113f8-445a-41a3-afe0-4d920d77c9c9","Type":"ContainerStarted","Data":"5da8710637b1426e32a5cc10596004aedac2192b5fdf0232cf0e89cb1fb2b27c"}
Apr 23 16:35:38.642079 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:38.642046 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" event={"ID":"9c4113f8-445a-41a3-afe0-4d920d77c9c9","Type":"ContainerStarted","Data":"3539fc5efeef10c8a7a8e29eb714f4bec2df49f7258d0ef4fa1d595b78ff6701"}
Apr 23 16:35:38.686398 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:38.686353 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-d2k5z" podStartSLOduration=4.422689521 podStartE2EDuration="21.686340375s" podCreationTimestamp="2026-04-23 16:35:17 +0000 UTC" firstStartedPulling="2026-04-23 16:35:20.095635417 +0000 UTC m=+3.177077965" lastFinishedPulling="2026-04-23 16:35:37.359286265 +0000 UTC m=+20.440728819" observedRunningTime="2026-04-23 16:35:38.655579772 +0000 UTC m=+21.737022342" watchObservedRunningTime="2026-04-23 16:35:38.686340375 +0000 UTC m=+21.767782951"
Apr 23 16:35:38.711498 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:38.711456 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qgbqb" podStartSLOduration=4.002746339 podStartE2EDuration="21.711443343s" podCreationTimestamp="2026-04-23 16:35:17 +0000 UTC" firstStartedPulling="2026-04-23 16:35:20.09321774 +0000 UTC m=+3.174660286" lastFinishedPulling="2026-04-23 16:35:37.801914738 +0000 UTC m=+20.883357290" observedRunningTime="2026-04-23 16:35:38.711067827 +0000 UTC m=+21.792510393" watchObservedRunningTime="2026-04-23 16:35:38.711443343 +0000 UTC m=+21.792885911"
Apr 23 16:35:38.838536 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:38.838510 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 23 16:35:39.459575 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:39.459462 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T16:35:38.838532681Z","UUID":"3cefbfe7-4b0b-4d74-a172-8620fcfe3306","Handler":null,"Name":"","Endpoint":""}
Apr 23 16:35:39.462546 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:39.462517 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 23 16:35:39.462546 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:39.462553 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 23 16:35:39.507493 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:39.507465 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hsxbc"
Apr 23 16:35:39.507674 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:39.507603 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hsxbc" podUID="9f25a094-e342-4690-8028-f1a3ddd77829"
Apr 23 16:35:39.509026 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:39.509005 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrntx"
Apr 23 16:35:39.509136 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:39.509079 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrntx" podUID="7c952840-e4f8-4b49-9f90-0d7aa2618091"
Apr 23 16:35:39.645640 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:39.645606 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" event={"ID":"35c9d122-d374-4664-b013-c9bdfd5b8759","Type":"ContainerStarted","Data":"2d4d178ecb803c7cbf35380c33f76c75435517fbaa119dc799f9f6aa616552b8"}
Apr 23 16:35:39.646984 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:39.646927 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tft7d" event={"ID":"9c427934-9049-45a2-bbd8-6cb89f9149e2","Type":"ContainerStarted","Data":"35c9306a519a87c9f6a68767f40cd5547065f9288e568e647611ecab5265563a"}
Apr 23 16:35:40.654128 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:40.654033 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" event={"ID":"9c4113f8-445a-41a3-afe0-4d920d77c9c9","Type":"ContainerStarted","Data":"b534b8bf080a2b696ac12192e6e0be56cf7eceb73d52c68369f49146fc117e90"}
Apr 23 16:35:40.655641 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:40.655608 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" event={"ID":"35c9d122-d374-4664-b013-c9bdfd5b8759","Type":"ContainerStarted","Data":"109195571e95eeaee532b33b232e7a2913eb45f430f342d131d55f9516257532"}
Apr 23 16:35:40.690914 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:40.690874 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-tft7d" podStartSLOduration=6.422693696 podStartE2EDuration="23.690862169s" podCreationTimestamp="2026-04-23 16:35:17 +0000 UTC" firstStartedPulling="2026-04-23 16:35:20.0928637 +0000 UTC m=+3.174306252" lastFinishedPulling="2026-04-23 16:35:37.361032167 +0000 UTC m=+20.442474725" observedRunningTime="2026-04-23 16:35:39.70204353 +0000 UTC m=+22.783486110" watchObservedRunningTime="2026-04-23 16:35:40.690862169 +0000 UTC m=+23.772304738"
Apr 23 16:35:40.691082 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:40.690941 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vv85t" podStartSLOduration=4.237231296 podStartE2EDuration="23.690937182s" podCreationTimestamp="2026-04-23 16:35:17 +0000 UTC" firstStartedPulling="2026-04-23 16:35:20.08626463 +0000 UTC m=+3.167707187" lastFinishedPulling="2026-04-23 16:35:39.539970511 +0000 UTC m=+22.621413073" observedRunningTime="2026-04-23 16:35:40.689505942 +0000 UTC m=+23.770948522" watchObservedRunningTime="2026-04-23 16:35:40.690937182 +0000 UTC m=+23.772379751"
Apr 23 16:35:41.506691 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:41.506657 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hsxbc"
Apr 23 16:35:41.506890 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:41.506657 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrntx"
Apr 23 16:35:41.506890 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:41.506818 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hsxbc" podUID="9f25a094-e342-4690-8028-f1a3ddd77829"
Apr 23 16:35:41.506993 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:41.506893 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrntx" podUID="7c952840-e4f8-4b49-9f90-0d7aa2618091"
Apr 23 16:35:42.479386 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:42.479347 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-jdgl4"
Apr 23 16:35:42.480504 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:42.480481 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-jdgl4"
Apr 23 16:35:42.659746 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:42.659718 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-jdgl4"
Apr 23 16:35:42.660385 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:42.660365 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-jdgl4"
Apr 23 16:35:43.507209 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:43.506994 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hsxbc"
Apr 23 16:35:43.507652 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:43.506994 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrntx"
Apr 23 16:35:43.507652 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:43.507309 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hsxbc" podUID="9f25a094-e342-4690-8028-f1a3ddd77829"
Apr 23 16:35:43.507652 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:43.507386 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrntx" podUID="7c952840-e4f8-4b49-9f90-0d7aa2618091"
Apr 23 16:35:43.665170 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:43.665137 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" event={"ID":"9c4113f8-445a-41a3-afe0-4d920d77c9c9","Type":"ContainerStarted","Data":"c0c8af14c3589aef6fe7e146047ac43ff0f6f999cebe85474efd8c6d02c6f2ea"}
Apr 23 16:35:43.710212 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:43.710159 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" podStartSLOduration=9.080101359 podStartE2EDuration="26.710143937s" podCreationTimestamp="2026-04-23 16:35:17 +0000 UTC" firstStartedPulling="2026-04-23 16:35:20.08951888 +0000 UTC m=+3.170961429" lastFinishedPulling="2026-04-23 16:35:37.71956146 +0000 UTC m=+20.801004007" observedRunningTime="2026-04-23 16:35:43.70657271 +0000 UTC m=+26.788015279" watchObservedRunningTime="2026-04-23 16:35:43.710143937 +0000 UTC m=+26.791586504"
Apr 23 16:35:44.669170 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:44.669128 2571 generic.go:358] "Generic (PLEG): container finished" podID="31d4c6c2-69cd-4240-b617-6cc884b17481" containerID="1ee1154630cde161340868b31b29b2f58c16c3bd347f52a400a4449130eb3807" exitCode=0
Apr 23 16:35:44.669606 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:44.669183 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7tsv" event={"ID":"31d4c6c2-69cd-4240-b617-6cc884b17481","Type":"ContainerDied","Data":"1ee1154630cde161340868b31b29b2f58c16c3bd347f52a400a4449130eb3807"}
Apr 23 16:35:44.670050 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:44.669987 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jbv88"
Apr 23 16:35:44.670050 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:44.670019 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jbv88"
Apr 23 16:35:44.670050 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:44.670032 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jbv88"
Apr 23 16:35:44.692569 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:44.692541 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jbv88"
Apr 23 16:35:44.692681 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:44.692618 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jbv88"
Apr 23 16:35:45.423533 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:45.423305 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vrntx"]
Apr 23 16:35:45.423674 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:45.423643 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrntx"
Apr 23 16:35:45.423789 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:45.423763 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrntx" podUID="7c952840-e4f8-4b49-9f90-0d7aa2618091"
Apr 23 16:35:45.427054 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:45.426800 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hsxbc"]
Apr 23 16:35:45.427054 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:45.426918 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hsxbc"
Apr 23 16:35:45.427054 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:45.427019 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hsxbc" podUID="9f25a094-e342-4690-8028-f1a3ddd77829"
Apr 23 16:35:45.672618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:45.672583 2571 generic.go:358] "Generic (PLEG): container finished" podID="31d4c6c2-69cd-4240-b617-6cc884b17481" containerID="674ddcf648ccd39277e3b76c0d715dafcb8dc135d9f7118a2a1014f320b9e6f5" exitCode=0
Apr 23 16:35:45.673102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:45.672665 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7tsv" event={"ID":"31d4c6c2-69cd-4240-b617-6cc884b17481","Type":"ContainerDied","Data":"674ddcf648ccd39277e3b76c0d715dafcb8dc135d9f7118a2a1014f320b9e6f5"}
Apr 23 16:35:46.675977 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:46.675944 2571 generic.go:358] "Generic (PLEG): container finished" podID="31d4c6c2-69cd-4240-b617-6cc884b17481" containerID="f987f2d41a3c7e460ca7ba2ad68f87982bebf381e46838ad6955cd43442d6505" exitCode=0
Apr 23 16:35:46.676327 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:46.676040 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7tsv" event={"ID":"31d4c6c2-69cd-4240-b617-6cc884b17481","Type":"ContainerDied","Data":"f987f2d41a3c7e460ca7ba2ad68f87982bebf381e46838ad6955cd43442d6505"}
Apr 23 16:35:47.508289 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:47.508261 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrntx"
Apr 23 16:35:47.508469 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:47.508358 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hsxbc"
Apr 23 16:35:47.508469 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:47.508379 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrntx" podUID="7c952840-e4f8-4b49-9f90-0d7aa2618091"
Apr 23 16:35:47.508469 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:47.508444 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hsxbc" podUID="9f25a094-e342-4690-8028-f1a3ddd77829"
Apr 23 16:35:49.509745 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:49.509715 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrntx"
Apr 23 16:35:49.510407 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:49.509715 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hsxbc"
Apr 23 16:35:49.510407 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:49.509820 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vrntx" podUID="7c952840-e4f8-4b49-9f90-0d7aa2618091"
Apr 23 16:35:49.510407 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:49.509930 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hsxbc" podUID="9f25a094-e342-4690-8028-f1a3ddd77829"
Apr 23 16:35:50.717015 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.716983 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-27.ec2.internal" event="NodeReady"
Apr 23 16:35:50.717512 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.717154 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 23 16:35:50.814581 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.814535 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mfgns"]
Apr 23 16:35:50.830615 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.830591 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bbhgp"]
Apr 23 16:35:50.830776 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.830753 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mfgns"
Apr 23 16:35:50.836665 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.836446 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 16:35:50.836799 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.836469 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vvvrx\""
Apr 23 16:35:50.836799 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.836474 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 16:35:50.842491 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.842472 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bbhgp"
Apr 23 16:35:50.845862 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.845839 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 16:35:50.845965 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.845845 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l9v47\""
Apr 23 16:35:50.846262 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.846243 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 16:35:50.846388 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.846246 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 16:35:50.848466 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.848445 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bbhgp"]
Apr 23 16:35:50.855821 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.855803 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mfgns"]
Apr 23 16:35:50.931230 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.931199 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert\") pod \"ingress-canary-bbhgp\" (UID: \"ec4a3bc4-5872-47af-b0d2-34143e0f2dea\") " pod="openshift-ingress-canary/ingress-canary-bbhgp"
Apr 23 16:35:50.931230 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.931233 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xh24\" (UniqueName: \"kubernetes.io/projected/1026d702-16e3-45e1-821e-0f0a702f27d3-kube-api-access-4xh24\") pod \"dns-default-mfgns\" (UID: \"1026d702-16e3-45e1-821e-0f0a702f27d3\") " pod="openshift-dns/dns-default-mfgns"
Apr 23 16:35:50.931410 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.931273 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1026d702-16e3-45e1-821e-0f0a702f27d3-config-volume\") pod \"dns-default-mfgns\" (UID: \"1026d702-16e3-45e1-821e-0f0a702f27d3\") " pod="openshift-dns/dns-default-mfgns"
Apr 23 16:35:50.931410 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.931287 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls\") pod \"dns-default-mfgns\" (UID: \"1026d702-16e3-45e1-821e-0f0a702f27d3\") " pod="openshift-dns/dns-default-mfgns"
Apr 23 16:35:50.931410 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.931309 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql2pg\" (UniqueName: \"kubernetes.io/projected/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-kube-api-access-ql2pg\") pod \"ingress-canary-bbhgp\" (UID: \"ec4a3bc4-5872-47af-b0d2-34143e0f2dea\") " pod="openshift-ingress-canary/ingress-canary-bbhgp"
Apr 23 16:35:50.931410 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:50.931329 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1026d702-16e3-45e1-821e-0f0a702f27d3-tmp-dir\") pod \"dns-default-mfgns\" (UID: \"1026d702-16e3-45e1-821e-0f0a702f27d3\") " pod="openshift-dns/dns-default-mfgns"
Apr 23 16:35:51.032086 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.031995 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1026d702-16e3-45e1-821e-0f0a702f27d3-config-volume\") pod \"dns-default-mfgns\" (UID: \"1026d702-16e3-45e1-821e-0f0a702f27d3\") " pod="openshift-dns/dns-default-mfgns"
Apr 23 16:35:51.032086 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.032038 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls\") pod \"dns-default-mfgns\" (UID: \"1026d702-16e3-45e1-821e-0f0a702f27d3\") " pod="openshift-dns/dns-default-mfgns"
Apr 23 16:35:51.032319 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:51.032131 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:35:51.032319 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.032167 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ql2pg\" (UniqueName: \"kubernetes.io/projected/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-kube-api-access-ql2pg\") pod \"ingress-canary-bbhgp\" (UID: \"ec4a3bc4-5872-47af-b0d2-34143e0f2dea\") " pod="openshift-ingress-canary/ingress-canary-bbhgp"
Apr 23 16:35:51.032319 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:51.032195 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls podName:1026d702-16e3-45e1-821e-0f0a702f27d3 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:51.532172069 +0000 UTC m=+34.613614650 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls") pod "dns-default-mfgns" (UID: "1026d702-16e3-45e1-821e-0f0a702f27d3") : secret "dns-default-metrics-tls" not found
Apr 23 16:35:51.032319 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.032231 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1026d702-16e3-45e1-821e-0f0a702f27d3-tmp-dir\") pod \"dns-default-mfgns\" (UID: \"1026d702-16e3-45e1-821e-0f0a702f27d3\") " pod="openshift-dns/dns-default-mfgns"
Apr 23 16:35:51.032319 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.032303 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert\") pod \"ingress-canary-bbhgp\" (UID: \"ec4a3bc4-5872-47af-b0d2-34143e0f2dea\") " pod="openshift-ingress-canary/ingress-canary-bbhgp"
Apr 23 16:35:51.032602 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.032327 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xh24\" (UniqueName: \"kubernetes.io/projected/1026d702-16e3-45e1-821e-0f0a702f27d3-kube-api-access-4xh24\") pod \"dns-default-mfgns\" (UID: \"1026d702-16e3-45e1-821e-0f0a702f27d3\") " pod="openshift-dns/dns-default-mfgns"
Apr 23 16:35:51.032602 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:51.032463 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:35:51.032602 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:51.032513 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert podName:ec4a3bc4-5872-47af-b0d2-34143e0f2dea nodeName:}" failed. No retries permitted until 2026-04-23 16:35:51.532497264 +0000 UTC m=+34.613939813 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert") pod "ingress-canary-bbhgp" (UID: "ec4a3bc4-5872-47af-b0d2-34143e0f2dea") : secret "canary-serving-cert" not found
Apr 23 16:35:51.033093 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.033070 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1026d702-16e3-45e1-821e-0f0a702f27d3-tmp-dir\") pod \"dns-default-mfgns\" (UID: \"1026d702-16e3-45e1-821e-0f0a702f27d3\") " pod="openshift-dns/dns-default-mfgns"
Apr 23 16:35:51.033231 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.033214 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1026d702-16e3-45e1-821e-0f0a702f27d3-config-volume\") pod \"dns-default-mfgns\" (UID: \"1026d702-16e3-45e1-821e-0f0a702f27d3\") " pod="openshift-dns/dns-default-mfgns"
Apr 23 16:35:51.056935 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.056911 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xh24\" (UniqueName: \"kubernetes.io/projected/1026d702-16e3-45e1-821e-0f0a702f27d3-kube-api-access-4xh24\") pod \"dns-default-mfgns\" (UID: \"1026d702-16e3-45e1-821e-0f0a702f27d3\") " pod="openshift-dns/dns-default-mfgns"
Apr 23 16:35:51.058166 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.058143 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql2pg\" (UniqueName: \"kubernetes.io/projected/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-kube-api-access-ql2pg\") pod \"ingress-canary-bbhgp\" (UID: \"ec4a3bc4-5872-47af-b0d2-34143e0f2dea\") " pod="openshift-ingress-canary/ingress-canary-bbhgp"
Apr 23 16:35:51.133514 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.133475 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs\") pod \"network-metrics-daemon-hsxbc\" (UID: \"9f25a094-e342-4690-8028-f1a3ddd77829\") " pod="openshift-multus/network-metrics-daemon-hsxbc"
Apr 23 16:35:51.133674 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:51.133643 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:51.133759 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:51.133725 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs podName:9f25a094-e342-4690-8028-f1a3ddd77829 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:23.133710277 +0000 UTC m=+66.215152824 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs") pod "network-metrics-daemon-hsxbc" (UID: "9f25a094-e342-4690-8028-f1a3ddd77829") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:51.234482 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.234441 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thqjv\" (UniqueName: \"kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv\") pod \"network-check-target-vrntx\" (UID: \"7c952840-e4f8-4b49-9f90-0d7aa2618091\") " pod="openshift-network-diagnostics/network-check-target-vrntx"
Apr 23 16:35:51.234643 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:51.234568 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:51.234643 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:51.234590 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:51.234643 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:51.234601 2571 projected.go:194] Error preparing data for projected volume kube-api-access-thqjv for pod openshift-network-diagnostics/network-check-target-vrntx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:51.234782 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:51.234661 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv podName:7c952840-e4f8-4b49-9f90-0d7aa2618091 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:23.234648729 +0000 UTC m=+66.316091276 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-thqjv" (UniqueName: "kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv") pod "network-check-target-vrntx" (UID: "7c952840-e4f8-4b49-9f90-0d7aa2618091") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:51.510293 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.509928 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrntx"
Apr 23 16:35:51.510293 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.509937 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hsxbc"
Apr 23 16:35:51.513113 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.513086 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tpj7t\""
Apr 23 16:35:51.513269 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.513248 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 16:35:51.514333 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.514310 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nldpm\""
Apr 23 16:35:51.514333 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.514326 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 16:35:51.514531 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.514484 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 16:35:51.536736 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.536715 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert\") pod \"ingress-canary-bbhgp\" (UID: \"ec4a3bc4-5872-47af-b0d2-34143e0f2dea\") " pod="openshift-ingress-canary/ingress-canary-bbhgp"
Apr 23 16:35:51.536836 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:51.536765 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls\") pod \"dns-default-mfgns\" (UID: \"1026d702-16e3-45e1-821e-0f0a702f27d3\") " pod="openshift-dns/dns-default-mfgns"
Apr 23 16:35:51.536894 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:51.536872 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:35:51.536894 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:51.536872 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:35:51.536968 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:51.536929 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls podName:1026d702-16e3-45e1-821e-0f0a702f27d3 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:52.536912828 +0000 UTC m=+35.618355376 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls") pod "dns-default-mfgns" (UID: "1026d702-16e3-45e1-821e-0f0a702f27d3") : secret "dns-default-metrics-tls" not found Apr 23 16:35:51.536968 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:51.536944 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert podName:ec4a3bc4-5872-47af-b0d2-34143e0f2dea nodeName:}" failed. No retries permitted until 2026-04-23 16:35:52.536936418 +0000 UTC m=+35.618378965 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert") pod "ingress-canary-bbhgp" (UID: "ec4a3bc4-5872-47af-b0d2-34143e0f2dea") : secret "canary-serving-cert" not found Apr 23 16:35:52.543883 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:52.543849 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls\") pod \"dns-default-mfgns\" (UID: \"1026d702-16e3-45e1-821e-0f0a702f27d3\") " pod="openshift-dns/dns-default-mfgns" Apr 23 16:35:52.544257 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:52.543911 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert\") pod \"ingress-canary-bbhgp\" (UID: \"ec4a3bc4-5872-47af-b0d2-34143e0f2dea\") " pod="openshift-ingress-canary/ingress-canary-bbhgp" Apr 23 16:35:52.544257 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:52.543995 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:52.544257 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:52.543995 2571 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:52.544257 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:52.544054 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert podName:ec4a3bc4-5872-47af-b0d2-34143e0f2dea nodeName:}" failed. No retries permitted until 2026-04-23 16:35:54.544039334 +0000 UTC m=+37.625481880 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert") pod "ingress-canary-bbhgp" (UID: "ec4a3bc4-5872-47af-b0d2-34143e0f2dea") : secret "canary-serving-cert" not found Apr 23 16:35:52.544257 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:52.544071 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls podName:1026d702-16e3-45e1-821e-0f0a702f27d3 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:54.544062856 +0000 UTC m=+37.625505403 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls") pod "dns-default-mfgns" (UID: "1026d702-16e3-45e1-821e-0f0a702f27d3") : secret "dns-default-metrics-tls" not found Apr 23 16:35:53.691868 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:53.691837 2571 generic.go:358] "Generic (PLEG): container finished" podID="31d4c6c2-69cd-4240-b617-6cc884b17481" containerID="f52f44e9bb9cad2c8db47c1f4e3c71bff477c1434189fc8e448a8c6318a8e851" exitCode=0 Apr 23 16:35:53.692304 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:53.691879 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7tsv" event={"ID":"31d4c6c2-69cd-4240-b617-6cc884b17481","Type":"ContainerDied","Data":"f52f44e9bb9cad2c8db47c1f4e3c71bff477c1434189fc8e448a8c6318a8e851"} Apr 23 16:35:54.556637 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:54.556600 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert\") pod \"ingress-canary-bbhgp\" (UID: \"ec4a3bc4-5872-47af-b0d2-34143e0f2dea\") " pod="openshift-ingress-canary/ingress-canary-bbhgp" Apr 23 16:35:54.556807 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:54.556656 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls\") pod \"dns-default-mfgns\" (UID: \"1026d702-16e3-45e1-821e-0f0a702f27d3\") " pod="openshift-dns/dns-default-mfgns" Apr 23 16:35:54.556807 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:54.556762 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:54.556807 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:54.556765 2571 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:54.556904 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:54.556813 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls podName:1026d702-16e3-45e1-821e-0f0a702f27d3 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:58.556800401 +0000 UTC m=+41.638242949 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls") pod "dns-default-mfgns" (UID: "1026d702-16e3-45e1-821e-0f0a702f27d3") : secret "dns-default-metrics-tls" not found Apr 23 16:35:54.556904 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:54.556828 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert podName:ec4a3bc4-5872-47af-b0d2-34143e0f2dea nodeName:}" failed. No retries permitted until 2026-04-23 16:35:58.556821862 +0000 UTC m=+41.638264409 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert") pod "ingress-canary-bbhgp" (UID: "ec4a3bc4-5872-47af-b0d2-34143e0f2dea") : secret "canary-serving-cert" not found Apr 23 16:35:54.696233 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:54.696203 2571 generic.go:358] "Generic (PLEG): container finished" podID="31d4c6c2-69cd-4240-b617-6cc884b17481" containerID="dd06aa2fcd04b64694500c94b6cef0548d1dec53f4b32aeaa186762f59e0fa2a" exitCode=0 Apr 23 16:35:54.696564 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:54.696251 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7tsv" event={"ID":"31d4c6c2-69cd-4240-b617-6cc884b17481","Type":"ContainerDied","Data":"dd06aa2fcd04b64694500c94b6cef0548d1dec53f4b32aeaa186762f59e0fa2a"} Apr 23 16:35:55.700707 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:55.700662 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7tsv" event={"ID":"31d4c6c2-69cd-4240-b617-6cc884b17481","Type":"ContainerStarted","Data":"44a89df36ddf82c59fe5cb1711548d085264c1a8c276ddc31ea79e1e604ceaf2"} Apr 23 16:35:55.728800 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:55.728751 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-q7tsv" podStartSLOduration=6.210645127 podStartE2EDuration="38.72873814s" podCreationTimestamp="2026-04-23 16:35:17 +0000 UTC" firstStartedPulling="2026-04-23 16:35:20.096266654 +0000 UTC m=+3.177709216" lastFinishedPulling="2026-04-23 16:35:52.614359667 +0000 UTC m=+35.695802229" observedRunningTime="2026-04-23 16:35:55.726989285 +0000 UTC m=+38.808431853" watchObservedRunningTime="2026-04-23 16:35:55.72873814 +0000 UTC m=+38.810180709" Apr 23 16:35:58.587534 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:58.587499 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert\") pod \"ingress-canary-bbhgp\" (UID: \"ec4a3bc4-5872-47af-b0d2-34143e0f2dea\") " pod="openshift-ingress-canary/ingress-canary-bbhgp" Apr 23 16:35:58.587947 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:35:58.587553 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls\") pod \"dns-default-mfgns\" (UID: \"1026d702-16e3-45e1-821e-0f0a702f27d3\") " pod="openshift-dns/dns-default-mfgns" Apr 23 16:35:58.587947 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:58.587642 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:58.587947 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:58.587649 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:58.587947 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:58.587712 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls podName:1026d702-16e3-45e1-821e-0f0a702f27d3 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:06.587678153 +0000 UTC m=+49.669120699 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls") pod "dns-default-mfgns" (UID: "1026d702-16e3-45e1-821e-0f0a702f27d3") : secret "dns-default-metrics-tls" not found Apr 23 16:35:58.587947 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:35:58.587728 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert podName:ec4a3bc4-5872-47af-b0d2-34143e0f2dea nodeName:}" failed. 
No retries permitted until 2026-04-23 16:36:06.587720527 +0000 UTC m=+49.669163074 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert") pod "ingress-canary-bbhgp" (UID: "ec4a3bc4-5872-47af-b0d2-34143e0f2dea") : secret "canary-serving-cert" not found Apr 23 16:36:06.636198 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:06.636157 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert\") pod \"ingress-canary-bbhgp\" (UID: \"ec4a3bc4-5872-47af-b0d2-34143e0f2dea\") " pod="openshift-ingress-canary/ingress-canary-bbhgp" Apr 23 16:36:06.636720 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:06.636211 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls\") pod \"dns-default-mfgns\" (UID: \"1026d702-16e3-45e1-821e-0f0a702f27d3\") " pod="openshift-dns/dns-default-mfgns" Apr 23 16:36:06.636720 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:36:06.636310 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:36:06.636720 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:36:06.636316 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:36:06.636720 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:36:06.636362 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls podName:1026d702-16e3-45e1-821e-0f0a702f27d3 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:22.636347802 +0000 UTC m=+65.717790349 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls") pod "dns-default-mfgns" (UID: "1026d702-16e3-45e1-821e-0f0a702f27d3") : secret "dns-default-metrics-tls" not found Apr 23 16:36:06.636720 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:36:06.636393 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert podName:ec4a3bc4-5872-47af-b0d2-34143e0f2dea nodeName:}" failed. No retries permitted until 2026-04-23 16:36:22.636374619 +0000 UTC m=+65.717817186 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert") pod "ingress-canary-bbhgp" (UID: "ec4a3bc4-5872-47af-b0d2-34143e0f2dea") : secret "canary-serving-cert" not found Apr 23 16:36:16.687035 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:16.686986 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jbv88" Apr 23 16:36:22.645659 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:22.645616 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert\") pod \"ingress-canary-bbhgp\" (UID: \"ec4a3bc4-5872-47af-b0d2-34143e0f2dea\") " pod="openshift-ingress-canary/ingress-canary-bbhgp" Apr 23 16:36:22.646056 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:22.645676 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls\") pod \"dns-default-mfgns\" (UID: \"1026d702-16e3-45e1-821e-0f0a702f27d3\") " pod="openshift-dns/dns-default-mfgns" Apr 23 16:36:22.646056 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:36:22.645784 2571 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:36:22.646056 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:36:22.645786 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:36:22.646056 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:36:22.645843 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls podName:1026d702-16e3-45e1-821e-0f0a702f27d3 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:54.645830573 +0000 UTC m=+97.727273119 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls") pod "dns-default-mfgns" (UID: "1026d702-16e3-45e1-821e-0f0a702f27d3") : secret "dns-default-metrics-tls" not found Apr 23 16:36:22.646056 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:36:22.645857 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert podName:ec4a3bc4-5872-47af-b0d2-34143e0f2dea nodeName:}" failed. No retries permitted until 2026-04-23 16:36:54.645851515 +0000 UTC m=+97.727294062 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert") pod "ingress-canary-bbhgp" (UID: "ec4a3bc4-5872-47af-b0d2-34143e0f2dea") : secret "canary-serving-cert" not found Apr 23 16:36:23.149046 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:23.149003 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs\") pod \"network-metrics-daemon-hsxbc\" (UID: \"9f25a094-e342-4690-8028-f1a3ddd77829\") " pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:36:23.152097 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:23.152078 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 16:36:23.159763 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:36:23.159743 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 16:36:23.159831 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:36:23.159822 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs podName:9f25a094-e342-4690-8028-f1a3ddd77829 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:27.159805427 +0000 UTC m=+130.241247974 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs") pod "network-metrics-daemon-hsxbc" (UID: "9f25a094-e342-4690-8028-f1a3ddd77829") : secret "metrics-daemon-secret" not found Apr 23 16:36:23.249545 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:23.249509 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thqjv\" (UniqueName: \"kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv\") pod \"network-check-target-vrntx\" (UID: \"7c952840-e4f8-4b49-9f90-0d7aa2618091\") " pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:36:23.252445 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:23.252425 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 16:36:23.263168 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:23.263151 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 16:36:23.274632 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:23.274610 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thqjv\" (UniqueName: \"kubernetes.io/projected/7c952840-e4f8-4b49-9f90-0d7aa2618091-kube-api-access-thqjv\") pod \"network-check-target-vrntx\" (UID: \"7c952840-e4f8-4b49-9f90-0d7aa2618091\") " pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:36:23.324222 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:23.324193 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tpj7t\"" Apr 23 16:36:23.332002 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:23.331980 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:36:23.494602 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:23.494495 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vrntx"] Apr 23 16:36:23.497183 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:36:23.497153 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c952840_e4f8_4b49_9f90_0d7aa2618091.slice/crio-637866fd4b803bea459074aed82b6ec41731d7f676a600cde28a04f653e5b76f WatchSource:0}: Error finding container 637866fd4b803bea459074aed82b6ec41731d7f676a600cde28a04f653e5b76f: Status 404 returned error can't find the container with id 637866fd4b803bea459074aed82b6ec41731d7f676a600cde28a04f653e5b76f Apr 23 16:36:23.757155 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:23.757121 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vrntx" event={"ID":"7c952840-e4f8-4b49-9f90-0d7aa2618091","Type":"ContainerStarted","Data":"637866fd4b803bea459074aed82b6ec41731d7f676a600cde28a04f653e5b76f"} Apr 23 16:36:26.764508 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:26.764474 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vrntx" event={"ID":"7c952840-e4f8-4b49-9f90-0d7aa2618091","Type":"ContainerStarted","Data":"ae9629714414ac04a990db556b902e81f18a9d1768cbb8d5e57069e4830b6cc3"} Apr 23 16:36:26.764887 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:26.764613 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:36:26.781875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:26.781820 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-vrntx" 
podStartSLOduration=67.079848958 podStartE2EDuration="1m9.781804468s" podCreationTimestamp="2026-04-23 16:35:17 +0000 UTC" firstStartedPulling="2026-04-23 16:36:23.498951618 +0000 UTC m=+66.580394178" lastFinishedPulling="2026-04-23 16:36:26.20090714 +0000 UTC m=+69.282349688" observedRunningTime="2026-04-23 16:36:26.781358431 +0000 UTC m=+69.862801000" watchObservedRunningTime="2026-04-23 16:36:26.781804468 +0000 UTC m=+69.863247034" Apr 23 16:36:54.667982 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:54.667937 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert\") pod \"ingress-canary-bbhgp\" (UID: \"ec4a3bc4-5872-47af-b0d2-34143e0f2dea\") " pod="openshift-ingress-canary/ingress-canary-bbhgp" Apr 23 16:36:54.668435 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:54.667996 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls\") pod \"dns-default-mfgns\" (UID: \"1026d702-16e3-45e1-821e-0f0a702f27d3\") " pod="openshift-dns/dns-default-mfgns" Apr 23 16:36:54.668435 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:36:54.668079 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:36:54.668435 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:36:54.668104 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:36:54.668435 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:36:54.668169 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls podName:1026d702-16e3-45e1-821e-0f0a702f27d3 nodeName:}" failed. 
No retries permitted until 2026-04-23 16:37:58.668152844 +0000 UTC m=+161.749595396 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls") pod "dns-default-mfgns" (UID: "1026d702-16e3-45e1-821e-0f0a702f27d3") : secret "dns-default-metrics-tls" not found Apr 23 16:36:54.668435 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:36:54.668184 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert podName:ec4a3bc4-5872-47af-b0d2-34143e0f2dea nodeName:}" failed. No retries permitted until 2026-04-23 16:37:58.66817703 +0000 UTC m=+161.749619576 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert") pod "ingress-canary-bbhgp" (UID: "ec4a3bc4-5872-47af-b0d2-34143e0f2dea") : secret "canary-serving-cert" not found Apr 23 16:36:57.768819 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:36:57.768790 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-vrntx" Apr 23 16:37:24.495309 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.495274 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5b5c5f8b9d-dcw7k"] Apr 23 16:37:24.497731 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.497715 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k" Apr 23 16:37:24.501180 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.501155 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 23 16:37:24.501180 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.501155 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 23 16:37:24.501409 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.501393 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 23 16:37:24.501542 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.501526 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 23 16:37:24.501618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.501562 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 23 16:37:24.502334 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.502319 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 23 16:37:24.502434 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.502416 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-wbq2m\"" Apr 23 16:37:24.512520 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.512497 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5b5c5f8b9d-dcw7k"] Apr 23 16:37:24.582297 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.582260 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:24.582445 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.582334 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrcqp\" (UniqueName: \"kubernetes.io/projected/9a0573ec-1640-4a5f-a88d-42fbaeb67495-kube-api-access-vrcqp\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:24.582445 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.582366 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-stats-auth\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:24.582445 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.582402 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:24.582445 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.582422 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-default-certificate\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:24.629072 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.629032 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v4lvm"]
Apr 23 16:37:24.632424 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.632401 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v4lvm"
Apr 23 16:37:24.633079 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.633060 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g"]
Apr 23 16:37:24.635298 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.635279 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:37:24.635414 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.635302 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g"
Apr 23 16:37:24.635414 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.635364 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 23 16:37:24.639194 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.639172 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 16:37:24.639281 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.639202 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 23 16:37:24.639281 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.639261 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-9l7w4\""
Apr 23 16:37:24.639401 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.639291 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 23 16:37:24.639401 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.639303 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 16:37:24.639401 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.639357 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-6lmgc\""
Apr 23 16:37:24.655545 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.655526 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-d97rf"]
Apr 23 16:37:24.658062 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.658044 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v4lvm"]
Apr 23 16:37:24.658139 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.658066 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g"]
Apr 23 16:37:24.658186 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.658148 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-d97rf"
Apr 23 16:37:24.664948 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.664931 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 23 16:37:24.665113 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.665097 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 23 16:37:24.665864 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.665844 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 23 16:37:24.666437 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.666420 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-pqbrs\""
Apr 23 16:37:24.667295 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.667279 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 23 16:37:24.681086 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.681061 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-d97rf"]
Apr 23 16:37:24.682911 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.682883 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:24.683035 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.682920 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-default-certificate\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:24.683035 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.682947 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:24.683035 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.683019 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrcqp\" (UniqueName: \"kubernetes.io/projected/9a0573ec-1640-4a5f-a88d-42fbaeb67495-kube-api-access-vrcqp\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:24.683187 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.683058 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-stats-auth\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:24.683731 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:24.683591 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 16:37:24.683731 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:24.683659 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs podName:9a0573ec-1640-4a5f-a88d-42fbaeb67495 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:25.1836386 +0000 UTC m=+128.265081148 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs") pod "router-default-5b5c5f8b9d-dcw7k" (UID: "9a0573ec-1640-4a5f-a88d-42fbaeb67495") : secret "router-metrics-certs-default" not found
Apr 23 16:37:24.683892 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.683595 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 23 16:37:24.683947 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:24.683896 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle podName:9a0573ec-1640-4a5f-a88d-42fbaeb67495 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:25.183859853 +0000 UTC m=+128.265302413 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle") pod "router-default-5b5c5f8b9d-dcw7k" (UID: "9a0573ec-1640-4a5f-a88d-42fbaeb67495") : configmap references non-existent config key: service-ca.crt
Apr 23 16:37:24.685739 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.685679 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-stats-auth\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:24.685867 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.685851 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-default-certificate\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:24.700296 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.700275 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrcqp\" (UniqueName: \"kubernetes.io/projected/9a0573ec-1640-4a5f-a88d-42fbaeb67495-kube-api-access-vrcqp\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:24.784319 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.784226 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b909c180-8bf1-44a4-8a87-c6d4756c787a-snapshots\") pod \"insights-operator-585dfdc468-d97rf\" (UID: \"b909c180-8bf1-44a4-8a87-c6d4756c787a\") " pod="openshift-insights/insights-operator-585dfdc468-d97rf"
Apr 23 16:37:24.784319 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.784281 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq5lt\" (UniqueName: \"kubernetes.io/projected/4984906f-2911-43eb-982b-401a7c9fbc32-kube-api-access-cq5lt\") pod \"cluster-monitoring-operator-75587bd455-gmq6g\" (UID: \"4984906f-2911-43eb-982b-401a7c9fbc32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g"
Apr 23 16:37:24.784481 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.784329 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b909c180-8bf1-44a4-8a87-c6d4756c787a-tmp\") pod \"insights-operator-585dfdc468-d97rf\" (UID: \"b909c180-8bf1-44a4-8a87-c6d4756c787a\") " pod="openshift-insights/insights-operator-585dfdc468-d97rf"
Apr 23 16:37:24.784481 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.784389 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b909c180-8bf1-44a4-8a87-c6d4756c787a-serving-cert\") pod \"insights-operator-585dfdc468-d97rf\" (UID: \"b909c180-8bf1-44a4-8a87-c6d4756c787a\") " pod="openshift-insights/insights-operator-585dfdc468-d97rf"
Apr 23 16:37:24.784481 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.784420 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-gmq6g\" (UID: \"4984906f-2911-43eb-982b-401a7c9fbc32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g"
Apr 23 16:37:24.784576 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.784484 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4984906f-2911-43eb-982b-401a7c9fbc32-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-gmq6g\" (UID: \"4984906f-2911-43eb-982b-401a7c9fbc32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g"
Apr 23 16:37:24.784576 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.784527 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5znk\" (UniqueName: \"kubernetes.io/projected/d2276480-4d87-4114-b668-458483b817d5-kube-api-access-h5znk\") pod \"volume-data-source-validator-7c6cbb6c87-v4lvm\" (UID: \"d2276480-4d87-4114-b668-458483b817d5\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v4lvm"
Apr 23 16:37:24.784576 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.784544 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b909c180-8bf1-44a4-8a87-c6d4756c787a-service-ca-bundle\") pod \"insights-operator-585dfdc468-d97rf\" (UID: \"b909c180-8bf1-44a4-8a87-c6d4756c787a\") " pod="openshift-insights/insights-operator-585dfdc468-d97rf"
Apr 23 16:37:24.784576 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.784559 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b909c180-8bf1-44a4-8a87-c6d4756c787a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-d97rf\" (UID: \"b909c180-8bf1-44a4-8a87-c6d4756c787a\") " pod="openshift-insights/insights-operator-585dfdc468-d97rf"
Apr 23 16:37:24.784576 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.784575 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txx5c\" (UniqueName: \"kubernetes.io/projected/b909c180-8bf1-44a4-8a87-c6d4756c787a-kube-api-access-txx5c\") pod \"insights-operator-585dfdc468-d97rf\" (UID: \"b909c180-8bf1-44a4-8a87-c6d4756c787a\") " pod="openshift-insights/insights-operator-585dfdc468-d97rf"
Apr 23 16:37:24.885176 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.885148 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4984906f-2911-43eb-982b-401a7c9fbc32-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-gmq6g\" (UID: \"4984906f-2911-43eb-982b-401a7c9fbc32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g"
Apr 23 16:37:24.885328 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.885187 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5znk\" (UniqueName: \"kubernetes.io/projected/d2276480-4d87-4114-b668-458483b817d5-kube-api-access-h5znk\") pod \"volume-data-source-validator-7c6cbb6c87-v4lvm\" (UID: \"d2276480-4d87-4114-b668-458483b817d5\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v4lvm"
Apr 23 16:37:24.885328 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.885204 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b909c180-8bf1-44a4-8a87-c6d4756c787a-service-ca-bundle\") pod \"insights-operator-585dfdc468-d97rf\" (UID: \"b909c180-8bf1-44a4-8a87-c6d4756c787a\") " pod="openshift-insights/insights-operator-585dfdc468-d97rf"
Apr 23 16:37:24.885328 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.885221 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b909c180-8bf1-44a4-8a87-c6d4756c787a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-d97rf\" (UID: \"b909c180-8bf1-44a4-8a87-c6d4756c787a\") " pod="openshift-insights/insights-operator-585dfdc468-d97rf"
Apr 23 16:37:24.885328 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.885237 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-txx5c\" (UniqueName: \"kubernetes.io/projected/b909c180-8bf1-44a4-8a87-c6d4756c787a-kube-api-access-txx5c\") pod \"insights-operator-585dfdc468-d97rf\" (UID: \"b909c180-8bf1-44a4-8a87-c6d4756c787a\") " pod="openshift-insights/insights-operator-585dfdc468-d97rf"
Apr 23 16:37:24.885539 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.885402 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b909c180-8bf1-44a4-8a87-c6d4756c787a-snapshots\") pod \"insights-operator-585dfdc468-d97rf\" (UID: \"b909c180-8bf1-44a4-8a87-c6d4756c787a\") " pod="openshift-insights/insights-operator-585dfdc468-d97rf"
Apr 23 16:37:24.885539 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.885475 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cq5lt\" (UniqueName: \"kubernetes.io/projected/4984906f-2911-43eb-982b-401a7c9fbc32-kube-api-access-cq5lt\") pod \"cluster-monitoring-operator-75587bd455-gmq6g\" (UID: \"4984906f-2911-43eb-982b-401a7c9fbc32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g"
Apr 23 16:37:24.885539 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.885505 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b909c180-8bf1-44a4-8a87-c6d4756c787a-tmp\") pod \"insights-operator-585dfdc468-d97rf\" (UID: \"b909c180-8bf1-44a4-8a87-c6d4756c787a\") " pod="openshift-insights/insights-operator-585dfdc468-d97rf"
Apr 23 16:37:24.885684 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.885539 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b909c180-8bf1-44a4-8a87-c6d4756c787a-serving-cert\") pod \"insights-operator-585dfdc468-d97rf\" (UID: \"b909c180-8bf1-44a4-8a87-c6d4756c787a\") " pod="openshift-insights/insights-operator-585dfdc468-d97rf"
Apr 23 16:37:24.885684 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.885579 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-gmq6g\" (UID: \"4984906f-2911-43eb-982b-401a7c9fbc32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g"
Apr 23 16:37:24.885829 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:24.885730 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 16:37:24.885829 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:24.885805 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls podName:4984906f-2911-43eb-982b-401a7c9fbc32 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:25.385786022 +0000 UTC m=+128.467228576 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-gmq6g" (UID: "4984906f-2911-43eb-982b-401a7c9fbc32") : secret "cluster-monitoring-operator-tls" not found
Apr 23 16:37:24.885964 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.885947 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b909c180-8bf1-44a4-8a87-c6d4756c787a-service-ca-bundle\") pod \"insights-operator-585dfdc468-d97rf\" (UID: \"b909c180-8bf1-44a4-8a87-c6d4756c787a\") " pod="openshift-insights/insights-operator-585dfdc468-d97rf"
Apr 23 16:37:24.886021 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.885986 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b909c180-8bf1-44a4-8a87-c6d4756c787a-tmp\") pod \"insights-operator-585dfdc468-d97rf\" (UID: \"b909c180-8bf1-44a4-8a87-c6d4756c787a\") " pod="openshift-insights/insights-operator-585dfdc468-d97rf"
Apr 23 16:37:24.886072 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.886053 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b909c180-8bf1-44a4-8a87-c6d4756c787a-snapshots\") pod \"insights-operator-585dfdc468-d97rf\" (UID: \"b909c180-8bf1-44a4-8a87-c6d4756c787a\") " pod="openshift-insights/insights-operator-585dfdc468-d97rf"
Apr 23 16:37:24.886122 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.886090 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4984906f-2911-43eb-982b-401a7c9fbc32-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-gmq6g\" (UID: \"4984906f-2911-43eb-982b-401a7c9fbc32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g"
Apr 23 16:37:24.886417 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.886399 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b909c180-8bf1-44a4-8a87-c6d4756c787a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-d97rf\" (UID: \"b909c180-8bf1-44a4-8a87-c6d4756c787a\") " pod="openshift-insights/insights-operator-585dfdc468-d97rf"
Apr 23 16:37:24.888219 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.888198 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b909c180-8bf1-44a4-8a87-c6d4756c787a-serving-cert\") pod \"insights-operator-585dfdc468-d97rf\" (UID: \"b909c180-8bf1-44a4-8a87-c6d4756c787a\") " pod="openshift-insights/insights-operator-585dfdc468-d97rf"
Apr 23 16:37:24.895120 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.895100 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-txx5c\" (UniqueName: \"kubernetes.io/projected/b909c180-8bf1-44a4-8a87-c6d4756c787a-kube-api-access-txx5c\") pod \"insights-operator-585dfdc468-d97rf\" (UID: \"b909c180-8bf1-44a4-8a87-c6d4756c787a\") " pod="openshift-insights/insights-operator-585dfdc468-d97rf"
Apr 23 16:37:24.895458 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.895433 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq5lt\" (UniqueName: \"kubernetes.io/projected/4984906f-2911-43eb-982b-401a7c9fbc32-kube-api-access-cq5lt\") pod \"cluster-monitoring-operator-75587bd455-gmq6g\" (UID: \"4984906f-2911-43eb-982b-401a7c9fbc32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g"
Apr 23 16:37:24.896300 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.896273 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5znk\" (UniqueName: \"kubernetes.io/projected/d2276480-4d87-4114-b668-458483b817d5-kube-api-access-h5znk\") pod \"volume-data-source-validator-7c6cbb6c87-v4lvm\" (UID: \"d2276480-4d87-4114-b668-458483b817d5\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v4lvm"
Apr 23 16:37:24.942209 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.942172 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v4lvm"
Apr 23 16:37:24.966956 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:24.966930 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-d97rf"
Apr 23 16:37:25.061678 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:25.061650 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v4lvm"]
Apr 23 16:37:25.064605 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:37:25.064571 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2276480_4d87_4114_b668_458483b817d5.slice/crio-02adce17aca3b84ec42ac45164cd716f581caf0cb4f1e03d0f7051cd99fa0662 WatchSource:0}: Error finding container 02adce17aca3b84ec42ac45164cd716f581caf0cb4f1e03d0f7051cd99fa0662: Status 404 returned error can't find the container with id 02adce17aca3b84ec42ac45164cd716f581caf0cb4f1e03d0f7051cd99fa0662
Apr 23 16:37:25.086025 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:25.086001 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-d97rf"]
Apr 23 16:37:25.088254 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:37:25.088217 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb909c180_8bf1_44a4_8a87_c6d4756c787a.slice/crio-6f1c29a0b9e769fa180ae2e8ff456ddf5aeaac3e975ecd4f410523ad7b123e06 WatchSource:0}: Error finding container 6f1c29a0b9e769fa180ae2e8ff456ddf5aeaac3e975ecd4f410523ad7b123e06: Status 404 returned error can't find the container with id 6f1c29a0b9e769fa180ae2e8ff456ddf5aeaac3e975ecd4f410523ad7b123e06
Apr 23 16:37:25.187936 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:25.187911 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:25.188043 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:25.187942 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:25.188094 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:25.188059 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle podName:9a0573ec-1640-4a5f-a88d-42fbaeb67495 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:26.188044375 +0000 UTC m=+129.269486922 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle") pod "router-default-5b5c5f8b9d-dcw7k" (UID: "9a0573ec-1640-4a5f-a88d-42fbaeb67495") : configmap references non-existent config key: service-ca.crt
Apr 23 16:37:25.188094 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:25.188065 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 16:37:25.188167 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:25.188119 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs podName:9a0573ec-1640-4a5f-a88d-42fbaeb67495 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:26.18810158 +0000 UTC m=+129.269544144 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs") pod "router-default-5b5c5f8b9d-dcw7k" (UID: "9a0573ec-1640-4a5f-a88d-42fbaeb67495") : secret "router-metrics-certs-default" not found
Apr 23 16:37:25.389846 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:25.389767 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-gmq6g\" (UID: \"4984906f-2911-43eb-982b-401a7c9fbc32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g"
Apr 23 16:37:25.389976 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:25.389870 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 16:37:25.389976 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:25.389925 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls podName:4984906f-2911-43eb-982b-401a7c9fbc32 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:26.389910976 +0000 UTC m=+129.471353523 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-gmq6g" (UID: "4984906f-2911-43eb-982b-401a7c9fbc32") : secret "cluster-monitoring-operator-tls" not found
Apr 23 16:37:25.872168 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:25.872127 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-d97rf" event={"ID":"b909c180-8bf1-44a4-8a87-c6d4756c787a","Type":"ContainerStarted","Data":"6f1c29a0b9e769fa180ae2e8ff456ddf5aeaac3e975ecd4f410523ad7b123e06"}
Apr 23 16:37:25.873308 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:25.873276 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v4lvm" event={"ID":"d2276480-4d87-4114-b668-458483b817d5","Type":"ContainerStarted","Data":"02adce17aca3b84ec42ac45164cd716f581caf0cb4f1e03d0f7051cd99fa0662"}
Apr 23 16:37:26.196410 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:26.196361 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:26.196572 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:26.196424 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:26.196572 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:26.196511 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 16:37:26.196689 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:26.196588 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle podName:9a0573ec-1640-4a5f-a88d-42fbaeb67495 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:28.196569715 +0000 UTC m=+131.278012269 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle") pod "router-default-5b5c5f8b9d-dcw7k" (UID: "9a0573ec-1640-4a5f-a88d-42fbaeb67495") : configmap references non-existent config key: service-ca.crt
Apr 23 16:37:26.196689 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:26.196610 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs podName:9a0573ec-1640-4a5f-a88d-42fbaeb67495 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:28.19659946 +0000 UTC m=+131.278042013 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs") pod "router-default-5b5c5f8b9d-dcw7k" (UID: "9a0573ec-1640-4a5f-a88d-42fbaeb67495") : secret "router-metrics-certs-default" not found
Apr 23 16:37:26.398384 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:26.398354 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-gmq6g\" (UID: \"4984906f-2911-43eb-982b-401a7c9fbc32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g"
Apr 23 16:37:26.398550 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:26.398529 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 16:37:26.398620 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:26.398605 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls podName:4984906f-2911-43eb-982b-401a7c9fbc32 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:28.398586053 +0000 UTC m=+131.480028613 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-gmq6g" (UID: "4984906f-2911-43eb-982b-401a7c9fbc32") : secret "cluster-monitoring-operator-tls" not found
Apr 23 16:37:26.876584 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:26.876506 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v4lvm" event={"ID":"d2276480-4d87-4114-b668-458483b817d5","Type":"ContainerStarted","Data":"8c413e1217d632d0049470fd377eb76e385657b481847ac35ebf183aab0a6610"}
Apr 23 16:37:26.894485 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:26.894125 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-v4lvm" podStartSLOduration=1.570144074 podStartE2EDuration="2.894104571s" podCreationTimestamp="2026-04-23 16:37:24 +0000 UTC" firstStartedPulling="2026-04-23 16:37:25.066341629 +0000 UTC m=+128.147784383" lastFinishedPulling="2026-04-23 16:37:26.390302319 +0000 UTC m=+129.471744880" observedRunningTime="2026-04-23 16:37:26.893416131 +0000 UTC m=+129.974858700" watchObservedRunningTime="2026-04-23 16:37:26.894104571 +0000 UTC m=+129.975547146"
Apr 23 16:37:27.205163 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:27.205128 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs\") pod \"network-metrics-daemon-hsxbc\" (UID: \"9f25a094-e342-4690-8028-f1a3ddd77829\") " pod="openshift-multus/network-metrics-daemon-hsxbc"
Apr 23 16:37:27.205305 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:27.205278 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 16:37:27.205361 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:27.205349 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs podName:9f25a094-e342-4690-8028-f1a3ddd77829 nodeName:}" failed. No retries permitted until 2026-04-23 16:39:29.205333581 +0000 UTC m=+252.286776128 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs") pod "network-metrics-daemon-hsxbc" (UID: "9f25a094-e342-4690-8028-f1a3ddd77829") : secret "metrics-daemon-secret" not found
Apr 23 16:37:27.879298 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:27.879264 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-d97rf" event={"ID":"b909c180-8bf1-44a4-8a87-c6d4756c787a","Type":"ContainerStarted","Data":"4fb4a87e1c0130a9d892556d3b87e758d813a5824e6b5fec5ec167f120c99ae3"}
Apr 23 16:37:27.897103 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:27.897060 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-d97rf" podStartSLOduration=2.007234581 podStartE2EDuration="3.897045776s" podCreationTimestamp="2026-04-23 16:37:24 +0000 UTC" firstStartedPulling="2026-04-23 16:37:25.090140732 +0000 UTC m=+128.171583282" lastFinishedPulling="2026-04-23 16:37:26.979951926 +0000 UTC m=+130.061394477" observedRunningTime="2026-04-23 16:37:27.89650759 +0000 UTC m=+130.977950159" watchObservedRunningTime="2026-04-23 16:37:27.897045776 +0000 UTC m=+130.978488345"
Apr 23 16:37:28.212366 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:28.212329 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: 
\"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k" Apr 23 16:37:28.212498 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:28.212383 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k" Apr 23 16:37:28.212498 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:28.212464 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 16:37:28.212577 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:28.212540 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle podName:9a0573ec-1640-4a5f-a88d-42fbaeb67495 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:32.212524516 +0000 UTC m=+135.293967067 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle") pod "router-default-5b5c5f8b9d-dcw7k" (UID: "9a0573ec-1640-4a5f-a88d-42fbaeb67495") : configmap references non-existent config key: service-ca.crt Apr 23 16:37:28.212577 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:28.212567 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs podName:9a0573ec-1640-4a5f-a88d-42fbaeb67495 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:32.212558682 +0000 UTC m=+135.294001229 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs") pod "router-default-5b5c5f8b9d-dcw7k" (UID: "9a0573ec-1640-4a5f-a88d-42fbaeb67495") : secret "router-metrics-certs-default" not found Apr 23 16:37:28.413642 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:28.413606 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-gmq6g\" (UID: \"4984906f-2911-43eb-982b-401a7c9fbc32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g" Apr 23 16:37:28.413806 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:28.413770 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 16:37:28.413845 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:28.413833 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls podName:4984906f-2911-43eb-982b-401a7c9fbc32 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:32.413818387 +0000 UTC m=+135.495260934 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-gmq6g" (UID: "4984906f-2911-43eb-982b-401a7c9fbc32") : secret "cluster-monitoring-operator-tls" not found Apr 23 16:37:30.589981 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:30.589952 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6stz5_33e0f7f2-93d8-459f-9c61-240a8cdad803/dns-node-resolver/0.log" Apr 23 16:37:31.390401 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:31.390371 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-d2k5z_7174271c-a85b-4c6d-872b-f2b384da443b/node-ca/0.log" Apr 23 16:37:32.240534 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:32.240508 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k" Apr 23 16:37:32.240913 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:32.240539 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k" Apr 23 16:37:32.240913 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:32.240656 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 16:37:32.240913 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:32.240691 2571 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle podName:9a0573ec-1640-4a5f-a88d-42fbaeb67495 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:40.240675294 +0000 UTC m=+143.322117844 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle") pod "router-default-5b5c5f8b9d-dcw7k" (UID: "9a0573ec-1640-4a5f-a88d-42fbaeb67495") : configmap references non-existent config key: service-ca.crt Apr 23 16:37:32.240913 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:32.240735 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs podName:9a0573ec-1640-4a5f-a88d-42fbaeb67495 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:40.240719764 +0000 UTC m=+143.322162315 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs") pod "router-default-5b5c5f8b9d-dcw7k" (UID: "9a0573ec-1640-4a5f-a88d-42fbaeb67495") : secret "router-metrics-certs-default" not found Apr 23 16:37:32.441544 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:32.441497 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-gmq6g\" (UID: \"4984906f-2911-43eb-982b-401a7c9fbc32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g" Apr 23 16:37:32.441729 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:32.441642 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 16:37:32.441729 ip-10-0-136-27 
kubenswrapper[2571]: E0423 16:37:32.441728 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls podName:4984906f-2911-43eb-982b-401a7c9fbc32 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:40.441690029 +0000 UTC m=+143.523132577 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-gmq6g" (UID: "4984906f-2911-43eb-982b-401a7c9fbc32") : secret "cluster-monitoring-operator-tls" not found Apr 23 16:37:34.511769 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.511731 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8sjkd"] Apr 23 16:37:34.514412 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.514397 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" Apr 23 16:37:34.517457 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.517430 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 23 16:37:34.517580 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.517480 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-tmvxl\"" Apr 23 16:37:34.518514 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.518492 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 23 16:37:34.518597 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.518574 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 23 16:37:34.518989 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.518976 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 23 16:37:34.526921 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.524140 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 23 16:37:34.528273 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.528249 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8sjkd"] Apr 23 16:37:34.658689 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.658653 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/036b63b0-d570-44cc-b606-bb46f38e6753-config\") pod \"console-operator-9d4b6777b-8sjkd\" (UID: \"036b63b0-d570-44cc-b606-bb46f38e6753\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" Apr 23 16:37:34.658855 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.658722 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/036b63b0-d570-44cc-b606-bb46f38e6753-serving-cert\") pod \"console-operator-9d4b6777b-8sjkd\" (UID: \"036b63b0-d570-44cc-b606-bb46f38e6753\") " pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" Apr 23 16:37:34.658855 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.658746 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/036b63b0-d570-44cc-b606-bb46f38e6753-trusted-ca\") pod \"console-operator-9d4b6777b-8sjkd\" (UID: \"036b63b0-d570-44cc-b606-bb46f38e6753\") " pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" Apr 23 16:37:34.658855 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.658774 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k4sg\" (UniqueName: \"kubernetes.io/projected/036b63b0-d570-44cc-b606-bb46f38e6753-kube-api-access-2k4sg\") pod \"console-operator-9d4b6777b-8sjkd\" (UID: \"036b63b0-d570-44cc-b606-bb46f38e6753\") " pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" Apr 23 16:37:34.668296 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.668273 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fshnr"] Apr 23 16:37:34.671257 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.671241 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fshnr" Apr 23 16:37:34.675408 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.675385 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-krfls\"" Apr 23 16:37:34.675509 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.675482 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 23 16:37:34.675569 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.675512 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 23 16:37:34.676058 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.676038 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 23 16:37:34.706094 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.706069 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fshnr"] Apr 23 16:37:34.760156 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.760129 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/036b63b0-d570-44cc-b606-bb46f38e6753-trusted-ca\") pod \"console-operator-9d4b6777b-8sjkd\" (UID: \"036b63b0-d570-44cc-b606-bb46f38e6753\") " pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" Apr 23 16:37:34.760297 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.760172 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2k4sg\" (UniqueName: \"kubernetes.io/projected/036b63b0-d570-44cc-b606-bb46f38e6753-kube-api-access-2k4sg\") pod 
\"console-operator-9d4b6777b-8sjkd\" (UID: \"036b63b0-d570-44cc-b606-bb46f38e6753\") " pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" Apr 23 16:37:34.760297 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.760231 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/036b63b0-d570-44cc-b606-bb46f38e6753-config\") pod \"console-operator-9d4b6777b-8sjkd\" (UID: \"036b63b0-d570-44cc-b606-bb46f38e6753\") " pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" Apr 23 16:37:34.760297 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.760262 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/036b63b0-d570-44cc-b606-bb46f38e6753-serving-cert\") pod \"console-operator-9d4b6777b-8sjkd\" (UID: \"036b63b0-d570-44cc-b606-bb46f38e6753\") " pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" Apr 23 16:37:34.760927 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.760906 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/036b63b0-d570-44cc-b606-bb46f38e6753-config\") pod \"console-operator-9d4b6777b-8sjkd\" (UID: \"036b63b0-d570-44cc-b606-bb46f38e6753\") " pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" Apr 23 16:37:34.761005 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.760929 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/036b63b0-d570-44cc-b606-bb46f38e6753-trusted-ca\") pod \"console-operator-9d4b6777b-8sjkd\" (UID: \"036b63b0-d570-44cc-b606-bb46f38e6753\") " pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" Apr 23 16:37:34.762604 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.762556 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/036b63b0-d570-44cc-b606-bb46f38e6753-serving-cert\") pod \"console-operator-9d4b6777b-8sjkd\" (UID: \"036b63b0-d570-44cc-b606-bb46f38e6753\") " pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" Apr 23 16:37:34.769769 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.769744 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k4sg\" (UniqueName: \"kubernetes.io/projected/036b63b0-d570-44cc-b606-bb46f38e6753-kube-api-access-2k4sg\") pod \"console-operator-9d4b6777b-8sjkd\" (UID: \"036b63b0-d570-44cc-b606-bb46f38e6753\") " pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" Apr 23 16:37:34.822795 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.822760 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" Apr 23 16:37:34.860804 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.860774 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/85de9ac6-b9d1-4573-ac9f-a00c20771091-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fshnr\" (UID: \"85de9ac6-b9d1-4573-ac9f-a00c20771091\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fshnr" Apr 23 16:37:34.860924 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.860842 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wk88\" (UniqueName: \"kubernetes.io/projected/85de9ac6-b9d1-4573-ac9f-a00c20771091-kube-api-access-8wk88\") pod \"cluster-samples-operator-6dc5bdb6b4-fshnr\" (UID: \"85de9ac6-b9d1-4573-ac9f-a00c20771091\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fshnr" Apr 23 16:37:34.934013 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.933984 2571 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8sjkd"] Apr 23 16:37:34.936793 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:37:34.936765 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod036b63b0_d570_44cc_b606_bb46f38e6753.slice/crio-7b97b0e184427f4cd8cc4318416f4d6347102f41f0fb2919b44ad504839fb52b WatchSource:0}: Error finding container 7b97b0e184427f4cd8cc4318416f4d6347102f41f0fb2919b44ad504839fb52b: Status 404 returned error can't find the container with id 7b97b0e184427f4cd8cc4318416f4d6347102f41f0fb2919b44ad504839fb52b Apr 23 16:37:34.961638 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.961618 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/85de9ac6-b9d1-4573-ac9f-a00c20771091-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fshnr\" (UID: \"85de9ac6-b9d1-4573-ac9f-a00c20771091\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fshnr" Apr 23 16:37:34.961738 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.961679 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wk88\" (UniqueName: \"kubernetes.io/projected/85de9ac6-b9d1-4573-ac9f-a00c20771091-kube-api-access-8wk88\") pod \"cluster-samples-operator-6dc5bdb6b4-fshnr\" (UID: \"85de9ac6-b9d1-4573-ac9f-a00c20771091\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fshnr" Apr 23 16:37:34.961784 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:34.961767 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 16:37:34.961832 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:34.961822 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/85de9ac6-b9d1-4573-ac9f-a00c20771091-samples-operator-tls podName:85de9ac6-b9d1-4573-ac9f-a00c20771091 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:35.461807379 +0000 UTC m=+138.543249926 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/85de9ac6-b9d1-4573-ac9f-a00c20771091-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fshnr" (UID: "85de9ac6-b9d1-4573-ac9f-a00c20771091") : secret "samples-operator-tls" not found Apr 23 16:37:34.972089 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:34.972068 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wk88\" (UniqueName: \"kubernetes.io/projected/85de9ac6-b9d1-4573-ac9f-a00c20771091-kube-api-access-8wk88\") pod \"cluster-samples-operator-6dc5bdb6b4-fshnr\" (UID: \"85de9ac6-b9d1-4573-ac9f-a00c20771091\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fshnr" Apr 23 16:37:35.466268 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:35.466229 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/85de9ac6-b9d1-4573-ac9f-a00c20771091-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fshnr\" (UID: \"85de9ac6-b9d1-4573-ac9f-a00c20771091\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fshnr" Apr 23 16:37:35.466488 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:35.466466 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 16:37:35.466573 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:35.466562 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85de9ac6-b9d1-4573-ac9f-a00c20771091-samples-operator-tls podName:85de9ac6-b9d1-4573-ac9f-a00c20771091 nodeName:}" 
failed. No retries permitted until 2026-04-23 16:37:36.466541094 +0000 UTC m=+139.547983647 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/85de9ac6-b9d1-4573-ac9f-a00c20771091-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fshnr" (UID: "85de9ac6-b9d1-4573-ac9f-a00c20771091") : secret "samples-operator-tls" not found Apr 23 16:37:35.897357 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:35.897261 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" event={"ID":"036b63b0-d570-44cc-b606-bb46f38e6753","Type":"ContainerStarted","Data":"7b97b0e184427f4cd8cc4318416f4d6347102f41f0fb2919b44ad504839fb52b"} Apr 23 16:37:36.474819 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:36.474786 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/85de9ac6-b9d1-4573-ac9f-a00c20771091-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fshnr\" (UID: \"85de9ac6-b9d1-4573-ac9f-a00c20771091\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fshnr" Apr 23 16:37:36.474986 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:36.474962 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 16:37:36.475064 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:36.475051 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85de9ac6-b9d1-4573-ac9f-a00c20771091-samples-operator-tls podName:85de9ac6-b9d1-4573-ac9f-a00c20771091 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:38.475029907 +0000 UTC m=+141.556472455 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/85de9ac6-b9d1-4573-ac9f-a00c20771091-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fshnr" (UID: "85de9ac6-b9d1-4573-ac9f-a00c20771091") : secret "samples-operator-tls" not found Apr 23 16:37:36.901126 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:36.901051 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8sjkd_036b63b0-d570-44cc-b606-bb46f38e6753/console-operator/0.log" Apr 23 16:37:36.901126 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:36.901091 2571 generic.go:358] "Generic (PLEG): container finished" podID="036b63b0-d570-44cc-b606-bb46f38e6753" containerID="c30a30c89cf762a4b0465f7dfe2fe0fed88f1d70bd80837aaba86980aff20f85" exitCode=255 Apr 23 16:37:36.901556 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:36.901129 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" event={"ID":"036b63b0-d570-44cc-b606-bb46f38e6753","Type":"ContainerDied","Data":"c30a30c89cf762a4b0465f7dfe2fe0fed88f1d70bd80837aaba86980aff20f85"} Apr 23 16:37:36.901556 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:36.901350 2571 scope.go:117] "RemoveContainer" containerID="c30a30c89cf762a4b0465f7dfe2fe0fed88f1d70bd80837aaba86980aff20f85" Apr 23 16:37:37.904733 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:37.904689 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8sjkd_036b63b0-d570-44cc-b606-bb46f38e6753/console-operator/1.log" Apr 23 16:37:37.905105 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:37.905050 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8sjkd_036b63b0-d570-44cc-b606-bb46f38e6753/console-operator/0.log" Apr 23 16:37:37.905105 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:37.905083 
2571 generic.go:358] "Generic (PLEG): container finished" podID="036b63b0-d570-44cc-b606-bb46f38e6753" containerID="77d91fc06ecd1bc350feeceb439a991335cbd069617503ae86f55d0496d02993" exitCode=255 Apr 23 16:37:37.905180 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:37.905118 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" event={"ID":"036b63b0-d570-44cc-b606-bb46f38e6753","Type":"ContainerDied","Data":"77d91fc06ecd1bc350feeceb439a991335cbd069617503ae86f55d0496d02993"} Apr 23 16:37:37.905180 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:37.905165 2571 scope.go:117] "RemoveContainer" containerID="c30a30c89cf762a4b0465f7dfe2fe0fed88f1d70bd80837aaba86980aff20f85" Apr 23 16:37:37.905378 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:37.905363 2571 scope.go:117] "RemoveContainer" containerID="77d91fc06ecd1bc350feeceb439a991335cbd069617503ae86f55d0496d02993" Apr 23 16:37:37.905554 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:37.905537 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8sjkd_openshift-console-operator(036b63b0-d570-44cc-b606-bb46f38e6753)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" podUID="036b63b0-d570-44cc-b606-bb46f38e6753" Apr 23 16:37:38.490992 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:38.490953 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/85de9ac6-b9d1-4573-ac9f-a00c20771091-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fshnr\" (UID: \"85de9ac6-b9d1-4573-ac9f-a00c20771091\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fshnr" Apr 23 16:37:38.491191 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:38.491113 2571 
secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 16:37:38.491257 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:38.491208 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85de9ac6-b9d1-4573-ac9f-a00c20771091-samples-operator-tls podName:85de9ac6-b9d1-4573-ac9f-a00c20771091 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:42.491186743 +0000 UTC m=+145.572629291 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/85de9ac6-b9d1-4573-ac9f-a00c20771091-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fshnr" (UID: "85de9ac6-b9d1-4573-ac9f-a00c20771091") : secret "samples-operator-tls" not found
Apr 23 16:37:38.908141 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:38.908054 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8sjkd_036b63b0-d570-44cc-b606-bb46f38e6753/console-operator/1.log"
Apr 23 16:37:38.908472 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:38.908364 2571 scope.go:117] "RemoveContainer" containerID="77d91fc06ecd1bc350feeceb439a991335cbd069617503ae86f55d0496d02993"
Apr 23 16:37:38.908541 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:38.908524 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8sjkd_openshift-console-operator(036b63b0-d570-44cc-b606-bb46f38e6753)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" podUID="036b63b0-d570-44cc-b606-bb46f38e6753"
Apr 23 16:37:39.085756 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:39.085726 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zndmd"]
Apr 23 16:37:39.088883 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:39.088868 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zndmd"
Apr 23 16:37:39.091618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:39.091593 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-9vd7h\""
Apr 23 16:37:39.091872 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:39.091854 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 23 16:37:39.091938 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:39.091917 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 23 16:37:39.094005 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:39.093985 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpc9m\" (UniqueName: \"kubernetes.io/projected/b89977fa-d779-4ac7-93ec-ff738b252b10-kube-api-access-tpc9m\") pod \"migrator-74bb7799d9-zndmd\" (UID: \"b89977fa-d779-4ac7-93ec-ff738b252b10\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zndmd"
Apr 23 16:37:39.097401 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:39.097379 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zndmd"]
Apr 23 16:37:39.194319 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:39.194290 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpc9m\" (UniqueName: \"kubernetes.io/projected/b89977fa-d779-4ac7-93ec-ff738b252b10-kube-api-access-tpc9m\") pod \"migrator-74bb7799d9-zndmd\" (UID: \"b89977fa-d779-4ac7-93ec-ff738b252b10\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zndmd"
Apr 23 16:37:39.202737 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:39.202709 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpc9m\" (UniqueName: \"kubernetes.io/projected/b89977fa-d779-4ac7-93ec-ff738b252b10-kube-api-access-tpc9m\") pod \"migrator-74bb7799d9-zndmd\" (UID: \"b89977fa-d779-4ac7-93ec-ff738b252b10\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zndmd"
Apr 23 16:37:39.397937 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:39.397899 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zndmd"
Apr 23 16:37:39.512910 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:39.512884 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zndmd"]
Apr 23 16:37:39.515840 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:37:39.515813 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb89977fa_d779_4ac7_93ec_ff738b252b10.slice/crio-63bbaf51130e99d32eb10ee04ff9f5d03ad48c1fc85deb24a0fe162c50422fc7 WatchSource:0}: Error finding container 63bbaf51130e99d32eb10ee04ff9f5d03ad48c1fc85deb24a0fe162c50422fc7: Status 404 returned error can't find the container with id 63bbaf51130e99d32eb10ee04ff9f5d03ad48c1fc85deb24a0fe162c50422fc7
Apr 23 16:37:39.910895 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:39.910822 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zndmd" event={"ID":"b89977fa-d779-4ac7-93ec-ff738b252b10","Type":"ContainerStarted","Data":"63bbaf51130e99d32eb10ee04ff9f5d03ad48c1fc85deb24a0fe162c50422fc7"}
Apr 23 16:37:40.303590 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:40.303556 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:40.303590 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:40.303594 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:40.303797 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:40.303711 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 16:37:40.303797 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:40.303754 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle podName:9a0573ec-1640-4a5f-a88d-42fbaeb67495 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:56.303740725 +0000 UTC m=+159.385183272 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle") pod "router-default-5b5c5f8b9d-dcw7k" (UID: "9a0573ec-1640-4a5f-a88d-42fbaeb67495") : configmap references non-existent config key: service-ca.crt
Apr 23 16:37:40.303797 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:40.303768 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs podName:9a0573ec-1640-4a5f-a88d-42fbaeb67495 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:56.303762321 +0000 UTC m=+159.385204868 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs") pod "router-default-5b5c5f8b9d-dcw7k" (UID: "9a0573ec-1640-4a5f-a88d-42fbaeb67495") : secret "router-metrics-certs-default" not found
Apr 23 16:37:40.505292 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:40.505256 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-gmq6g\" (UID: \"4984906f-2911-43eb-982b-401a7c9fbc32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g"
Apr 23 16:37:40.505471 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:40.505411 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 16:37:40.505541 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:40.505481 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls podName:4984906f-2911-43eb-982b-401a7c9fbc32 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:56.505465439 +0000 UTC m=+159.586907986 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-gmq6g" (UID: "4984906f-2911-43eb-982b-401a7c9fbc32") : secret "cluster-monitoring-operator-tls" not found
Apr 23 16:37:40.914894 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:40.914818 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zndmd" event={"ID":"b89977fa-d779-4ac7-93ec-ff738b252b10","Type":"ContainerStarted","Data":"291533ce0e67e48103a28666b98204040e30665da0a6c454f4a9305368dbd4b2"}
Apr 23 16:37:40.914894 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:40.914856 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zndmd" event={"ID":"b89977fa-d779-4ac7-93ec-ff738b252b10","Type":"ContainerStarted","Data":"07f39a3c64c2aeeccac43c195a8d0ff93f071574d910b5f1ef068311e95f2dab"}
Apr 23 16:37:40.933630 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:40.933591 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zndmd" podStartSLOduration=0.833508668 podStartE2EDuration="1.93357601s" podCreationTimestamp="2026-04-23 16:37:39 +0000 UTC" firstStartedPulling="2026-04-23 16:37:39.517598295 +0000 UTC m=+142.599040841" lastFinishedPulling="2026-04-23 16:37:40.617665629 +0000 UTC m=+143.699108183" observedRunningTime="2026-04-23 16:37:40.932496401 +0000 UTC m=+144.013938988" watchObservedRunningTime="2026-04-23 16:37:40.93357601 +0000 UTC m=+144.015018578"
Apr 23 16:37:42.520861 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:42.520809 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/85de9ac6-b9d1-4573-ac9f-a00c20771091-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fshnr\" (UID: \"85de9ac6-b9d1-4573-ac9f-a00c20771091\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fshnr"
Apr 23 16:37:42.521278 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:42.520950 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 16:37:42.521278 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:42.521017 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85de9ac6-b9d1-4573-ac9f-a00c20771091-samples-operator-tls podName:85de9ac6-b9d1-4573-ac9f-a00c20771091 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:50.521001495 +0000 UTC m=+153.602444041 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/85de9ac6-b9d1-4573-ac9f-a00c20771091-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fshnr" (UID: "85de9ac6-b9d1-4573-ac9f-a00c20771091") : secret "samples-operator-tls" not found
Apr 23 16:37:44.823386 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:44.823351 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd"
Apr 23 16:37:44.823386 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:44.823389 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd"
Apr 23 16:37:44.823801 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:44.823788 2571 scope.go:117] "RemoveContainer" containerID="77d91fc06ecd1bc350feeceb439a991335cbd069617503ae86f55d0496d02993"
Apr 23 16:37:44.823968 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:44.823950 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8sjkd_openshift-console-operator(036b63b0-d570-44cc-b606-bb46f38e6753)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" podUID="036b63b0-d570-44cc-b606-bb46f38e6753"
Apr 23 16:37:50.584296 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:50.584255 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/85de9ac6-b9d1-4573-ac9f-a00c20771091-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fshnr\" (UID: \"85de9ac6-b9d1-4573-ac9f-a00c20771091\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fshnr"
Apr 23 16:37:50.586858 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:50.586833 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/85de9ac6-b9d1-4573-ac9f-a00c20771091-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fshnr\" (UID: \"85de9ac6-b9d1-4573-ac9f-a00c20771091\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fshnr"
Apr 23 16:37:50.880520 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:50.880414 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fshnr"
Apr 23 16:37:50.996314 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:50.996286 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fshnr"]
Apr 23 16:37:51.942739 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:51.942677 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fshnr" event={"ID":"85de9ac6-b9d1-4573-ac9f-a00c20771091","Type":"ContainerStarted","Data":"39ab7f67ca6524726cb3dec1677f1425b52693b94acd69ba35972b4ccd42ea78"}
Apr 23 16:37:52.946919 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:52.946885 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fshnr" event={"ID":"85de9ac6-b9d1-4573-ac9f-a00c20771091","Type":"ContainerStarted","Data":"6710ffcf8040428db3451f3d4ee4604bd33bc890ec1f2e5760331bfb3df71cbf"}
Apr 23 16:37:52.946919 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:52.946920 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fshnr" event={"ID":"85de9ac6-b9d1-4573-ac9f-a00c20771091","Type":"ContainerStarted","Data":"8430056af8fd90a8b2c10e4a61b0a2d3a2c5a8edaed5e8b8055a9e38b11cd4e1"}
Apr 23 16:37:52.965537 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:52.965495 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fshnr" podStartSLOduration=17.337065401 podStartE2EDuration="18.965482745s" podCreationTimestamp="2026-04-23 16:37:34 +0000 UTC" firstStartedPulling="2026-04-23 16:37:51.038154147 +0000 UTC m=+154.119596695" lastFinishedPulling="2026-04-23 16:37:52.666571493 +0000 UTC m=+155.748014039" observedRunningTime="2026-04-23 16:37:52.965114653 +0000 UTC m=+156.046557224" watchObservedRunningTime="2026-04-23 16:37:52.965482745 +0000 UTC m=+156.046925314"
Apr 23 16:37:53.841483 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:53.841441 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-mfgns" podUID="1026d702-16e3-45e1-821e-0f0a702f27d3"
Apr 23 16:37:53.852735 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:53.852689 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-bbhgp" podUID="ec4a3bc4-5872-47af-b0d2-34143e0f2dea"
Apr 23 16:37:53.949291 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:53.949258 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bbhgp"
Apr 23 16:37:53.949670 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:53.949258 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mfgns"
Apr 23 16:37:54.530317 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:37:54.530274 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-hsxbc" podUID="9f25a094-e342-4690-8028-f1a3ddd77829"
Apr 23 16:37:56.332398 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:56.332352 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:56.332804 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:56.332494 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:56.333093 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:56.333073 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a0573ec-1640-4a5f-a88d-42fbaeb67495-service-ca-bundle\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:56.334998 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:56.334971 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a0573ec-1640-4a5f-a88d-42fbaeb67495-metrics-certs\") pod \"router-default-5b5c5f8b9d-dcw7k\" (UID: \"9a0573ec-1640-4a5f-a88d-42fbaeb67495\") " pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:56.534013 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:56.533963 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-gmq6g\" (UID: \"4984906f-2911-43eb-982b-401a7c9fbc32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g"
Apr 23 16:37:56.536379 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:56.536358 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4984906f-2911-43eb-982b-401a7c9fbc32-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-gmq6g\" (UID: \"4984906f-2911-43eb-982b-401a7c9fbc32\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g"
Apr 23 16:37:56.606378 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:56.606313 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:56.717975 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:56.717955 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5b5c5f8b9d-dcw7k"]
Apr 23 16:37:56.720283 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:37:56.720256 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a0573ec_1640_4a5f_a88d_42fbaeb67495.slice/crio-c65506d31756404e9c9af1d2af8cc30c98b6e20b4d530923cb00a95327f61238 WatchSource:0}: Error finding container c65506d31756404e9c9af1d2af8cc30c98b6e20b4d530923cb00a95327f61238: Status 404 returned error can't find the container with id c65506d31756404e9c9af1d2af8cc30c98b6e20b4d530923cb00a95327f61238
Apr 23 16:37:56.746472 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:56.746436 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g"
Apr 23 16:37:56.869878 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:56.869811 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g"]
Apr 23 16:37:56.872370 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:37:56.872344 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4984906f_2911_43eb_982b_401a7c9fbc32.slice/crio-da3ac489b7c69f50a06aeb32cba626357ff13f78c22dbfda0827e55b7ef9e2e4 WatchSource:0}: Error finding container da3ac489b7c69f50a06aeb32cba626357ff13f78c22dbfda0827e55b7ef9e2e4: Status 404 returned error can't find the container with id da3ac489b7c69f50a06aeb32cba626357ff13f78c22dbfda0827e55b7ef9e2e4
Apr 23 16:37:56.957902 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:56.957867 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k" event={"ID":"9a0573ec-1640-4a5f-a88d-42fbaeb67495","Type":"ContainerStarted","Data":"5848737492ec6ecc6ddf311e03f613e4eac4a920dd8bb28b880a2251bbbe1ed2"}
Apr 23 16:37:56.958078 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:56.957909 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k" event={"ID":"9a0573ec-1640-4a5f-a88d-42fbaeb67495","Type":"ContainerStarted","Data":"c65506d31756404e9c9af1d2af8cc30c98b6e20b4d530923cb00a95327f61238"}
Apr 23 16:37:56.958956 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:56.958930 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g" event={"ID":"4984906f-2911-43eb-982b-401a7c9fbc32","Type":"ContainerStarted","Data":"da3ac489b7c69f50a06aeb32cba626357ff13f78c22dbfda0827e55b7ef9e2e4"}
Apr 23 16:37:57.528663 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:57.528612 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k" podStartSLOduration=33.528596743 podStartE2EDuration="33.528596743s" podCreationTimestamp="2026-04-23 16:37:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:37:57.004365369 +0000 UTC m=+160.085807937" watchObservedRunningTime="2026-04-23 16:37:57.528596743 +0000 UTC m=+160.610039317"
Apr 23 16:37:57.607182 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:57.607146 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:57.609716 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:57.609672 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:57.968140 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:57.968102 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:57.969589 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:57.969564 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5b5c5f8b9d-dcw7k"
Apr 23 16:37:58.753109 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:58.753079 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert\") pod \"ingress-canary-bbhgp\" (UID: \"ec4a3bc4-5872-47af-b0d2-34143e0f2dea\") " pod="openshift-ingress-canary/ingress-canary-bbhgp"
Apr 23 16:37:58.753467 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:58.753121 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls\") pod \"dns-default-mfgns\" (UID: \"1026d702-16e3-45e1-821e-0f0a702f27d3\") " pod="openshift-dns/dns-default-mfgns"
Apr 23 16:37:58.755836 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:58.755817 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1026d702-16e3-45e1-821e-0f0a702f27d3-metrics-tls\") pod \"dns-default-mfgns\" (UID: \"1026d702-16e3-45e1-821e-0f0a702f27d3\") " pod="openshift-dns/dns-default-mfgns"
Apr 23 16:37:58.755942 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:58.755922 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec4a3bc4-5872-47af-b0d2-34143e0f2dea-cert\") pod \"ingress-canary-bbhgp\" (UID: \"ec4a3bc4-5872-47af-b0d2-34143e0f2dea\") " pod="openshift-ingress-canary/ingress-canary-bbhgp"
Apr 23 16:37:58.971200 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:58.971166 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g" event={"ID":"4984906f-2911-43eb-982b-401a7c9fbc32","Type":"ContainerStarted","Data":"fac2114486e8c92b590774008e60fd0da01b302e8f8db4968cbaaf892c61da75"}
Apr 23 16:37:58.992072 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:58.992023 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmq6g" podStartSLOduration=33.485045836 podStartE2EDuration="34.99200951s" podCreationTimestamp="2026-04-23 16:37:24 +0000 UTC" firstStartedPulling="2026-04-23 16:37:56.874192224 +0000 UTC m=+159.955634772" lastFinishedPulling="2026-04-23 16:37:58.3811559 +0000 UTC m=+161.462598446" observedRunningTime="2026-04-23 16:37:58.990492321 +0000 UTC m=+162.071934889" watchObservedRunningTime="2026-04-23 16:37:58.99200951 +0000 UTC m=+162.073452118"
Apr 23 16:37:59.052878 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:59.052801 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vvvrx\""
Apr 23 16:37:59.052878 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:59.052833 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l9v47\""
Apr 23 16:37:59.060070 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:59.060046 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bbhgp"
Apr 23 16:37:59.060183 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:59.060134 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mfgns"
Apr 23 16:37:59.205875 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:59.205852 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mfgns"]
Apr 23 16:37:59.207852 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:37:59.207819 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1026d702_16e3_45e1_821e_0f0a702f27d3.slice/crio-c915621ae92bc895e0ad8ee6957a435913e72618f74909ff31e3296badc99156 WatchSource:0}: Error finding container c915621ae92bc895e0ad8ee6957a435913e72618f74909ff31e3296badc99156: Status 404 returned error can't find the container with id c915621ae92bc895e0ad8ee6957a435913e72618f74909ff31e3296badc99156
Apr 23 16:37:59.223058 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:59.223034 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bbhgp"]
Apr 23 16:37:59.226756 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:37:59.226732 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec4a3bc4_5872_47af_b0d2_34143e0f2dea.slice/crio-708f34e8b40820a427d2237b83e78793762f9dc15833c3b1aa5af04ce9c1488d WatchSource:0}: Error finding container 708f34e8b40820a427d2237b83e78793762f9dc15833c3b1aa5af04ce9c1488d: Status 404 returned error can't find the container with id 708f34e8b40820a427d2237b83e78793762f9dc15833c3b1aa5af04ce9c1488d
Apr 23 16:37:59.506935 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:59.506908 2571 scope.go:117] "RemoveContainer" containerID="77d91fc06ecd1bc350feeceb439a991335cbd069617503ae86f55d0496d02993"
Apr 23 16:37:59.975485 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:59.975445 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mfgns" event={"ID":"1026d702-16e3-45e1-821e-0f0a702f27d3","Type":"ContainerStarted","Data":"c915621ae92bc895e0ad8ee6957a435913e72618f74909ff31e3296badc99156"}
Apr 23 16:37:59.977928 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:59.977901 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8sjkd_036b63b0-d570-44cc-b606-bb46f38e6753/console-operator/1.log"
Apr 23 16:37:59.978055 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:59.978002 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" event={"ID":"036b63b0-d570-44cc-b606-bb46f38e6753","Type":"ContainerStarted","Data":"ebc046aade112221c46ab77a4dfb08813f3e9918a178eb3603d5a76e62783bdc"}
Apr 23 16:37:59.979353 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:59.978424 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd"
Apr 23 16:37:59.980468 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:59.980430 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bbhgp" event={"ID":"ec4a3bc4-5872-47af-b0d2-34143e0f2dea","Type":"ContainerStarted","Data":"708f34e8b40820a427d2237b83e78793762f9dc15833c3b1aa5af04ce9c1488d"}
Apr 23 16:37:59.997074 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:37:59.997025 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" podStartSLOduration=24.401070635 podStartE2EDuration="25.997010277s" podCreationTimestamp="2026-04-23 16:37:34 +0000 UTC" firstStartedPulling="2026-04-23 16:37:34.938454331 +0000 UTC m=+138.019896877" lastFinishedPulling="2026-04-23 16:37:36.534393972 +0000 UTC m=+139.615836519" observedRunningTime="2026-04-23 16:37:59.996740953 +0000 UTC m=+163.078183524" watchObservedRunningTime="2026-04-23 16:37:59.997010277 +0000 UTC m=+163.078452847"
Apr 23 16:38:00.687751 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.687720 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-7fsct"]
Apr 23 16:38:00.690883 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.690862 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7fsct"
Apr 23 16:38:00.693507 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.693487 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 23 16:38:00.695161 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.694814 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-8vfm6\""
Apr 23 16:38:00.695161 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.694946 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 23 16:38:00.708991 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.708966 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-7fsct"]
Apr 23 16:38:00.767873 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.767841 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bb5d4845-68ad-41c5-bc80-0d399b962c20-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-7fsct\" (UID: \"bb5d4845-68ad-41c5-bc80-0d399b962c20\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7fsct"
Apr 23 16:38:00.768027 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.767913 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bb5d4845-68ad-41c5-bc80-0d399b962c20-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7fsct\" (UID: \"bb5d4845-68ad-41c5-bc80-0d399b962c20\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7fsct"
Apr 23 16:38:00.789419 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.789386 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6mngq"]
Apr 23 16:38:00.792373 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.792353 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7fc469b84-sgvkz"]
Apr 23 16:38:00.792553 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.792528 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6mngq"
Apr 23 16:38:00.795422 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.795400 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7fc469b84-sgvkz"
Apr 23 16:38:00.797880 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.797729 2571 status_manager.go:895] "Failed to get status for pod" podUID="5c592392-1c6a-4812-86ed-e1ed2f002ce0" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6mngq" err="pods \"network-check-source-8894fc9bd-6mngq\" is forbidden: User \"system:node:ip-10-0-136-27.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-network-diagnostics\": no relationship found between node 'ip-10-0-136-27.ec2.internal' and this object"
Apr 23 16:38:00.797880 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:38:00.797761 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"trusted-ca\" is forbidden: User \"system:node:ip-10-0-136-27.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'ip-10-0-136-27.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" type="*v1.ConfigMap"
Apr 23 16:38:00.798039 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:38:00.797987 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"network-diagnostics-dockercfg-flnpv\" is forbidden: User \"system:node:ip-10-0-136-27.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-network-diagnostics\": no relationship found between node 'ip-10-0-136-27.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-flnpv\"" type="*v1.Secret"
Apr 23 16:38:00.798039 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:38:00.798008 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"image-registry-private-configuration\" is forbidden: User
\"system:node:ip-10-0-136-27.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'ip-10-0-136-27.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" type="*v1.Secret" Apr 23 16:38:00.798722 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.798681 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-x7fs6\"" Apr 23 16:38:00.799215 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:38:00.799181 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"installation-pull-secrets\" is forbidden: User \"system:node:ip-10-0-136-27.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'ip-10-0-136-27.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" type="*v1.Secret" Apr 23 16:38:00.799438 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:38:00.799414 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"image-registry-tls\" is forbidden: User \"system:node:ip-10-0-136-27.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'ip-10-0-136-27.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" type="*v1.Secret" Apr 23 16:38:00.816748 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.816725 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6mngq"] Apr 23 16:38:00.823469 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.823448 2571 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-image-registry/image-registry-7fc469b84-sgvkz"] Apr 23 16:38:00.843623 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.843593 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-z2gnb"] Apr 23 16:38:00.847869 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.847847 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-z2gnb" Apr 23 16:38:00.850511 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.850488 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 16:38:00.850611 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.850519 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-kd74h\"" Apr 23 16:38:00.851048 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.851031 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 16:38:00.861612 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.861589 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7fc469b84-sgvkz"] Apr 23 16:38:00.861858 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:38:00.861830 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[bound-sa-token ca-trust-extracted image-registry-private-configuration installation-pull-secrets kube-api-access-qknqr registry-certificates registry-tls trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" podUID="f38d7102-b6ce-4938-8954-91d3894e8c7c" Apr 23 16:38:00.863263 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.863241 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-runtime-extractor-z2gnb"] Apr 23 16:38:00.868389 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.868368 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f38d7102-b6ce-4938-8954-91d3894e8c7c-image-registry-private-configuration\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.868493 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.868407 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvls7\" (UniqueName: \"kubernetes.io/projected/5c592392-1c6a-4812-86ed-e1ed2f002ce0-kube-api-access-cvls7\") pod \"network-check-source-8894fc9bd-6mngq\" (UID: \"5c592392-1c6a-4812-86ed-e1ed2f002ce0\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6mngq" Apr 23 16:38:00.868493 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.868439 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f38d7102-b6ce-4938-8954-91d3894e8c7c-trusted-ca\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.868607 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.868489 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bb5d4845-68ad-41c5-bc80-0d399b962c20-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-7fsct\" (UID: \"bb5d4845-68ad-41c5-bc80-0d399b962c20\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7fsct" Apr 23 16:38:00.868607 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.868579 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f38d7102-b6ce-4938-8954-91d3894e8c7c-ca-trust-extracted\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.868731 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.868638 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f38d7102-b6ce-4938-8954-91d3894e8c7c-registry-tls\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.868731 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.868681 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f38d7102-b6ce-4938-8954-91d3894e8c7c-installation-pull-secrets\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.868838 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.868759 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bb5d4845-68ad-41c5-bc80-0d399b962c20-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7fsct\" (UID: \"bb5d4845-68ad-41c5-bc80-0d399b962c20\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7fsct" Apr 23 16:38:00.868838 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.868792 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qknqr\" (UniqueName: 
\"kubernetes.io/projected/f38d7102-b6ce-4938-8954-91d3894e8c7c-kube-api-access-qknqr\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.868838 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.868821 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f38d7102-b6ce-4938-8954-91d3894e8c7c-bound-sa-token\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.868982 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.868865 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f38d7102-b6ce-4938-8954-91d3894e8c7c-registry-certificates\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.869342 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.869298 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bb5d4845-68ad-41c5-bc80-0d399b962c20-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-7fsct\" (UID: \"bb5d4845-68ad-41c5-bc80-0d399b962c20\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7fsct" Apr 23 16:38:00.871687 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.871666 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bb5d4845-68ad-41c5-bc80-0d399b962c20-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7fsct\" (UID: \"bb5d4845-68ad-41c5-bc80-0d399b962c20\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-7fsct" Apr 23 16:38:00.969421 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.969385 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f38d7102-b6ce-4938-8954-91d3894e8c7c-ca-trust-extracted\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.969594 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.969435 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkc8v\" (UniqueName: \"kubernetes.io/projected/cf289e76-9f00-4d50-b667-8c9cee95e651-kube-api-access-dkc8v\") pod \"insights-runtime-extractor-z2gnb\" (UID: \"cf289e76-9f00-4d50-b667-8c9cee95e651\") " pod="openshift-insights/insights-runtime-extractor-z2gnb" Apr 23 16:38:00.969594 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.969571 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f38d7102-b6ce-4938-8954-91d3894e8c7c-registry-tls\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.969724 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.969635 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f38d7102-b6ce-4938-8954-91d3894e8c7c-installation-pull-secrets\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.969724 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.969666 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cf289e76-9f00-4d50-b667-8c9cee95e651-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-z2gnb\" (UID: \"cf289e76-9f00-4d50-b667-8c9cee95e651\") " pod="openshift-insights/insights-runtime-extractor-z2gnb" Apr 23 16:38:00.969818 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.969727 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cf289e76-9f00-4d50-b667-8c9cee95e651-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-z2gnb\" (UID: \"cf289e76-9f00-4d50-b667-8c9cee95e651\") " pod="openshift-insights/insights-runtime-extractor-z2gnb" Apr 23 16:38:00.969818 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.969769 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qknqr\" (UniqueName: \"kubernetes.io/projected/f38d7102-b6ce-4938-8954-91d3894e8c7c-kube-api-access-qknqr\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.969818 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.969799 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f38d7102-b6ce-4938-8954-91d3894e8c7c-bound-sa-token\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.969984 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.969835 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f38d7102-b6ce-4938-8954-91d3894e8c7c-ca-trust-extracted\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " 
pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.969984 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.969854 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f38d7102-b6ce-4938-8954-91d3894e8c7c-registry-certificates\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.969984 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.969880 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cf289e76-9f00-4d50-b667-8c9cee95e651-crio-socket\") pod \"insights-runtime-extractor-z2gnb\" (UID: \"cf289e76-9f00-4d50-b667-8c9cee95e651\") " pod="openshift-insights/insights-runtime-extractor-z2gnb" Apr 23 16:38:00.969984 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.969913 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cf289e76-9f00-4d50-b667-8c9cee95e651-data-volume\") pod \"insights-runtime-extractor-z2gnb\" (UID: \"cf289e76-9f00-4d50-b667-8c9cee95e651\") " pod="openshift-insights/insights-runtime-extractor-z2gnb" Apr 23 16:38:00.969984 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.969953 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f38d7102-b6ce-4938-8954-91d3894e8c7c-image-registry-private-configuration\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.970198 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.969991 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-cvls7\" (UniqueName: \"kubernetes.io/projected/5c592392-1c6a-4812-86ed-e1ed2f002ce0-kube-api-access-cvls7\") pod \"network-check-source-8894fc9bd-6mngq\" (UID: \"5c592392-1c6a-4812-86ed-e1ed2f002ce0\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6mngq" Apr 23 16:38:00.970198 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.970031 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f38d7102-b6ce-4938-8954-91d3894e8c7c-trusted-ca\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.971270 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.971235 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f38d7102-b6ce-4938-8954-91d3894e8c7c-registry-certificates\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.978354 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.978314 2571 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-8sjkd container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.134.0.11:8443/readyz\": context deadline exceeded" start-of-body= Apr 23 16:38:00.978678 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.978374 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" podUID="036b63b0-d570-44cc-b606-bb46f38e6753" containerName="console-operator" probeResult="failure" output="Get \"https://10.134.0.11:8443/readyz\": context deadline exceeded" Apr 23 16:38:00.982265 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.982239 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f38d7102-b6ce-4938-8954-91d3894e8c7c-bound-sa-token\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.982372 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.982264 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvls7\" (UniqueName: \"kubernetes.io/projected/5c592392-1c6a-4812-86ed-e1ed2f002ce0-kube-api-access-cvls7\") pod \"network-check-source-8894fc9bd-6mngq\" (UID: \"5c592392-1c6a-4812-86ed-e1ed2f002ce0\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6mngq" Apr 23 16:38:00.982455 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.982434 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qknqr\" (UniqueName: \"kubernetes.io/projected/f38d7102-b6ce-4938-8954-91d3894e8c7c-kube-api-access-qknqr\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.984112 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.984062 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:00.999569 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:00.999547 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:01.001933 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.001913 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7fsct" Apr 23 16:38:01.070563 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.070532 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f38d7102-b6ce-4938-8954-91d3894e8c7c-ca-trust-extracted\") pod \"f38d7102-b6ce-4938-8954-91d3894e8c7c\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " Apr 23 16:38:01.070754 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.070581 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f38d7102-b6ce-4938-8954-91d3894e8c7c-registry-certificates\") pod \"f38d7102-b6ce-4938-8954-91d3894e8c7c\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " Apr 23 16:38:01.070754 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.070630 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f38d7102-b6ce-4938-8954-91d3894e8c7c-bound-sa-token\") pod \"f38d7102-b6ce-4938-8954-91d3894e8c7c\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " Apr 23 16:38:01.070754 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.070655 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qknqr\" (UniqueName: \"kubernetes.io/projected/f38d7102-b6ce-4938-8954-91d3894e8c7c-kube-api-access-qknqr\") pod \"f38d7102-b6ce-4938-8954-91d3894e8c7c\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " Apr 23 16:38:01.070933 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.070818 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cf289e76-9f00-4d50-b667-8c9cee95e651-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-z2gnb\" (UID: \"cf289e76-9f00-4d50-b667-8c9cee95e651\") " 
pod="openshift-insights/insights-runtime-extractor-z2gnb" Apr 23 16:38:01.070933 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.070787 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f38d7102-b6ce-4938-8954-91d3894e8c7c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f38d7102-b6ce-4938-8954-91d3894e8c7c" (UID: "f38d7102-b6ce-4938-8954-91d3894e8c7c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:38:01.070933 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.070860 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cf289e76-9f00-4d50-b667-8c9cee95e651-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-z2gnb\" (UID: \"cf289e76-9f00-4d50-b667-8c9cee95e651\") " pod="openshift-insights/insights-runtime-extractor-z2gnb" Apr 23 16:38:01.070933 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.070914 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cf289e76-9f00-4d50-b667-8c9cee95e651-crio-socket\") pod \"insights-runtime-extractor-z2gnb\" (UID: \"cf289e76-9f00-4d50-b667-8c9cee95e651\") " pod="openshift-insights/insights-runtime-extractor-z2gnb" Apr 23 16:38:01.071114 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.070960 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f38d7102-b6ce-4938-8954-91d3894e8c7c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f38d7102-b6ce-4938-8954-91d3894e8c7c" (UID: "f38d7102-b6ce-4938-8954-91d3894e8c7c"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:01.071114 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.070971 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cf289e76-9f00-4d50-b667-8c9cee95e651-data-volume\") pod \"insights-runtime-extractor-z2gnb\" (UID: \"cf289e76-9f00-4d50-b667-8c9cee95e651\") " pod="openshift-insights/insights-runtime-extractor-z2gnb" Apr 23 16:38:01.071114 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.071039 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cf289e76-9f00-4d50-b667-8c9cee95e651-crio-socket\") pod \"insights-runtime-extractor-z2gnb\" (UID: \"cf289e76-9f00-4d50-b667-8c9cee95e651\") " pod="openshift-insights/insights-runtime-extractor-z2gnb" Apr 23 16:38:01.071114 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.071046 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkc8v\" (UniqueName: \"kubernetes.io/projected/cf289e76-9f00-4d50-b667-8c9cee95e651-kube-api-access-dkc8v\") pod \"insights-runtime-extractor-z2gnb\" (UID: \"cf289e76-9f00-4d50-b667-8c9cee95e651\") " pod="openshift-insights/insights-runtime-extractor-z2gnb" Apr 23 16:38:01.071296 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.071133 2571 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f38d7102-b6ce-4938-8954-91d3894e8c7c-ca-trust-extracted\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:38:01.071664 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.071447 2571 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f38d7102-b6ce-4938-8954-91d3894e8c7c-registry-certificates\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:38:01.071664 
ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.071473 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cf289e76-9f00-4d50-b667-8c9cee95e651-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-z2gnb\" (UID: \"cf289e76-9f00-4d50-b667-8c9cee95e651\") " pod="openshift-insights/insights-runtime-extractor-z2gnb" Apr 23 16:38:01.071664 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.071596 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cf289e76-9f00-4d50-b667-8c9cee95e651-data-volume\") pod \"insights-runtime-extractor-z2gnb\" (UID: \"cf289e76-9f00-4d50-b667-8c9cee95e651\") " pod="openshift-insights/insights-runtime-extractor-z2gnb" Apr 23 16:38:01.073026 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.072997 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f38d7102-b6ce-4938-8954-91d3894e8c7c-kube-api-access-qknqr" (OuterVolumeSpecName: "kube-api-access-qknqr") pod "f38d7102-b6ce-4938-8954-91d3894e8c7c" (UID: "f38d7102-b6ce-4938-8954-91d3894e8c7c"). InnerVolumeSpecName "kube-api-access-qknqr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:38:01.073464 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.073438 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f38d7102-b6ce-4938-8954-91d3894e8c7c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f38d7102-b6ce-4938-8954-91d3894e8c7c" (UID: "f38d7102-b6ce-4938-8954-91d3894e8c7c"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:38:01.073597 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.073507 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cf289e76-9f00-4d50-b667-8c9cee95e651-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-z2gnb\" (UID: \"cf289e76-9f00-4d50-b667-8c9cee95e651\") " pod="openshift-insights/insights-runtime-extractor-z2gnb" Apr 23 16:38:01.087160 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.087135 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkc8v\" (UniqueName: \"kubernetes.io/projected/cf289e76-9f00-4d50-b667-8c9cee95e651-kube-api-access-dkc8v\") pod \"insights-runtime-extractor-z2gnb\" (UID: \"cf289e76-9f00-4d50-b667-8c9cee95e651\") " pod="openshift-insights/insights-runtime-extractor-z2gnb" Apr 23 16:38:01.160509 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.160482 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-z2gnb" Apr 23 16:38:01.172594 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.172533 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f38d7102-b6ce-4938-8954-91d3894e8c7c-bound-sa-token\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:38:01.172594 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.172565 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qknqr\" (UniqueName: \"kubernetes.io/projected/f38d7102-b6ce-4938-8954-91d3894e8c7c-kube-api-access-qknqr\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:38:01.220163 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.220062 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-8sjkd" Apr 23 16:38:01.318072 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.318044 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-7fsct"] Apr 23 16:38:01.319418 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:38:01.319390 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb5d4845_68ad_41c5_bc80_0d399b962c20.slice/crio-db4735445723fef72c5c6d9f999f87eeb49d45be8d4712babf7691524b32cd27 WatchSource:0}: Error finding container db4735445723fef72c5c6d9f999f87eeb49d45be8d4712babf7691524b32cd27: Status 404 returned error can't find the container with id db4735445723fef72c5c6d9f999f87eeb49d45be8d4712babf7691524b32cd27 Apr 23 16:38:01.380860 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.380830 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-z2gnb"] Apr 23 16:38:01.885340 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.885272 2571 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 16:38:01.892748 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.892725 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f38d7102-b6ce-4938-8954-91d3894e8c7c-installation-pull-secrets\") pod \"image-registry-7fc469b84-sgvkz\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:01.970531 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:38:01.970499 2571 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-private-configuration: failed to sync secret cache: timed out waiting for the condition Apr 23 16:38:01.970531 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:38:01.970512 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: failed to sync secret cache: timed out waiting for the condition Apr 23 16:38:01.970788 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:38:01.970546 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7fc469b84-sgvkz: failed to sync secret cache: timed out waiting for the condition Apr 23 16:38:01.970788 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:38:01.970508 2571 configmap.go:193] Couldn't get configMap openshift-image-registry/trusted-ca: failed to sync configmap cache: timed out waiting for the condition Apr 23 16:38:01.970788 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:38:01.970603 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f38d7102-b6ce-4938-8954-91d3894e8c7c-registry-tls podName:f38d7102-b6ce-4938-8954-91d3894e8c7c nodeName:}" failed. No retries permitted until 2026-04-23 16:38:02.470581835 +0000 UTC m=+165.552024387 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f38d7102-b6ce-4938-8954-91d3894e8c7c-registry-tls") pod "image-registry-7fc469b84-sgvkz" (UID: "f38d7102-b6ce-4938-8954-91d3894e8c7c") : failed to sync secret cache: timed out waiting for the condition Apr 23 16:38:01.970788 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:38:01.970630 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f38d7102-b6ce-4938-8954-91d3894e8c7c-trusted-ca podName:f38d7102-b6ce-4938-8954-91d3894e8c7c nodeName:}" failed. No retries permitted until 2026-04-23 16:38:02.470612086 +0000 UTC m=+165.552054636 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/f38d7102-b6ce-4938-8954-91d3894e8c7c-trusted-ca") pod "image-registry-7fc469b84-sgvkz" (UID: "f38d7102-b6ce-4938-8954-91d3894e8c7c") : failed to sync configmap cache: timed out waiting for the condition Apr 23 16:38:01.970788 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:38:01.970648 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f38d7102-b6ce-4938-8954-91d3894e8c7c-image-registry-private-configuration podName:f38d7102-b6ce-4938-8954-91d3894e8c7c nodeName:}" failed. No retries permitted until 2026-04-23 16:38:02.470637973 +0000 UTC m=+165.552080529 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "image-registry-private-configuration" (UniqueName: "kubernetes.io/secret/f38d7102-b6ce-4938-8954-91d3894e8c7c-image-registry-private-configuration") pod "image-registry-7fc469b84-sgvkz" (UID: "f38d7102-b6ce-4938-8954-91d3894e8c7c") : failed to sync secret cache: timed out waiting for the condition Apr 23 16:38:01.979967 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.979943 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f38d7102-b6ce-4938-8954-91d3894e8c7c-installation-pull-secrets\") pod \"f38d7102-b6ce-4938-8954-91d3894e8c7c\" (UID: \"f38d7102-b6ce-4938-8954-91d3894e8c7c\") " Apr 23 16:38:01.982288 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.982257 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f38d7102-b6ce-4938-8954-91d3894e8c7c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f38d7102-b6ce-4938-8954-91d3894e8c7c" (UID: "f38d7102-b6ce-4938-8954-91d3894e8c7c"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:38:01.988987 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.988961 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-z2gnb" event={"ID":"cf289e76-9f00-4d50-b667-8c9cee95e651","Type":"ContainerStarted","Data":"f1d6f7bc393b74c1af7920220ce64d5d0dff3dfd5c50a0120b364ac17346abe2"} Apr 23 16:38:01.988987 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.988997 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-z2gnb" event={"ID":"cf289e76-9f00-4d50-b667-8c9cee95e651","Type":"ContainerStarted","Data":"c93504e8ba23a1e951d26a9324ca8a2b64c85b636d62ab24c6fda080e2d1e998"} Apr 23 16:38:01.990166 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.990133 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7fsct" event={"ID":"bb5d4845-68ad-41c5-bc80-0d399b962c20","Type":"ContainerStarted","Data":"db4735445723fef72c5c6d9f999f87eeb49d45be8d4712babf7691524b32cd27"} Apr 23 16:38:01.991584 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.991561 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bbhgp" event={"ID":"ec4a3bc4-5872-47af-b0d2-34143e0f2dea","Type":"ContainerStarted","Data":"3abddfe3e8bb9f8ddcb52c3b2432c8625c2b3be998dc72034659a9a6a6e09620"} Apr 23 16:38:01.993101 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.993073 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mfgns" event={"ID":"1026d702-16e3-45e1-821e-0f0a702f27d3","Type":"ContainerStarted","Data":"7bcca8e19d8edf67098a398941df8c9ec2cd2e66e85a2fc4e827fff7c3bdb12d"} Apr 23 16:38:01.993206 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.993104 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mfgns" 
event={"ID":"1026d702-16e3-45e1-821e-0f0a702f27d3","Type":"ContainerStarted","Data":"7971ce71525869146c4dddfe68a83da2810f6e75bdd84d7c92556a55f5949405"} Apr 23 16:38:01.993466 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:01.993423 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7fc469b84-sgvkz" Apr 23 16:38:02.017864 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:02.017825 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bbhgp" podStartSLOduration=130.097377293 podStartE2EDuration="2m12.017813922s" podCreationTimestamp="2026-04-23 16:35:50 +0000 UTC" firstStartedPulling="2026-04-23 16:37:59.22840391 +0000 UTC m=+162.309846457" lastFinishedPulling="2026-04-23 16:38:01.148840525 +0000 UTC m=+164.230283086" observedRunningTime="2026-04-23 16:38:02.014939201 +0000 UTC m=+165.096381770" watchObservedRunningTime="2026-04-23 16:38:02.017813922 +0000 UTC m=+165.099256469" Apr 23 16:38:02.038543 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:02.038499 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mfgns" podStartSLOduration=130.104123433 podStartE2EDuration="2m12.03848368s" podCreationTimestamp="2026-04-23 16:35:50 +0000 UTC" firstStartedPulling="2026-04-23 16:37:59.209636879 +0000 UTC m=+162.291079427" lastFinishedPulling="2026-04-23 16:38:01.143997114 +0000 UTC m=+164.225439674" observedRunningTime="2026-04-23 16:38:02.038423119 +0000 UTC m=+165.119865688" watchObservedRunningTime="2026-04-23 16:38:02.03848368 +0000 UTC m=+165.119926249" Apr 23 16:38:02.070880 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:02.070618 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7fc469b84-sgvkz"] Apr 23 16:38:02.080279 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:02.080052 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-image-registry/image-registry-7fc469b84-sgvkz"] Apr 23 16:38:02.081903 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:02.081551 2571 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f38d7102-b6ce-4938-8954-91d3894e8c7c-installation-pull-secrets\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:38:02.105489 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:02.105104 2571 kubelet_pods.go:1019] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6mngq" secret="" err="failed to sync secret cache: timed out waiting for the condition" Apr 23 16:38:02.105489 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:02.105201 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6mngq" Apr 23 16:38:02.182351 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:02.182268 2571 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f38d7102-b6ce-4938-8954-91d3894e8c7c-image-registry-private-configuration\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:38:02.182351 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:02.182306 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f38d7102-b6ce-4938-8954-91d3894e8c7c-trusted-ca\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:38:02.182351 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:02.182319 2571 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f38d7102-b6ce-4938-8954-91d3894e8c7c-registry-tls\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:38:02.364859 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:02.364829 2571 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-flnpv\"" Apr 23 16:38:02.516177 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:02.516149 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6mngq"] Apr 23 16:38:02.519225 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:38:02.519196 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c592392_1c6a_4812_86ed_e1ed2f002ce0.slice/crio-316b30305a366f6c9f1f98ac9613598d5eb2efbfbf2d6d158941dffa52340333 WatchSource:0}: Error finding container 316b30305a366f6c9f1f98ac9613598d5eb2efbfbf2d6d158941dffa52340333: Status 404 returned error can't find the container with id 316b30305a366f6c9f1f98ac9613598d5eb2efbfbf2d6d158941dffa52340333 Apr 23 16:38:02.997260 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:02.997219 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-z2gnb" event={"ID":"cf289e76-9f00-4d50-b667-8c9cee95e651","Type":"ContainerStarted","Data":"65f2ebd00ad6047629ddc2da249115e53e79e5a3f2a5b4fe8a3d3f721bc251d7"} Apr 23 16:38:02.998627 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:02.998591 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6mngq" event={"ID":"5c592392-1c6a-4812-86ed-e1ed2f002ce0","Type":"ContainerStarted","Data":"d716c4d1c70a3154782422b0931a63406df8992cc4a7d926781371e6a455e9b4"} Apr 23 16:38:02.998792 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:02.998631 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6mngq" event={"ID":"5c592392-1c6a-4812-86ed-e1ed2f002ce0","Type":"ContainerStarted","Data":"316b30305a366f6c9f1f98ac9613598d5eb2efbfbf2d6d158941dffa52340333"} Apr 23 
16:38:02.999995 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:02.999970 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7fsct" event={"ID":"bb5d4845-68ad-41c5-bc80-0d399b962c20","Type":"ContainerStarted","Data":"30c3de3d0de1d0fbeb6974bc558b9a223cfa0fe0211c0957fba57d1523afb7b3"} Apr 23 16:38:03.000323 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:03.000305 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-mfgns" Apr 23 16:38:03.016493 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:03.016453 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6mngq" podStartSLOduration=3.016442567 podStartE2EDuration="3.016442567s" podCreationTimestamp="2026-04-23 16:38:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:38:03.016154853 +0000 UTC m=+166.097597424" watchObservedRunningTime="2026-04-23 16:38:03.016442567 +0000 UTC m=+166.097885135" Apr 23 16:38:03.033520 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:03.033472 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7fsct" podStartSLOduration=1.959800785 podStartE2EDuration="3.033456219s" podCreationTimestamp="2026-04-23 16:38:00 +0000 UTC" firstStartedPulling="2026-04-23 16:38:01.323028957 +0000 UTC m=+164.404471508" lastFinishedPulling="2026-04-23 16:38:02.396684392 +0000 UTC m=+165.478126942" observedRunningTime="2026-04-23 16:38:03.031971997 +0000 UTC m=+166.113414566" watchObservedRunningTime="2026-04-23 16:38:03.033456219 +0000 UTC m=+166.114898792" Apr 23 16:38:03.511474 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:03.511434 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f38d7102-b6ce-4938-8954-91d3894e8c7c" path="/var/lib/kubelet/pods/f38d7102-b6ce-4938-8954-91d3894e8c7c/volumes" Apr 23 16:38:03.992662 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:03.992638 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-br4xh"] Apr 23 16:38:03.996937 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:03.996916 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-br4xh" Apr 23 16:38:04.001833 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:04.001509 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 23 16:38:04.001833 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:04.001540 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 23 16:38:04.001833 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:04.001816 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 16:38:04.002365 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:04.002024 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-k4zpl\"" Apr 23 16:38:04.009435 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:04.009413 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-br4xh"] Apr 23 16:38:04.099986 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:04.099957 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddxn6\" (UniqueName: \"kubernetes.io/projected/65408c14-ea04-45c3-8378-2138509045ff-kube-api-access-ddxn6\") pod \"prometheus-operator-5676c8c784-br4xh\" (UID: 
\"65408c14-ea04-45c3-8378-2138509045ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-br4xh" Apr 23 16:38:04.100126 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:04.100035 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/65408c14-ea04-45c3-8378-2138509045ff-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-br4xh\" (UID: \"65408c14-ea04-45c3-8378-2138509045ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-br4xh" Apr 23 16:38:04.100126 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:04.100085 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/65408c14-ea04-45c3-8378-2138509045ff-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-br4xh\" (UID: \"65408c14-ea04-45c3-8378-2138509045ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-br4xh" Apr 23 16:38:04.100255 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:04.100135 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/65408c14-ea04-45c3-8378-2138509045ff-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-br4xh\" (UID: \"65408c14-ea04-45c3-8378-2138509045ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-br4xh" Apr 23 16:38:04.201108 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:04.201079 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/65408c14-ea04-45c3-8378-2138509045ff-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-br4xh\" (UID: \"65408c14-ea04-45c3-8378-2138509045ff\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-br4xh" Apr 23 16:38:04.201214 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:04.201124 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/65408c14-ea04-45c3-8378-2138509045ff-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-br4xh\" (UID: \"65408c14-ea04-45c3-8378-2138509045ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-br4xh" Apr 23 16:38:04.201214 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:04.201148 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/65408c14-ea04-45c3-8378-2138509045ff-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-br4xh\" (UID: \"65408c14-ea04-45c3-8378-2138509045ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-br4xh" Apr 23 16:38:04.201214 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:04.201186 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddxn6\" (UniqueName: \"kubernetes.io/projected/65408c14-ea04-45c3-8378-2138509045ff-kube-api-access-ddxn6\") pod \"prometheus-operator-5676c8c784-br4xh\" (UID: \"65408c14-ea04-45c3-8378-2138509045ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-br4xh" Apr 23 16:38:04.201320 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:38:04.201301 2571 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 23 16:38:04.201368 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:38:04.201357 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65408c14-ea04-45c3-8378-2138509045ff-prometheus-operator-tls podName:65408c14-ea04-45c3-8378-2138509045ff nodeName:}" failed. 
No retries permitted until 2026-04-23 16:38:04.701340749 +0000 UTC m=+167.782783298 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/65408c14-ea04-45c3-8378-2138509045ff-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-br4xh" (UID: "65408c14-ea04-45c3-8378-2138509045ff") : secret "prometheus-operator-tls" not found Apr 23 16:38:04.201785 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:04.201769 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/65408c14-ea04-45c3-8378-2138509045ff-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-br4xh\" (UID: \"65408c14-ea04-45c3-8378-2138509045ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-br4xh" Apr 23 16:38:04.203656 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:04.203634 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/65408c14-ea04-45c3-8378-2138509045ff-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-br4xh\" (UID: \"65408c14-ea04-45c3-8378-2138509045ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-br4xh" Apr 23 16:38:04.212321 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:04.212303 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddxn6\" (UniqueName: \"kubernetes.io/projected/65408c14-ea04-45c3-8378-2138509045ff-kube-api-access-ddxn6\") pod \"prometheus-operator-5676c8c784-br4xh\" (UID: \"65408c14-ea04-45c3-8378-2138509045ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-br4xh" Apr 23 16:38:04.704246 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:04.704214 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/65408c14-ea04-45c3-8378-2138509045ff-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-br4xh\" (UID: \"65408c14-ea04-45c3-8378-2138509045ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-br4xh" Apr 23 16:38:04.706514 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:04.706490 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/65408c14-ea04-45c3-8378-2138509045ff-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-br4xh\" (UID: \"65408c14-ea04-45c3-8378-2138509045ff\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-br4xh" Apr 23 16:38:04.919822 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:04.919787 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-br4xh" Apr 23 16:38:05.011053 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:05.011014 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-z2gnb" event={"ID":"cf289e76-9f00-4d50-b667-8c9cee95e651","Type":"ContainerStarted","Data":"d7283da133da1a8b1d94d61ba340e0a0a29bdf64f46ae43ea7f3d04f136f4360"} Apr 23 16:38:05.032532 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:05.032478 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-z2gnb" podStartSLOduration=2.516793627 podStartE2EDuration="5.032464002s" podCreationTimestamp="2026-04-23 16:38:00 +0000 UTC" firstStartedPulling="2026-04-23 16:38:01.463632989 +0000 UTC m=+164.545075536" lastFinishedPulling="2026-04-23 16:38:03.979303351 +0000 UTC m=+167.060745911" observedRunningTime="2026-04-23 16:38:05.031926604 +0000 UTC m=+168.113369176" watchObservedRunningTime="2026-04-23 16:38:05.032464002 +0000 UTC m=+168.113906570" Apr 23 16:38:05.036284 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:05.036261 2571 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-br4xh"] Apr 23 16:38:05.039122 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:38:05.039100 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65408c14_ea04_45c3_8378_2138509045ff.slice/crio-58fd899b020b21d41f5e1e6067e9290c6390f4bac7b9ff636650ffc5112508e1 WatchSource:0}: Error finding container 58fd899b020b21d41f5e1e6067e9290c6390f4bac7b9ff636650ffc5112508e1: Status 404 returned error can't find the container with id 58fd899b020b21d41f5e1e6067e9290c6390f4bac7b9ff636650ffc5112508e1 Apr 23 16:38:06.015059 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:06.014979 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-br4xh" event={"ID":"65408c14-ea04-45c3-8378-2138509045ff","Type":"ContainerStarted","Data":"58fd899b020b21d41f5e1e6067e9290c6390f4bac7b9ff636650ffc5112508e1"} Apr 23 16:38:07.511658 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:07.511625 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:38:08.022513 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:08.022483 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-br4xh" event={"ID":"65408c14-ea04-45c3-8378-2138509045ff","Type":"ContainerStarted","Data":"e4630e0f83b4496fc6d678d4d447138ff91f69024e7b52a07149353a782a4077"} Apr 23 16:38:08.022513 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:08.022515 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-br4xh" event={"ID":"65408c14-ea04-45c3-8378-2138509045ff","Type":"ContainerStarted","Data":"6ff2af4a7b000c95c6a03a880baced4705b88fe14c0652b0baea1944f671b3af"} Apr 23 16:38:08.044040 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:08.043992 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-br4xh" podStartSLOduration=3.115877794 podStartE2EDuration="5.04397872s" podCreationTimestamp="2026-04-23 16:38:03 +0000 UTC" firstStartedPulling="2026-04-23 16:38:05.040928713 +0000 UTC m=+168.122371260" lastFinishedPulling="2026-04-23 16:38:06.969029639 +0000 UTC m=+170.050472186" observedRunningTime="2026-04-23 16:38:08.042582762 +0000 UTC m=+171.124025341" watchObservedRunningTime="2026-04-23 16:38:08.04397872 +0000 UTC m=+171.125421289" Apr 23 16:38:10.382272 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.382232 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xzd44"] Apr 23 16:38:10.384907 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.384883 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44" Apr 23 16:38:10.389510 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.389491 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 23 16:38:10.389749 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.389733 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 23 16:38:10.389968 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.389916 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-c996x\"" Apr 23 16:38:10.390887 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.390860 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 23 16:38:10.399947 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.399929 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xzd44"] Apr 23 16:38:10.407765 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.407746 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9kztc"] Apr 23 16:38:10.409961 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.409947 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9kztc" Apr 23 16:38:10.412383 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.412351 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 16:38:10.412521 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.412493 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 16:38:10.412894 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.412880 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-292nf\"" Apr 23 16:38:10.413007 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.412992 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 16:38:10.448674 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.448647 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xzd44\" (UID: \"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44" Apr 23 16:38:10.448819 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.448720 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g24bj\" (UniqueName: \"kubernetes.io/projected/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-kube-api-access-g24bj\") pod \"kube-state-metrics-69db897b98-xzd44\" (UID: \"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44" Apr 23 16:38:10.448867 ip-10-0-136-27 
kubenswrapper[2571]: I0423 16:38:10.448817 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xzd44\" (UID: \"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44" Apr 23 16:38:10.448944 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.448866 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xzd44\" (UID: \"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44" Apr 23 16:38:10.448944 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.448919 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xzd44\" (UID: \"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44" Apr 23 16:38:10.449069 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.448944 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xzd44\" (UID: \"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44" Apr 23 16:38:10.550137 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.550106 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xzd44\" (UID: \"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44"
Apr 23 16:38:10.550273 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.550144 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-node-exporter-accelerators-collector-config\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.550273 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.550167 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-sys\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.550273 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.550195 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xzd44\" (UID: \"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44"
Apr 23 16:38:10.550273 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.550246 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g24bj\" (UniqueName: \"kubernetes.io/projected/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-kube-api-access-g24bj\") pod
\"kube-state-metrics-69db897b98-xzd44\" (UID: \"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44"
Apr 23 16:38:10.550413 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.550278 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-root\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.550413 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.550295 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-node-exporter-wtmp\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.550413 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.550312 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-node-exporter-textfile\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.550515 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.550404 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-metrics-client-ca\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.550515 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.550453 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for
volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xzd44\" (UID: \"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44"
Apr 23 16:38:10.550515 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.550493 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcdw9\" (UniqueName: \"kubernetes.io/projected/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-kube-api-access-wcdw9\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.550671 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:38:10.550532 2571 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 23 16:38:10.550671 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.550534 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xzd44\" (UID: \"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44"
Apr 23 16:38:10.550671 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.550561 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-node-exporter-tls\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.550671 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:38:10.550586 2571 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/secret/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-kube-state-metrics-tls podName:2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7 nodeName:}" failed. No retries permitted until 2026-04-23 16:38:11.050568432 +0000 UTC m=+174.132010984 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-xzd44" (UID: "2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7") : secret "kube-state-metrics-tls" not found
Apr 23 16:38:10.550671 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.550658 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.550882 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.550730 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xzd44\" (UID: \"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44"
Apr 23 16:38:10.550882 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.550873 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xzd44\" (UID: \"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44"
Apr 23 16:38:10.551040 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.551023 2571
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xzd44\" (UID: \"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44"
Apr 23 16:38:10.551412 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.551389 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xzd44\" (UID: \"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44"
Apr 23 16:38:10.553004 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.552982 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xzd44\" (UID: \"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44"
Apr 23 16:38:10.561010 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.560988 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g24bj\" (UniqueName: \"kubernetes.io/projected/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-kube-api-access-g24bj\") pod \"kube-state-metrics-69db897b98-xzd44\" (UID: \"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44"
Apr 23 16:38:10.651552 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.651461 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcdw9\" (UniqueName:
\"kubernetes.io/projected/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-kube-api-access-wcdw9\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.651552 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.651530 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-node-exporter-tls\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.651795 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.651569 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.651795 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.651608 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-node-exporter-accelerators-collector-config\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.651795 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.651647 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-sys\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.651795 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.651711 2571
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-root\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.651795 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.651736 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-node-exporter-wtmp\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.651795 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.651768 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-node-exporter-textfile\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.652092 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.651812 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-metrics-client-ca\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.652092 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.651869 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-root\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.652092 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.651923 2571
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-sys\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.652092 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.652028 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-node-exporter-wtmp\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.652296 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.652268 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-node-exporter-textfile\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.652592 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.652335 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-metrics-client-ca\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.652592 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.652339 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-node-exporter-accelerators-collector-config\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.654110 ip-10-0-136-27 kubenswrapper[2571]:
I0423 16:38:10.654089 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.654448 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.654426 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-node-exporter-tls\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.660629 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.660610 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcdw9\" (UniqueName: \"kubernetes.io/projected/7a0e4d71-73b2-469c-a183-6f2dd1c34d66-kube-api-access-wcdw9\") pod \"node-exporter-9kztc\" (UID: \"7a0e4d71-73b2-469c-a183-6f2dd1c34d66\") " pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.718768 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:10.718738 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/node-exporter-9kztc"
Apr 23 16:38:10.727171 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:38:10.727143 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a0e4d71_73b2_469c_a183_6f2dd1c34d66.slice/crio-2e459fc33f8902277e9714d6315909d36486311a6bf4b4b38e19d90ae7df353c WatchSource:0}: Error finding container 2e459fc33f8902277e9714d6315909d36486311a6bf4b4b38e19d90ae7df353c: Status 404 returned error can't find the container with id 2e459fc33f8902277e9714d6315909d36486311a6bf4b4b38e19d90ae7df353c
Apr 23 16:38:11.031182 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.031145 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9kztc" event={"ID":"7a0e4d71-73b2-469c-a183-6f2dd1c34d66","Type":"ContainerStarted","Data":"2e459fc33f8902277e9714d6315909d36486311a6bf4b4b38e19d90ae7df353c"}
Apr 23 16:38:11.055515 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.055488 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xzd44\" (UID: \"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44"
Apr 23 16:38:11.058092 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.058061 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xzd44\" (UID: \"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44"
Apr 23 16:38:11.293888 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.293812 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44"
Apr 23 16:38:11.485751 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.485729 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xzd44"]
Apr 23 16:38:11.488305 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:38:11.488197 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2eb8a4dd_b6bb_4cbb_849e_6fb6c06857b7.slice/crio-077efe07e06505feb24f19ac93512d288ece7df7e98c33efbbb4c04dd7717258 WatchSource:0}: Error finding container 077efe07e06505feb24f19ac93512d288ece7df7e98c33efbbb4c04dd7717258: Status 404 returned error can't find the container with id 077efe07e06505feb24f19ac93512d288ece7df7e98c33efbbb4c04dd7717258
Apr 23 16:38:11.502473 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.502455 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 16:38:11.511574 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.511547 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.517084 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.517066 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 23 16:38:11.517193 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.517065 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 23 16:38:11.517193 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.517065 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 23 16:38:11.517320 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.517074 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-4hrhq\""
Apr 23 16:38:11.517320 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.517082 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 23 16:38:11.517464 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.517094 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 23 16:38:11.517558 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.517102 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 23 16:38:11.517615 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.517576 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 23 16:38:11.517615 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.517586 2571 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 23 16:38:11.517782 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.517633 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 23 16:38:11.526048 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.526026 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 16:38:11.660469 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.660443 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.660592 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.660480 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-config-out\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.660592 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.660502 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.660592 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.660523 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.660733 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.660591 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.660733 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.660629 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.660733 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.660646 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.660733 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.660671 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") "
pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.660733 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.660719 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trg27\" (UniqueName: \"kubernetes.io/projected/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-kube-api-access-trg27\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.660884 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.660770 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-web-config\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.660884 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.660798 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-config-volume\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.660884 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.660829 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.660884 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.660873 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName:
\"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.761880 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.761846 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.762059 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.761893 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.762059 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.762023 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-config-out\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.762180 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.762074 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.762180 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.762105 2571
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.762180 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.762142 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.762338 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.762177 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.762338 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.762206 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:38:11.762338 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.762256 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") "
pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:38:11.762338 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.762290 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trg27\" (UniqueName: \"kubernetes.io/projected/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-kube-api-access-trg27\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:38:11.762338 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.762320 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-web-config\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:38:11.762558 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.762348 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-config-volume\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:38:11.762558 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.762389 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:38:11.763409 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.762891 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:38:11.763409 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.763120 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:38:11.763409 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.763153 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:38:11.766024 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.766001 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:38:11.766024 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.766015 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-web-config\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:38:11.766189 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.766154 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:38:11.766277 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.766257 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-config-out\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:38:11.766405 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.766384 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:38:11.766549 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.766531 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-config-volume\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:38:11.767154 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.767128 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:38:11.767466 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.767441 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" 
(UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:38:11.767688 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.767668 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:38:11.774089 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.774068 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trg27\" (UniqueName: \"kubernetes.io/projected/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-kube-api-access-trg27\") pod \"alertmanager-main-0\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:38:11.822553 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.822521 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:38:11.951342 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:11.951300 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 16:38:11.954570 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:38:11.954541 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e6a68c5_0ae9_481c_94a5_b786ece7db4e.slice/crio-cbb76a66130dbde0cba2aff0712b9625426ad87b41d47a23f8e58426d1a3fb54 WatchSource:0}: Error finding container cbb76a66130dbde0cba2aff0712b9625426ad87b41d47a23f8e58426d1a3fb54: Status 404 returned error can't find the container with id cbb76a66130dbde0cba2aff0712b9625426ad87b41d47a23f8e58426d1a3fb54 Apr 23 16:38:12.036792 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:12.036755 2571 generic.go:358] "Generic (PLEG): container finished" podID="7a0e4d71-73b2-469c-a183-6f2dd1c34d66" containerID="1968981565b802542f640f241d093fdc577ade8ad95d4856f8b02cfb6d2b29d3" exitCode=0 Apr 23 16:38:12.036959 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:12.036851 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9kztc" event={"ID":"7a0e4d71-73b2-469c-a183-6f2dd1c34d66","Type":"ContainerDied","Data":"1968981565b802542f640f241d093fdc577ade8ad95d4856f8b02cfb6d2b29d3"} Apr 23 16:38:12.038358 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:12.038335 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e6a68c5-0ae9-481c-94a5-b786ece7db4e","Type":"ContainerStarted","Data":"cbb76a66130dbde0cba2aff0712b9625426ad87b41d47a23f8e58426d1a3fb54"} Apr 23 16:38:12.039644 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:12.039616 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44" 
event={"ID":"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7","Type":"ContainerStarted","Data":"077efe07e06505feb24f19ac93512d288ece7df7e98c33efbbb4c04dd7717258"} Apr 23 16:38:13.005882 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:13.005853 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mfgns" Apr 23 16:38:13.045443 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:13.045403 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9kztc" event={"ID":"7a0e4d71-73b2-469c-a183-6f2dd1c34d66","Type":"ContainerStarted","Data":"4cc1982be4a93914a54bf67c453e4d46a94eeb959a8740c80d29168607a1d972"} Apr 23 16:38:13.045576 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:13.045451 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9kztc" event={"ID":"7a0e4d71-73b2-469c-a183-6f2dd1c34d66","Type":"ContainerStarted","Data":"6031025030f53159b086f628edd4810014688f60dbde4bae297a7a8eaf1e4743"} Apr 23 16:38:13.049396 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:13.048676 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e6a68c5-0ae9-481c-94a5-b786ece7db4e","Type":"ContainerStarted","Data":"03d5c1d59c4975c0aee4163caba2f0d6f30c9bc9de265fe17e32dbdc066fb5e5"} Apr 23 16:38:13.051321 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:13.051286 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44" event={"ID":"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7","Type":"ContainerStarted","Data":"d5dd08b5644603cf2422f09a13ceee5102c7069b26d7af0f31069a55b022982a"} Apr 23 16:38:13.074857 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:13.074800 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9kztc" podStartSLOduration=2.420484268 podStartE2EDuration="3.074780284s" 
podCreationTimestamp="2026-04-23 16:38:10 +0000 UTC" firstStartedPulling="2026-04-23 16:38:10.728832072 +0000 UTC m=+173.810274619" lastFinishedPulling="2026-04-23 16:38:11.383128088 +0000 UTC m=+174.464570635" observedRunningTime="2026-04-23 16:38:13.073825613 +0000 UTC m=+176.155268195" watchObservedRunningTime="2026-04-23 16:38:13.074780284 +0000 UTC m=+176.156222854" Apr 23 16:38:14.056086 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.056044 2571 generic.go:358] "Generic (PLEG): container finished" podID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerID="03d5c1d59c4975c0aee4163caba2f0d6f30c9bc9de265fe17e32dbdc066fb5e5" exitCode=0 Apr 23 16:38:14.056541 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.056091 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e6a68c5-0ae9-481c-94a5-b786ece7db4e","Type":"ContainerDied","Data":"03d5c1d59c4975c0aee4163caba2f0d6f30c9bc9de265fe17e32dbdc066fb5e5"} Apr 23 16:38:14.058236 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.058201 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44" event={"ID":"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7","Type":"ContainerStarted","Data":"7fb0f0361eb513f3df4f318c3489e680021602bb2efd63aa50ebbe1fe24744d7"} Apr 23 16:38:14.058356 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.058244 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44" event={"ID":"2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7","Type":"ContainerStarted","Data":"d717567ab5f99c5d964d9bf458908bf39769ab1ac3384183fcfc119d79f7e6ea"} Apr 23 16:38:14.080435 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.080379 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-xzd44" podStartSLOduration=2.641670897 podStartE2EDuration="4.080365905s" podCreationTimestamp="2026-04-23 
16:38:10 +0000 UTC" firstStartedPulling="2026-04-23 16:38:11.490528415 +0000 UTC m=+174.571970963" lastFinishedPulling="2026-04-23 16:38:12.929223425 +0000 UTC m=+176.010665971" observedRunningTime="2026-04-23 16:38:14.07806305 +0000 UTC m=+177.159505630" watchObservedRunningTime="2026-04-23 16:38:14.080365905 +0000 UTC m=+177.161808474" Apr 23 16:38:14.868434 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.868354 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5b568c66f8-rk9gr"] Apr 23 16:38:14.870774 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.870754 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:14.873998 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.873974 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 23 16:38:14.875605 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.875292 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 16:38:14.875605 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.875367 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 23 16:38:14.875605 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.875426 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-3uhhudnmug73i\"" Apr 23 16:38:14.875605 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.875446 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 23 16:38:14.875605 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.875300 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-dhn8m\"" Apr 23 16:38:14.885206 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.885183 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5b568c66f8-rk9gr"] Apr 23 16:38:14.993811 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.993786 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4sgz\" (UniqueName: \"kubernetes.io/projected/a7584369-a97b-4fb0-9628-0b8f04bb1761-kube-api-access-j4sgz\") pod \"metrics-server-5b568c66f8-rk9gr\" (UID: \"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:14.993956 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.993838 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7584369-a97b-4fb0-9628-0b8f04bb1761-client-ca-bundle\") pod \"metrics-server-5b568c66f8-rk9gr\" (UID: \"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:14.993956 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.993908 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a7584369-a97b-4fb0-9628-0b8f04bb1761-secret-metrics-server-client-certs\") pod \"metrics-server-5b568c66f8-rk9gr\" (UID: \"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:14.993956 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.993943 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7584369-a97b-4fb0-9628-0b8f04bb1761-configmap-kubelet-serving-ca-bundle\") pod 
\"metrics-server-5b568c66f8-rk9gr\" (UID: \"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:14.994054 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.993972 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a7584369-a97b-4fb0-9628-0b8f04bb1761-metrics-server-audit-profiles\") pod \"metrics-server-5b568c66f8-rk9gr\" (UID: \"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:14.994054 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.993995 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a7584369-a97b-4fb0-9628-0b8f04bb1761-secret-metrics-server-tls\") pod \"metrics-server-5b568c66f8-rk9gr\" (UID: \"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:14.994118 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:14.994088 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a7584369-a97b-4fb0-9628-0b8f04bb1761-audit-log\") pod \"metrics-server-5b568c66f8-rk9gr\" (UID: \"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:15.063856 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:15.063819 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e6a68c5-0ae9-481c-94a5-b786ece7db4e","Type":"ContainerStarted","Data":"a38c3f084c1f23c94d19cc94c4e7507e1559b7eaea67aec15c278ab36befadd6"} Apr 23 16:38:15.063856 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:15.063859 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e6a68c5-0ae9-481c-94a5-b786ece7db4e","Type":"ContainerStarted","Data":"7bc9def75708d8579f71c463b40a34103d9a255aaf6df6e57561643a9f17601e"} Apr 23 16:38:15.064254 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:15.063871 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e6a68c5-0ae9-481c-94a5-b786ece7db4e","Type":"ContainerStarted","Data":"0065ba9517a176f865ba3adfd1e4f482b6552aeee8fda7d6bb753e072c720073"} Apr 23 16:38:15.064254 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:15.063879 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e6a68c5-0ae9-481c-94a5-b786ece7db4e","Type":"ContainerStarted","Data":"64aaafacfe9a6f5db171caed79b698f43cbdcff79d0a2a704dc1bda89fbde1d1"} Apr 23 16:38:15.064254 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:15.063887 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e6a68c5-0ae9-481c-94a5-b786ece7db4e","Type":"ContainerStarted","Data":"c1a0d0499a03046bec9a1d1d35a03cab9338317c6dac7b0e8a3413357c949de0"} Apr 23 16:38:15.095096 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:15.095049 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4sgz\" (UniqueName: \"kubernetes.io/projected/a7584369-a97b-4fb0-9628-0b8f04bb1761-kube-api-access-j4sgz\") pod \"metrics-server-5b568c66f8-rk9gr\" (UID: \"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:15.095232 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:15.095117 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7584369-a97b-4fb0-9628-0b8f04bb1761-client-ca-bundle\") pod \"metrics-server-5b568c66f8-rk9gr\" (UID: 
\"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:15.095232 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:15.095152 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a7584369-a97b-4fb0-9628-0b8f04bb1761-secret-metrics-server-client-certs\") pod \"metrics-server-5b568c66f8-rk9gr\" (UID: \"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:15.095232 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:15.095174 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7584369-a97b-4fb0-9628-0b8f04bb1761-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5b568c66f8-rk9gr\" (UID: \"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:15.095232 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:15.095213 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a7584369-a97b-4fb0-9628-0b8f04bb1761-metrics-server-audit-profiles\") pod \"metrics-server-5b568c66f8-rk9gr\" (UID: \"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:15.095422 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:15.095241 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a7584369-a97b-4fb0-9628-0b8f04bb1761-secret-metrics-server-tls\") pod \"metrics-server-5b568c66f8-rk9gr\" (UID: \"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:15.095422 ip-10-0-136-27 kubenswrapper[2571]: I0423 
16:38:15.095298 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a7584369-a97b-4fb0-9628-0b8f04bb1761-audit-log\") pod \"metrics-server-5b568c66f8-rk9gr\" (UID: \"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:15.095685 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:15.095662 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a7584369-a97b-4fb0-9628-0b8f04bb1761-audit-log\") pod \"metrics-server-5b568c66f8-rk9gr\" (UID: \"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:15.096378 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:15.096341 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a7584369-a97b-4fb0-9628-0b8f04bb1761-metrics-server-audit-profiles\") pod \"metrics-server-5b568c66f8-rk9gr\" (UID: \"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:15.096493 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:15.096398 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7584369-a97b-4fb0-9628-0b8f04bb1761-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5b568c66f8-rk9gr\" (UID: \"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:15.098095 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:15.098053 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a7584369-a97b-4fb0-9628-0b8f04bb1761-secret-metrics-server-client-certs\") pod 
\"metrics-server-5b568c66f8-rk9gr\" (UID: \"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:15.098202 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:15.098166 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7584369-a97b-4fb0-9628-0b8f04bb1761-client-ca-bundle\") pod \"metrics-server-5b568c66f8-rk9gr\" (UID: \"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:15.098426 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:15.098411 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a7584369-a97b-4fb0-9628-0b8f04bb1761-secret-metrics-server-tls\") pod \"metrics-server-5b568c66f8-rk9gr\" (UID: \"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:15.105867 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:15.105814 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4sgz\" (UniqueName: \"kubernetes.io/projected/a7584369-a97b-4fb0-9628-0b8f04bb1761-kube-api-access-j4sgz\") pod \"metrics-server-5b568c66f8-rk9gr\" (UID: \"a7584369-a97b-4fb0-9628-0b8f04bb1761\") " pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:15.180928 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:15.180853 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:15.321207 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:15.321170 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5b568c66f8-rk9gr"] Apr 23 16:38:15.328391 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:38:15.328364 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7584369_a97b_4fb0_9628_0b8f04bb1761.slice/crio-71442aee44ae700e6eb42ad71f2f16832a23bb006832a45b0309c35e77023383 WatchSource:0}: Error finding container 71442aee44ae700e6eb42ad71f2f16832a23bb006832a45b0309c35e77023383: Status 404 returned error can't find the container with id 71442aee44ae700e6eb42ad71f2f16832a23bb006832a45b0309c35e77023383 Apr 23 16:38:16.068606 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:16.068563 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" event={"ID":"a7584369-a97b-4fb0-9628-0b8f04bb1761","Type":"ContainerStarted","Data":"71442aee44ae700e6eb42ad71f2f16832a23bb006832a45b0309c35e77023383"} Apr 23 16:38:16.072622 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:16.072582 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e6a68c5-0ae9-481c-94a5-b786ece7db4e","Type":"ContainerStarted","Data":"a6e1be211ea775d3f84e6dbbeb223a6584767849cfc5ed49c6033d3000a87a23"} Apr 23 16:38:16.150423 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:16.150309 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.7541301969999998 podStartE2EDuration="5.150248766s" podCreationTimestamp="2026-04-23 16:38:11 +0000 UTC" firstStartedPulling="2026-04-23 16:38:11.956518625 +0000 UTC m=+175.037961172" lastFinishedPulling="2026-04-23 16:38:15.352637194 +0000 UTC 
m=+178.434079741" observedRunningTime="2026-04-23 16:38:16.142194138 +0000 UTC m=+179.223636745" watchObservedRunningTime="2026-04-23 16:38:16.150248766 +0000 UTC m=+179.231691334" Apr 23 16:38:17.078744 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:17.078626 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" event={"ID":"a7584369-a97b-4fb0-9628-0b8f04bb1761","Type":"ContainerStarted","Data":"4dff9f690709a7bcb90533eb8eef11161dc642348a24b16e4f0bb0a935c1c2e5"} Apr 23 16:38:17.153205 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:17.153147 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" podStartSLOduration=1.7178328349999998 podStartE2EDuration="3.153127376s" podCreationTimestamp="2026-04-23 16:38:14 +0000 UTC" firstStartedPulling="2026-04-23 16:38:15.330165954 +0000 UTC m=+178.411608501" lastFinishedPulling="2026-04-23 16:38:16.765460494 +0000 UTC m=+179.846903042" observedRunningTime="2026-04-23 16:38:17.152834551 +0000 UTC m=+180.234277122" watchObservedRunningTime="2026-04-23 16:38:17.153127376 +0000 UTC m=+180.234569944" Apr 23 16:38:35.181621 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:35.181588 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:35.181621 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:35.181627 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:53.182130 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:53.182058 2571 generic.go:358] "Generic (PLEG): container finished" podID="b909c180-8bf1-44a4-8a87-c6d4756c787a" containerID="4fb4a87e1c0130a9d892556d3b87e758d813a5824e6b5fec5ec167f120c99ae3" exitCode=0 Apr 23 16:38:53.182514 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:53.182133 2571 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-d97rf" event={"ID":"b909c180-8bf1-44a4-8a87-c6d4756c787a","Type":"ContainerDied","Data":"4fb4a87e1c0130a9d892556d3b87e758d813a5824e6b5fec5ec167f120c99ae3"} Apr 23 16:38:53.182514 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:53.182480 2571 scope.go:117] "RemoveContainer" containerID="4fb4a87e1c0130a9d892556d3b87e758d813a5824e6b5fec5ec167f120c99ae3" Apr 23 16:38:54.186506 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:54.186471 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-d97rf" event={"ID":"b909c180-8bf1-44a4-8a87-c6d4756c787a","Type":"ContainerStarted","Data":"e8792a211ebbb399266e109bae5fac1d1d6334dce43ce7cbca50d98cadc19f8a"} Apr 23 16:38:55.186763 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:55.186730 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:38:55.190765 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:38:55.190742 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5b568c66f8-rk9gr" Apr 23 16:39:29.211402 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:29.211346 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs\") pod \"network-metrics-daemon-hsxbc\" (UID: \"9f25a094-e342-4690-8028-f1a3ddd77829\") " pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:39:29.213684 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:29.213663 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f25a094-e342-4690-8028-f1a3ddd77829-metrics-certs\") pod \"network-metrics-daemon-hsxbc\" (UID: \"9f25a094-e342-4690-8028-f1a3ddd77829\") " 
pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:39:29.416321 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:29.416291 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nldpm\"" Apr 23 16:39:29.422943 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:29.422926 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hsxbc" Apr 23 16:39:29.548760 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:29.548738 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hsxbc"] Apr 23 16:39:29.550655 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:39:29.550629 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f25a094_e342_4690_8028_f1a3ddd77829.slice/crio-0792bd685ec5a755792c082f9c6c1d5af143c622a2ba11fa190a9fd17bd5661f WatchSource:0}: Error finding container 0792bd685ec5a755792c082f9c6c1d5af143c622a2ba11fa190a9fd17bd5661f: Status 404 returned error can't find the container with id 0792bd685ec5a755792c082f9c6c1d5af143c622a2ba11fa190a9fd17bd5661f Apr 23 16:39:30.291063 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:30.291023 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hsxbc" event={"ID":"9f25a094-e342-4690-8028-f1a3ddd77829","Type":"ContainerStarted","Data":"0792bd685ec5a755792c082f9c6c1d5af143c622a2ba11fa190a9fd17bd5661f"} Apr 23 16:39:30.795199 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:30.795160 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 16:39:30.795683 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:30.795652 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" 
containerName="alertmanager" containerID="cri-o://c1a0d0499a03046bec9a1d1d35a03cab9338317c6dac7b0e8a3413357c949de0" gracePeriod=120 Apr 23 16:39:30.795835 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:30.795723 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="kube-rbac-proxy-metric" containerID="cri-o://a38c3f084c1f23c94d19cc94c4e7507e1559b7eaea67aec15c278ab36befadd6" gracePeriod=120 Apr 23 16:39:30.795835 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:30.795770 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="kube-rbac-proxy" containerID="cri-o://7bc9def75708d8579f71c463b40a34103d9a255aaf6df6e57561643a9f17601e" gracePeriod=120 Apr 23 16:39:30.795835 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:30.795806 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="config-reloader" containerID="cri-o://64aaafacfe9a6f5db171caed79b698f43cbdcff79d0a2a704dc1bda89fbde1d1" gracePeriod=120 Apr 23 16:39:30.795835 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:30.795820 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="prom-label-proxy" containerID="cri-o://a6e1be211ea775d3f84e6dbbeb223a6584767849cfc5ed49c6033d3000a87a23" gracePeriod=120 Apr 23 16:39:30.796022 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:30.795752 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="kube-rbac-proxy-web" 
containerID="cri-o://0065ba9517a176f865ba3adfd1e4f482b6552aeee8fda7d6bb753e072c720073" gracePeriod=120 Apr 23 16:39:31.305788 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:31.305751 2571 generic.go:358] "Generic (PLEG): container finished" podID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerID="a6e1be211ea775d3f84e6dbbeb223a6584767849cfc5ed49c6033d3000a87a23" exitCode=0 Apr 23 16:39:31.305788 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:31.305781 2571 generic.go:358] "Generic (PLEG): container finished" podID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerID="7bc9def75708d8579f71c463b40a34103d9a255aaf6df6e57561643a9f17601e" exitCode=0 Apr 23 16:39:31.305788 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:31.305790 2571 generic.go:358] "Generic (PLEG): container finished" podID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerID="64aaafacfe9a6f5db171caed79b698f43cbdcff79d0a2a704dc1bda89fbde1d1" exitCode=0 Apr 23 16:39:31.305788 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:31.305797 2571 generic.go:358] "Generic (PLEG): container finished" podID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerID="c1a0d0499a03046bec9a1d1d35a03cab9338317c6dac7b0e8a3413357c949de0" exitCode=0 Apr 23 16:39:31.306332 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:31.305815 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e6a68c5-0ae9-481c-94a5-b786ece7db4e","Type":"ContainerDied","Data":"a6e1be211ea775d3f84e6dbbeb223a6584767849cfc5ed49c6033d3000a87a23"} Apr 23 16:39:31.306332 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:31.305845 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e6a68c5-0ae9-481c-94a5-b786ece7db4e","Type":"ContainerDied","Data":"7bc9def75708d8579f71c463b40a34103d9a255aaf6df6e57561643a9f17601e"} Apr 23 16:39:31.306332 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:31.305856 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e6a68c5-0ae9-481c-94a5-b786ece7db4e","Type":"ContainerDied","Data":"64aaafacfe9a6f5db171caed79b698f43cbdcff79d0a2a704dc1bda89fbde1d1"} Apr 23 16:39:31.306332 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:31.305864 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e6a68c5-0ae9-481c-94a5-b786ece7db4e","Type":"ContainerDied","Data":"c1a0d0499a03046bec9a1d1d35a03cab9338317c6dac7b0e8a3413357c949de0"} Apr 23 16:39:31.307389 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:31.307356 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hsxbc" event={"ID":"9f25a094-e342-4690-8028-f1a3ddd77829","Type":"ContainerStarted","Data":"8dcc61bf69bb279637468e4ea3d15404de0f28a129eab4334b31adfd83b302d2"} Apr 23 16:39:31.307485 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:31.307392 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hsxbc" event={"ID":"9f25a094-e342-4690-8028-f1a3ddd77829","Type":"ContainerStarted","Data":"2c7036c8cf68141e15f31e9f83a3a6d68f80a2c57ecebde358cc060f6a4c6084"} Apr 23 16:39:32.038145 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.038125 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:32.070200 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.070156 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-hsxbc" podStartSLOduration=254.105807828 podStartE2EDuration="4m15.070142428s" podCreationTimestamp="2026-04-23 16:35:17 +0000 UTC" firstStartedPulling="2026-04-23 16:39:29.552344657 +0000 UTC m=+252.633787203" lastFinishedPulling="2026-04-23 16:39:30.516679256 +0000 UTC m=+253.598121803" observedRunningTime="2026-04-23 16:39:31.324242988 +0000 UTC m=+254.405685556" watchObservedRunningTime="2026-04-23 16:39:32.070142428 +0000 UTC m=+255.151584995" Apr 23 16:39:32.136098 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.136025 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-web-config\") pod \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " Apr 23 16:39:32.136098 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.136065 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-main-tls\") pod \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " Apr 23 16:39:32.136098 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.136094 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-alertmanager-main-db\") pod \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " Apr 23 16:39:32.136341 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.136122 2571 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-kube-rbac-proxy-web\") pod \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " Apr 23 16:39:32.136341 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.136161 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " Apr 23 16:39:32.136341 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.136198 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-kube-rbac-proxy\") pod \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " Apr 23 16:39:32.136341 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.136223 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-cluster-tls-config\") pod \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " Apr 23 16:39:32.136537 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.136483 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "1e6a68c5-0ae9-481c-94a5-b786ece7db4e" (UID: "1e6a68c5-0ae9-481c-94a5-b786ece7db4e"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:39:32.136621 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.136599 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trg27\" (UniqueName: \"kubernetes.io/projected/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-kube-api-access-trg27\") pod \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " Apr 23 16:39:32.136674 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.136650 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-config-out\") pod \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " Apr 23 16:39:32.136757 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.136683 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-tls-assets\") pod \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " Apr 23 16:39:32.136757 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.136738 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-config-volume\") pod \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " Apr 23 16:39:32.137021 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.136763 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-metrics-client-ca\") pod \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " Apr 23 16:39:32.137021 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.136974 
2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-alertmanager-trusted-ca-bundle\") pod \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\" (UID: \"1e6a68c5-0ae9-481c-94a5-b786ece7db4e\") " Apr 23 16:39:32.137474 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.137227 2571 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-alertmanager-main-db\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:39:32.137586 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.137533 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "1e6a68c5-0ae9-481c-94a5-b786ece7db4e" (UID: "1e6a68c5-0ae9-481c-94a5-b786ece7db4e"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:39:32.137679 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.137641 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "1e6a68c5-0ae9-481c-94a5-b786ece7db4e" (UID: "1e6a68c5-0ae9-481c-94a5-b786ece7db4e"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:39:32.139156 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.139128 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "1e6a68c5-0ae9-481c-94a5-b786ece7db4e" (UID: "1e6a68c5-0ae9-481c-94a5-b786ece7db4e"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:32.140014 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.139600 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "1e6a68c5-0ae9-481c-94a5-b786ece7db4e" (UID: "1e6a68c5-0ae9-481c-94a5-b786ece7db4e"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:32.140014 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.139813 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "1e6a68c5-0ae9-481c-94a5-b786ece7db4e" (UID: "1e6a68c5-0ae9-481c-94a5-b786ece7db4e"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:32.140014 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.139908 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-config-volume" (OuterVolumeSpecName: "config-volume") pod "1e6a68c5-0ae9-481c-94a5-b786ece7db4e" (UID: "1e6a68c5-0ae9-481c-94a5-b786ece7db4e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:32.140196 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.140091 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "1e6a68c5-0ae9-481c-94a5-b786ece7db4e" (UID: "1e6a68c5-0ae9-481c-94a5-b786ece7db4e"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:32.140644 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.140611 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-kube-api-access-trg27" (OuterVolumeSpecName: "kube-api-access-trg27") pod "1e6a68c5-0ae9-481c-94a5-b786ece7db4e" (UID: "1e6a68c5-0ae9-481c-94a5-b786ece7db4e"). InnerVolumeSpecName "kube-api-access-trg27". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:39:32.140842 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.140821 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-config-out" (OuterVolumeSpecName: "config-out") pod "1e6a68c5-0ae9-481c-94a5-b786ece7db4e" (UID: "1e6a68c5-0ae9-481c-94a5-b786ece7db4e"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:39:32.141200 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.141177 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1e6a68c5-0ae9-481c-94a5-b786ece7db4e" (UID: "1e6a68c5-0ae9-481c-94a5-b786ece7db4e"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:39:32.143727 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.143689 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "1e6a68c5-0ae9-481c-94a5-b786ece7db4e" (UID: "1e6a68c5-0ae9-481c-94a5-b786ece7db4e"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:32.149921 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.149901 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-web-config" (OuterVolumeSpecName: "web-config") pod "1e6a68c5-0ae9-481c-94a5-b786ece7db4e" (UID: "1e6a68c5-0ae9-481c-94a5-b786ece7db4e"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:32.237783 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.237760 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-main-tls\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:39:32.237783 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.237781 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:39:32.237917 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.237792 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-kube-rbac-proxy-metric\") on node 
\"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:39:32.237917 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.237801 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:39:32.237917 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.237811 2571 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-cluster-tls-config\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:39:32.237917 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.237819 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-trg27\" (UniqueName: \"kubernetes.io/projected/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-kube-api-access-trg27\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:39:32.237917 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.237827 2571 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-config-out\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:39:32.237917 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.237836 2571 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-tls-assets\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:39:32.237917 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.237843 2571 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-config-volume\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:39:32.237917 ip-10-0-136-27 
kubenswrapper[2571]: I0423 16:39:32.237850 2571 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-metrics-client-ca\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:39:32.237917 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.237858 2571 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:39:32.237917 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.237866 2571 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e6a68c5-0ae9-481c-94a5-b786ece7db4e-web-config\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:39:32.313183 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.313158 2571 generic.go:358] "Generic (PLEG): container finished" podID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerID="a38c3f084c1f23c94d19cc94c4e7507e1559b7eaea67aec15c278ab36befadd6" exitCode=0 Apr 23 16:39:32.313183 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.313177 2571 generic.go:358] "Generic (PLEG): container finished" podID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerID="0065ba9517a176f865ba3adfd1e4f482b6552aeee8fda7d6bb753e072c720073" exitCode=0 Apr 23 16:39:32.313567 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.313240 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e6a68c5-0ae9-481c-94a5-b786ece7db4e","Type":"ContainerDied","Data":"a38c3f084c1f23c94d19cc94c4e7507e1559b7eaea67aec15c278ab36befadd6"} Apr 23 16:39:32.313567 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.313262 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:32.313567 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.313284 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e6a68c5-0ae9-481c-94a5-b786ece7db4e","Type":"ContainerDied","Data":"0065ba9517a176f865ba3adfd1e4f482b6552aeee8fda7d6bb753e072c720073"} Apr 23 16:39:32.313567 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.313302 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e6a68c5-0ae9-481c-94a5-b786ece7db4e","Type":"ContainerDied","Data":"cbb76a66130dbde0cba2aff0712b9625426ad87b41d47a23f8e58426d1a3fb54"} Apr 23 16:39:32.313567 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.313324 2571 scope.go:117] "RemoveContainer" containerID="a6e1be211ea775d3f84e6dbbeb223a6584767849cfc5ed49c6033d3000a87a23" Apr 23 16:39:32.320674 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.320653 2571 scope.go:117] "RemoveContainer" containerID="a38c3f084c1f23c94d19cc94c4e7507e1559b7eaea67aec15c278ab36befadd6" Apr 23 16:39:32.327552 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.327535 2571 scope.go:117] "RemoveContainer" containerID="7bc9def75708d8579f71c463b40a34103d9a255aaf6df6e57561643a9f17601e" Apr 23 16:39:32.334061 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.334043 2571 scope.go:117] "RemoveContainer" containerID="0065ba9517a176f865ba3adfd1e4f482b6552aeee8fda7d6bb753e072c720073" Apr 23 16:39:32.340838 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.340777 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 16:39:32.341033 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.341013 2571 scope.go:117] "RemoveContainer" containerID="64aaafacfe9a6f5db171caed79b698f43cbdcff79d0a2a704dc1bda89fbde1d1" Apr 23 16:39:32.344189 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.344171 2571 
kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 16:39:32.347909 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.347894 2571 scope.go:117] "RemoveContainer" containerID="c1a0d0499a03046bec9a1d1d35a03cab9338317c6dac7b0e8a3413357c949de0" Apr 23 16:39:32.356272 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.356257 2571 scope.go:117] "RemoveContainer" containerID="03d5c1d59c4975c0aee4163caba2f0d6f30c9bc9de265fe17e32dbdc066fb5e5" Apr 23 16:39:32.362749 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.362728 2571 scope.go:117] "RemoveContainer" containerID="a6e1be211ea775d3f84e6dbbeb223a6584767849cfc5ed49c6033d3000a87a23" Apr 23 16:39:32.363080 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:39:32.363054 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6e1be211ea775d3f84e6dbbeb223a6584767849cfc5ed49c6033d3000a87a23\": container with ID starting with a6e1be211ea775d3f84e6dbbeb223a6584767849cfc5ed49c6033d3000a87a23 not found: ID does not exist" containerID="a6e1be211ea775d3f84e6dbbeb223a6584767849cfc5ed49c6033d3000a87a23" Apr 23 16:39:32.363144 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.363085 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6e1be211ea775d3f84e6dbbeb223a6584767849cfc5ed49c6033d3000a87a23"} err="failed to get container status \"a6e1be211ea775d3f84e6dbbeb223a6584767849cfc5ed49c6033d3000a87a23\": rpc error: code = NotFound desc = could not find container \"a6e1be211ea775d3f84e6dbbeb223a6584767849cfc5ed49c6033d3000a87a23\": container with ID starting with a6e1be211ea775d3f84e6dbbeb223a6584767849cfc5ed49c6033d3000a87a23 not found: ID does not exist" Apr 23 16:39:32.363144 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.363116 2571 scope.go:117] "RemoveContainer" containerID="a38c3f084c1f23c94d19cc94c4e7507e1559b7eaea67aec15c278ab36befadd6" Apr 23 
16:39:32.363337 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:39:32.363322 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a38c3f084c1f23c94d19cc94c4e7507e1559b7eaea67aec15c278ab36befadd6\": container with ID starting with a38c3f084c1f23c94d19cc94c4e7507e1559b7eaea67aec15c278ab36befadd6 not found: ID does not exist" containerID="a38c3f084c1f23c94d19cc94c4e7507e1559b7eaea67aec15c278ab36befadd6"
Apr 23 16:39:32.363388 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.363341 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38c3f084c1f23c94d19cc94c4e7507e1559b7eaea67aec15c278ab36befadd6"} err="failed to get container status \"a38c3f084c1f23c94d19cc94c4e7507e1559b7eaea67aec15c278ab36befadd6\": rpc error: code = NotFound desc = could not find container \"a38c3f084c1f23c94d19cc94c4e7507e1559b7eaea67aec15c278ab36befadd6\": container with ID starting with a38c3f084c1f23c94d19cc94c4e7507e1559b7eaea67aec15c278ab36befadd6 not found: ID does not exist"
Apr 23 16:39:32.363388 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.363360 2571 scope.go:117] "RemoveContainer" containerID="7bc9def75708d8579f71c463b40a34103d9a255aaf6df6e57561643a9f17601e"
Apr 23 16:39:32.363598 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:39:32.363579 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bc9def75708d8579f71c463b40a34103d9a255aaf6df6e57561643a9f17601e\": container with ID starting with 7bc9def75708d8579f71c463b40a34103d9a255aaf6df6e57561643a9f17601e not found: ID does not exist" containerID="7bc9def75708d8579f71c463b40a34103d9a255aaf6df6e57561643a9f17601e"
Apr 23 16:39:32.363645 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.363603 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bc9def75708d8579f71c463b40a34103d9a255aaf6df6e57561643a9f17601e"} err="failed to get container status \"7bc9def75708d8579f71c463b40a34103d9a255aaf6df6e57561643a9f17601e\": rpc error: code = NotFound desc = could not find container \"7bc9def75708d8579f71c463b40a34103d9a255aaf6df6e57561643a9f17601e\": container with ID starting with 7bc9def75708d8579f71c463b40a34103d9a255aaf6df6e57561643a9f17601e not found: ID does not exist"
Apr 23 16:39:32.363645 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.363620 2571 scope.go:117] "RemoveContainer" containerID="0065ba9517a176f865ba3adfd1e4f482b6552aeee8fda7d6bb753e072c720073"
Apr 23 16:39:32.363852 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:39:32.363833 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0065ba9517a176f865ba3adfd1e4f482b6552aeee8fda7d6bb753e072c720073\": container with ID starting with 0065ba9517a176f865ba3adfd1e4f482b6552aeee8fda7d6bb753e072c720073 not found: ID does not exist" containerID="0065ba9517a176f865ba3adfd1e4f482b6552aeee8fda7d6bb753e072c720073"
Apr 23 16:39:32.363895 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.363856 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0065ba9517a176f865ba3adfd1e4f482b6552aeee8fda7d6bb753e072c720073"} err="failed to get container status \"0065ba9517a176f865ba3adfd1e4f482b6552aeee8fda7d6bb753e072c720073\": rpc error: code = NotFound desc = could not find container \"0065ba9517a176f865ba3adfd1e4f482b6552aeee8fda7d6bb753e072c720073\": container with ID starting with 0065ba9517a176f865ba3adfd1e4f482b6552aeee8fda7d6bb753e072c720073 not found: ID does not exist"
Apr 23 16:39:32.363895 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.363871 2571 scope.go:117] "RemoveContainer" containerID="64aaafacfe9a6f5db171caed79b698f43cbdcff79d0a2a704dc1bda89fbde1d1"
Apr 23 16:39:32.364065 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:39:32.364050 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64aaafacfe9a6f5db171caed79b698f43cbdcff79d0a2a704dc1bda89fbde1d1\": container with ID starting with 64aaafacfe9a6f5db171caed79b698f43cbdcff79d0a2a704dc1bda89fbde1d1 not found: ID does not exist" containerID="64aaafacfe9a6f5db171caed79b698f43cbdcff79d0a2a704dc1bda89fbde1d1"
Apr 23 16:39:32.364104 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.364069 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64aaafacfe9a6f5db171caed79b698f43cbdcff79d0a2a704dc1bda89fbde1d1"} err="failed to get container status \"64aaafacfe9a6f5db171caed79b698f43cbdcff79d0a2a704dc1bda89fbde1d1\": rpc error: code = NotFound desc = could not find container \"64aaafacfe9a6f5db171caed79b698f43cbdcff79d0a2a704dc1bda89fbde1d1\": container with ID starting with 64aaafacfe9a6f5db171caed79b698f43cbdcff79d0a2a704dc1bda89fbde1d1 not found: ID does not exist"
Apr 23 16:39:32.364104 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.364083 2571 scope.go:117] "RemoveContainer" containerID="c1a0d0499a03046bec9a1d1d35a03cab9338317c6dac7b0e8a3413357c949de0"
Apr 23 16:39:32.364288 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:39:32.364274 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1a0d0499a03046bec9a1d1d35a03cab9338317c6dac7b0e8a3413357c949de0\": container with ID starting with c1a0d0499a03046bec9a1d1d35a03cab9338317c6dac7b0e8a3413357c949de0 not found: ID does not exist" containerID="c1a0d0499a03046bec9a1d1d35a03cab9338317c6dac7b0e8a3413357c949de0"
Apr 23 16:39:32.364333 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.364290 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a0d0499a03046bec9a1d1d35a03cab9338317c6dac7b0e8a3413357c949de0"} err="failed to get container status \"c1a0d0499a03046bec9a1d1d35a03cab9338317c6dac7b0e8a3413357c949de0\": rpc error: code = NotFound desc = could not find container \"c1a0d0499a03046bec9a1d1d35a03cab9338317c6dac7b0e8a3413357c949de0\": container with ID starting with c1a0d0499a03046bec9a1d1d35a03cab9338317c6dac7b0e8a3413357c949de0 not found: ID does not exist"
Apr 23 16:39:32.364333 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.364301 2571 scope.go:117] "RemoveContainer" containerID="03d5c1d59c4975c0aee4163caba2f0d6f30c9bc9de265fe17e32dbdc066fb5e5"
Apr 23 16:39:32.364507 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:39:32.364489 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d5c1d59c4975c0aee4163caba2f0d6f30c9bc9de265fe17e32dbdc066fb5e5\": container with ID starting with 03d5c1d59c4975c0aee4163caba2f0d6f30c9bc9de265fe17e32dbdc066fb5e5 not found: ID does not exist" containerID="03d5c1d59c4975c0aee4163caba2f0d6f30c9bc9de265fe17e32dbdc066fb5e5"
Apr 23 16:39:32.364543 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.364510 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d5c1d59c4975c0aee4163caba2f0d6f30c9bc9de265fe17e32dbdc066fb5e5"} err="failed to get container status \"03d5c1d59c4975c0aee4163caba2f0d6f30c9bc9de265fe17e32dbdc066fb5e5\": rpc error: code = NotFound desc = could not find container \"03d5c1d59c4975c0aee4163caba2f0d6f30c9bc9de265fe17e32dbdc066fb5e5\": container with ID starting with 03d5c1d59c4975c0aee4163caba2f0d6f30c9bc9de265fe17e32dbdc066fb5e5 not found: ID does not exist"
Apr 23 16:39:32.364543 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.364520 2571 scope.go:117] "RemoveContainer" containerID="a6e1be211ea775d3f84e6dbbeb223a6584767849cfc5ed49c6033d3000a87a23"
Apr 23 16:39:32.364690 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.364674 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6e1be211ea775d3f84e6dbbeb223a6584767849cfc5ed49c6033d3000a87a23"} err="failed to get container status \"a6e1be211ea775d3f84e6dbbeb223a6584767849cfc5ed49c6033d3000a87a23\": rpc error: code = NotFound desc = could not find container \"a6e1be211ea775d3f84e6dbbeb223a6584767849cfc5ed49c6033d3000a87a23\": container with ID starting with a6e1be211ea775d3f84e6dbbeb223a6584767849cfc5ed49c6033d3000a87a23 not found: ID does not exist"
Apr 23 16:39:32.364690 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.364690 2571 scope.go:117] "RemoveContainer" containerID="a38c3f084c1f23c94d19cc94c4e7507e1559b7eaea67aec15c278ab36befadd6"
Apr 23 16:39:32.364863 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.364848 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38c3f084c1f23c94d19cc94c4e7507e1559b7eaea67aec15c278ab36befadd6"} err="failed to get container status \"a38c3f084c1f23c94d19cc94c4e7507e1559b7eaea67aec15c278ab36befadd6\": rpc error: code = NotFound desc = could not find container \"a38c3f084c1f23c94d19cc94c4e7507e1559b7eaea67aec15c278ab36befadd6\": container with ID starting with a38c3f084c1f23c94d19cc94c4e7507e1559b7eaea67aec15c278ab36befadd6 not found: ID does not exist"
Apr 23 16:39:32.364901 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.364862 2571 scope.go:117] "RemoveContainer" containerID="7bc9def75708d8579f71c463b40a34103d9a255aaf6df6e57561643a9f17601e"
Apr 23 16:39:32.365014 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.364999 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bc9def75708d8579f71c463b40a34103d9a255aaf6df6e57561643a9f17601e"} err="failed to get container status \"7bc9def75708d8579f71c463b40a34103d9a255aaf6df6e57561643a9f17601e\": rpc error: code = NotFound desc = could not find container \"7bc9def75708d8579f71c463b40a34103d9a255aaf6df6e57561643a9f17601e\": container with ID starting with 7bc9def75708d8579f71c463b40a34103d9a255aaf6df6e57561643a9f17601e not found: ID does not exist"
Apr 23 16:39:32.365014 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.365013 2571 scope.go:117] "RemoveContainer" containerID="0065ba9517a176f865ba3adfd1e4f482b6552aeee8fda7d6bb753e072c720073"
Apr 23 16:39:32.365166 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.365150 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0065ba9517a176f865ba3adfd1e4f482b6552aeee8fda7d6bb753e072c720073"} err="failed to get container status \"0065ba9517a176f865ba3adfd1e4f482b6552aeee8fda7d6bb753e072c720073\": rpc error: code = NotFound desc = could not find container \"0065ba9517a176f865ba3adfd1e4f482b6552aeee8fda7d6bb753e072c720073\": container with ID starting with 0065ba9517a176f865ba3adfd1e4f482b6552aeee8fda7d6bb753e072c720073 not found: ID does not exist"
Apr 23 16:39:32.365205 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.365166 2571 scope.go:117] "RemoveContainer" containerID="64aaafacfe9a6f5db171caed79b698f43cbdcff79d0a2a704dc1bda89fbde1d1"
Apr 23 16:39:32.365315 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.365301 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64aaafacfe9a6f5db171caed79b698f43cbdcff79d0a2a704dc1bda89fbde1d1"} err="failed to get container status \"64aaafacfe9a6f5db171caed79b698f43cbdcff79d0a2a704dc1bda89fbde1d1\": rpc error: code = NotFound desc = could not find container \"64aaafacfe9a6f5db171caed79b698f43cbdcff79d0a2a704dc1bda89fbde1d1\": container with ID starting with 64aaafacfe9a6f5db171caed79b698f43cbdcff79d0a2a704dc1bda89fbde1d1 not found: ID does not exist"
Apr 23 16:39:32.365353 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.365315 2571 scope.go:117] "RemoveContainer" containerID="c1a0d0499a03046bec9a1d1d35a03cab9338317c6dac7b0e8a3413357c949de0"
Apr 23 16:39:32.365496 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.365477 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a0d0499a03046bec9a1d1d35a03cab9338317c6dac7b0e8a3413357c949de0"} err="failed to get container status \"c1a0d0499a03046bec9a1d1d35a03cab9338317c6dac7b0e8a3413357c949de0\": rpc error: code = NotFound desc = could not find container \"c1a0d0499a03046bec9a1d1d35a03cab9338317c6dac7b0e8a3413357c949de0\": container with ID starting with c1a0d0499a03046bec9a1d1d35a03cab9338317c6dac7b0e8a3413357c949de0 not found: ID does not exist"
Apr 23 16:39:32.365535 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.365499 2571 scope.go:117] "RemoveContainer" containerID="03d5c1d59c4975c0aee4163caba2f0d6f30c9bc9de265fe17e32dbdc066fb5e5"
Apr 23 16:39:32.365664 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.365650 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d5c1d59c4975c0aee4163caba2f0d6f30c9bc9de265fe17e32dbdc066fb5e5"} err="failed to get container status \"03d5c1d59c4975c0aee4163caba2f0d6f30c9bc9de265fe17e32dbdc066fb5e5\": rpc error: code = NotFound desc = could not find container \"03d5c1d59c4975c0aee4163caba2f0d6f30c9bc9de265fe17e32dbdc066fb5e5\": container with ID starting with 03d5c1d59c4975c0aee4163caba2f0d6f30c9bc9de265fe17e32dbdc066fb5e5 not found: ID does not exist"
Apr 23 16:39:32.371425 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371407 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 16:39:32.371709 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371684 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="kube-rbac-proxy-metric"
Apr 23 16:39:32.371753 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371720 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="kube-rbac-proxy-metric"
Apr 23 16:39:32.371753 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371730 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="prom-label-proxy"
Apr 23 16:39:32.371753 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371735 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="prom-label-proxy"
Apr 23 16:39:32.371753 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371742 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="alertmanager"
Apr 23 16:39:32.371753 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371748 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="alertmanager"
Apr 23 16:39:32.371895 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371755 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="config-reloader"
Apr 23 16:39:32.371895 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371760 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="config-reloader"
Apr 23 16:39:32.371895 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371768 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="kube-rbac-proxy"
Apr 23 16:39:32.371895 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371773 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="kube-rbac-proxy"
Apr 23 16:39:32.371895 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371782 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="kube-rbac-proxy-web"
Apr 23 16:39:32.371895 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371791 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="kube-rbac-proxy-web"
Apr 23 16:39:32.371895 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371801 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="init-config-reloader"
Apr 23 16:39:32.371895 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371805 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="init-config-reloader"
Apr 23 16:39:32.371895 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371849 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="config-reloader"
Apr 23 16:39:32.371895 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371859 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="kube-rbac-proxy"
Apr 23 16:39:32.371895 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371870 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="alertmanager"
Apr 23 16:39:32.371895 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371877 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="prom-label-proxy"
Apr 23 16:39:32.371895 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371883 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="kube-rbac-proxy-web"
Apr 23 16:39:32.371895 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.371888 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" containerName="kube-rbac-proxy-metric"
Apr 23 16:39:32.375479 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.375463 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.378499 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.378317 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 23 16:39:32.378499 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.378337 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 23 16:39:32.378499 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.378343 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-4hrhq\""
Apr 23 16:39:32.378499 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.378354 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 23 16:39:32.378499 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.378397 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 23 16:39:32.378499 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.378403 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 23 16:39:32.378499 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.378329 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 23 16:39:32.378499 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.378307 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 23 16:39:32.378903 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.378724 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 23 16:39:32.383794 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.383665 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 23 16:39:32.390835 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.390788 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 16:39:32.438784 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.438757 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.438784 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.438783 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-web-config\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.438933 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.438799 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1b312748-82f4-4085-bc72-4d5fc5694a9b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.438933 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.438848 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.438933 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.438882 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.438933 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.438916 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1b312748-82f4-4085-bc72-4d5fc5694a9b-config-out\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.439079 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.438947 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-config-volume\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.439079 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.438963 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1b312748-82f4-4085-bc72-4d5fc5694a9b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.439079 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.438979 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.439079 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.439002 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.439079 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.439017 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1b312748-82f4-4085-bc72-4d5fc5694a9b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.439247 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.439085 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b312748-82f4-4085-bc72-4d5fc5694a9b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.439247 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.439120 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76nnt\" (UniqueName: \"kubernetes.io/projected/1b312748-82f4-4085-bc72-4d5fc5694a9b-kube-api-access-76nnt\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.539899 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.539871 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.539899 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.539902 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-web-config\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.540140 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.539930 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1b312748-82f4-4085-bc72-4d5fc5694a9b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.540140 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.540073 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.540140 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.540108 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.540289 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.540183 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1b312748-82f4-4085-bc72-4d5fc5694a9b-config-out\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.540289 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.540213 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-config-volume\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.540289 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.540237 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1b312748-82f4-4085-bc72-4d5fc5694a9b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.540289 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.540261 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.540472 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.540296 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.540472 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.540322 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1b312748-82f4-4085-bc72-4d5fc5694a9b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.540472 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.540360 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b312748-82f4-4085-bc72-4d5fc5694a9b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.540472 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.540399 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76nnt\" (UniqueName: \"kubernetes.io/projected/1b312748-82f4-4085-bc72-4d5fc5694a9b-kube-api-access-76nnt\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.540981 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.540953 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1b312748-82f4-4085-bc72-4d5fc5694a9b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.542321 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.541267 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1b312748-82f4-4085-bc72-4d5fc5694a9b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.542321 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.542256 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b312748-82f4-4085-bc72-4d5fc5694a9b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.543113 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.543052 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-web-config\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.543250 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.543224 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1b312748-82f4-4085-bc72-4d5fc5694a9b-config-out\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.543359 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.543316 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.543601 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.543577 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.543757 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.543740 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1b312748-82f4-4085-bc72-4d5fc5694a9b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.544017 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.543998 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-config-volume\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.544197 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.544174 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.544812 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.544794 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.545038 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.545021 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1b312748-82f4-4085-bc72-4d5fc5694a9b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.549445 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.549427 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76nnt\" (UniqueName: \"kubernetes.io/projected/1b312748-82f4-4085-bc72-4d5fc5694a9b-kube-api-access-76nnt\") pod \"alertmanager-main-0\" (UID: \"1b312748-82f4-4085-bc72-4d5fc5694a9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.686245 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.686155 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:39:32.811759 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:32.811730 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 16:39:32.814852 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:39:32.814822 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b312748_82f4_4085_bc72_4d5fc5694a9b.slice/crio-8d483a50ecf263c87cc9ae2cdc7b9d8775de6583a701638e50be600d39e6d3a2 WatchSource:0}: Error finding container 8d483a50ecf263c87cc9ae2cdc7b9d8775de6583a701638e50be600d39e6d3a2: Status 404 returned error can't find the container with id 8d483a50ecf263c87cc9ae2cdc7b9d8775de6583a701638e50be600d39e6d3a2
Apr 23 16:39:33.317984 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:33.317948 2571 generic.go:358] "Generic (PLEG): container finished" podID="1b312748-82f4-4085-bc72-4d5fc5694a9b" containerID="932e23ceadf665ba0e69df8f5cab33385ef60d2c869652e878a2f8c34e7db811" exitCode=0
Apr 23 16:39:33.317984 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:33.317984 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1b312748-82f4-4085-bc72-4d5fc5694a9b","Type":"ContainerDied","Data":"932e23ceadf665ba0e69df8f5cab33385ef60d2c869652e878a2f8c34e7db811"}
Apr 23 16:39:33.318505 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:33.318005 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1b312748-82f4-4085-bc72-4d5fc5694a9b","Type":"ContainerStarted","Data":"8d483a50ecf263c87cc9ae2cdc7b9d8775de6583a701638e50be600d39e6d3a2"}
Apr 23 16:39:33.512327 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:33.512302 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e6a68c5-0ae9-481c-94a5-b786ece7db4e" path="/var/lib/kubelet/pods/1e6a68c5-0ae9-481c-94a5-b786ece7db4e/volumes"
Apr 23 16:39:34.323831 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.323798 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1b312748-82f4-4085-bc72-4d5fc5694a9b","Type":"ContainerStarted","Data":"a35bbe6df1e397c70fab2d121fc96cc910e4e890bbac29e70a3be0293573b6dc"}
Apr 23 16:39:34.323831 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.323834 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1b312748-82f4-4085-bc72-4d5fc5694a9b","Type":"ContainerStarted","Data":"73175a16ff65f06c71cd0816264a2252ce3480c5849d240e19f95a1cf16d52f4"}
Apr 23 16:39:34.324305 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.323845 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1b312748-82f4-4085-bc72-4d5fc5694a9b","Type":"ContainerStarted","Data":"9bfe06ac620db46e241613e39b04ef7ae0783ce0192a5c5baf24cd0b10bfe910"}
Apr 23 16:39:34.324305 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.323853 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1b312748-82f4-4085-bc72-4d5fc5694a9b","Type":"ContainerStarted","Data":"8a52f79549dbd72f3464daaffdea70a9edfa286d032195efc9216045ed13114b"}
Apr 23 16:39:34.324305 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.323861 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1b312748-82f4-4085-bc72-4d5fc5694a9b","Type":"ContainerStarted","Data":"7edabf0331abed842ac0e93d101f349b972e31a41a50141aeff09e3fa2d60d9d"}
Apr 23 16:39:34.324305 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.323869 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0"
event={"ID":"1b312748-82f4-4085-bc72-4d5fc5694a9b","Type":"ContainerStarted","Data":"2766f177da79be7e8b5306f354f3ba72063904b6cf8004bfbb4d2469bce8558e"} Apr 23 16:39:34.355573 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.355519 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.355502063 podStartE2EDuration="2.355502063s" podCreationTimestamp="2026-04-23 16:39:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:39:34.352791246 +0000 UTC m=+257.434233817" watchObservedRunningTime="2026-04-23 16:39:34.355502063 +0000 UTC m=+257.436944633" Apr 23 16:39:34.867494 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.867452 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m"] Apr 23 16:39:34.870794 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.870770 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:34.874989 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.874963 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 23 16:39:34.874989 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.874978 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 23 16:39:34.875182 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.875053 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 23 16:39:34.875182 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.875103 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 23 16:39:34.875598 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.875464 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-rzhlc\"" Apr 23 16:39:34.875598 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.875483 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 23 16:39:34.882763 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.882742 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 23 16:39:34.888377 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.888350 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m"] Apr 23 16:39:34.959492 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.959453 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f57aef30-fe57-482d-ad55-41be2b64efec-serving-certs-ca-bundle\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:34.959668 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.959504 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f57aef30-fe57-482d-ad55-41be2b64efec-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:34.959668 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.959577 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f57aef30-fe57-482d-ad55-41be2b64efec-secret-telemeter-client\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:34.959668 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.959602 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59rs5\" (UniqueName: \"kubernetes.io/projected/f57aef30-fe57-482d-ad55-41be2b64efec-kube-api-access-59rs5\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:34.959668 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.959658 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/f57aef30-fe57-482d-ad55-41be2b64efec-federate-client-tls\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:34.959876 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.959731 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f57aef30-fe57-482d-ad55-41be2b64efec-telemeter-client-tls\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:34.959876 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.959782 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f57aef30-fe57-482d-ad55-41be2b64efec-metrics-client-ca\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:34.959876 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:34.959840 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f57aef30-fe57-482d-ad55-41be2b64efec-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:35.060763 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:35.060720 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f57aef30-fe57-482d-ad55-41be2b64efec-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" 
(UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:35.060936 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:35.060777 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f57aef30-fe57-482d-ad55-41be2b64efec-secret-telemeter-client\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:35.060936 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:35.060805 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59rs5\" (UniqueName: \"kubernetes.io/projected/f57aef30-fe57-482d-ad55-41be2b64efec-kube-api-access-59rs5\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:35.060936 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:35.060855 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f57aef30-fe57-482d-ad55-41be2b64efec-federate-client-tls\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:35.060936 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:35.060882 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f57aef30-fe57-482d-ad55-41be2b64efec-telemeter-client-tls\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:35.060936 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:35.060925 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f57aef30-fe57-482d-ad55-41be2b64efec-metrics-client-ca\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:35.061197 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:35.060990 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f57aef30-fe57-482d-ad55-41be2b64efec-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:35.061197 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:35.061016 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f57aef30-fe57-482d-ad55-41be2b64efec-serving-certs-ca-bundle\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:35.061771 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:35.061744 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f57aef30-fe57-482d-ad55-41be2b64efec-serving-certs-ca-bundle\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:35.062129 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:35.062080 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f57aef30-fe57-482d-ad55-41be2b64efec-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" 
(UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:35.062221 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:35.062137 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f57aef30-fe57-482d-ad55-41be2b64efec-metrics-client-ca\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:35.063824 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:35.063796 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f57aef30-fe57-482d-ad55-41be2b64efec-telemeter-client-tls\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:35.063930 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:35.063899 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f57aef30-fe57-482d-ad55-41be2b64efec-secret-telemeter-client\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:35.063995 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:35.063927 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f57aef30-fe57-482d-ad55-41be2b64efec-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:35.064110 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:35.064092 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f57aef30-fe57-482d-ad55-41be2b64efec-federate-client-tls\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:35.070686 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:35.070374 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-59rs5\" (UniqueName: \"kubernetes.io/projected/f57aef30-fe57-482d-ad55-41be2b64efec-kube-api-access-59rs5\") pod \"telemeter-client-6b6f8cdb5f-ng72m\" (UID: \"f57aef30-fe57-482d-ad55-41be2b64efec\") " pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:35.182047 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:35.181958 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" Apr 23 16:39:35.324574 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:35.324547 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m"] Apr 23 16:39:35.329001 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:39:35.328962 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf57aef30_fe57_482d_ad55_41be2b64efec.slice/crio-cc331fb68a85d4f79265baf9317789fa840437de994151dea4d4428d3ef6313f WatchSource:0}: Error finding container cc331fb68a85d4f79265baf9317789fa840437de994151dea4d4428d3ef6313f: Status 404 returned error can't find the container with id cc331fb68a85d4f79265baf9317789fa840437de994151dea4d4428d3ef6313f Apr 23 16:39:36.335457 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:36.335377 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" 
event={"ID":"f57aef30-fe57-482d-ad55-41be2b64efec","Type":"ContainerStarted","Data":"cc331fb68a85d4f79265baf9317789fa840437de994151dea4d4428d3ef6313f"} Apr 23 16:39:37.340340 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:37.340264 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" event={"ID":"f57aef30-fe57-482d-ad55-41be2b64efec","Type":"ContainerStarted","Data":"13444b2f3ab997d6d9a5d9bfadd84628927066fcb217e1ffe772e4d0e8c999d5"} Apr 23 16:39:37.340340 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:37.340303 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" event={"ID":"f57aef30-fe57-482d-ad55-41be2b64efec","Type":"ContainerStarted","Data":"e412a7a08945e99620124ea52284f2144396143e048fc0cf93a6bf95c16b6c38"} Apr 23 16:39:37.340340 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:37.340316 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" event={"ID":"f57aef30-fe57-482d-ad55-41be2b64efec","Type":"ContainerStarted","Data":"daf1808329b1beba04973c0613f2774bb7bf19a5205fdc6caa15356d616e0139"} Apr 23 16:39:37.380317 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:39:37.380271 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6b6f8cdb5f-ng72m" podStartSLOduration=1.661671495 podStartE2EDuration="3.380257135s" podCreationTimestamp="2026-04-23 16:39:34 +0000 UTC" firstStartedPulling="2026-04-23 16:39:35.331141644 +0000 UTC m=+258.412584190" lastFinishedPulling="2026-04-23 16:39:37.049727279 +0000 UTC m=+260.131169830" observedRunningTime="2026-04-23 16:39:37.377587609 +0000 UTC m=+260.459030178" watchObservedRunningTime="2026-04-23 16:39:37.380257135 +0000 UTC m=+260.461699703" Apr 23 16:40:17.387754 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:40:17.387727 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8sjkd_036b63b0-d570-44cc-b606-bb46f38e6753/console-operator/1.log" Apr 23 16:40:17.388175 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:40:17.387736 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8sjkd_036b63b0-d570-44cc-b606-bb46f38e6753/console-operator/1.log" Apr 23 16:40:17.402966 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:40:17.402945 2571 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 16:41:28.031532 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:41:28.031414 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gnw5g"] Apr 23 16:41:28.034770 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:41:28.034750 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gnw5g" Apr 23 16:41:28.038393 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:41:28.038373 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 16:41:28.056939 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:41:28.056919 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gnw5g"] Apr 23 16:41:28.124296 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:41:28.124266 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/47cef5f9-f168-4bee-ad96-bcf87f6d22e1-kubelet-config\") pod \"global-pull-secret-syncer-gnw5g\" (UID: \"47cef5f9-f168-4bee-ad96-bcf87f6d22e1\") " pod="kube-system/global-pull-secret-syncer-gnw5g" Apr 23 16:41:28.124296 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:41:28.124298 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/47cef5f9-f168-4bee-ad96-bcf87f6d22e1-dbus\") pod \"global-pull-secret-syncer-gnw5g\" (UID: \"47cef5f9-f168-4bee-ad96-bcf87f6d22e1\") " pod="kube-system/global-pull-secret-syncer-gnw5g" Apr 23 16:41:28.124501 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:41:28.124320 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/47cef5f9-f168-4bee-ad96-bcf87f6d22e1-original-pull-secret\") pod \"global-pull-secret-syncer-gnw5g\" (UID: \"47cef5f9-f168-4bee-ad96-bcf87f6d22e1\") " pod="kube-system/global-pull-secret-syncer-gnw5g" Apr 23 16:41:28.224751 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:41:28.224718 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/47cef5f9-f168-4bee-ad96-bcf87f6d22e1-kubelet-config\") pod \"global-pull-secret-syncer-gnw5g\" (UID: \"47cef5f9-f168-4bee-ad96-bcf87f6d22e1\") " pod="kube-system/global-pull-secret-syncer-gnw5g" Apr 23 16:41:28.224751 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:41:28.224750 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/47cef5f9-f168-4bee-ad96-bcf87f6d22e1-dbus\") pod \"global-pull-secret-syncer-gnw5g\" (UID: \"47cef5f9-f168-4bee-ad96-bcf87f6d22e1\") " pod="kube-system/global-pull-secret-syncer-gnw5g" Apr 23 16:41:28.224933 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:41:28.224772 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/47cef5f9-f168-4bee-ad96-bcf87f6d22e1-original-pull-secret\") pod \"global-pull-secret-syncer-gnw5g\" (UID: \"47cef5f9-f168-4bee-ad96-bcf87f6d22e1\") " pod="kube-system/global-pull-secret-syncer-gnw5g" Apr 23 16:41:28.224933 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:41:28.224849 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/47cef5f9-f168-4bee-ad96-bcf87f6d22e1-kubelet-config\") pod \"global-pull-secret-syncer-gnw5g\" (UID: \"47cef5f9-f168-4bee-ad96-bcf87f6d22e1\") " pod="kube-system/global-pull-secret-syncer-gnw5g" Apr 23 16:41:28.224933 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:41:28.224918 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/47cef5f9-f168-4bee-ad96-bcf87f6d22e1-dbus\") pod \"global-pull-secret-syncer-gnw5g\" (UID: \"47cef5f9-f168-4bee-ad96-bcf87f6d22e1\") " pod="kube-system/global-pull-secret-syncer-gnw5g" Apr 23 16:41:28.227245 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:41:28.227228 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/47cef5f9-f168-4bee-ad96-bcf87f6d22e1-original-pull-secret\") pod \"global-pull-secret-syncer-gnw5g\" (UID: \"47cef5f9-f168-4bee-ad96-bcf87f6d22e1\") " pod="kube-system/global-pull-secret-syncer-gnw5g" Apr 23 16:41:28.344330 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:41:28.344253 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gnw5g" Apr 23 16:41:28.467533 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:41:28.467505 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gnw5g"] Apr 23 16:41:28.469552 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:41:28.469524 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47cef5f9_f168_4bee_ad96_bcf87f6d22e1.slice/crio-d598fb0ca1a1843b326c119112bdabb40fa3bf6ecefbf12e844c99e6d847c181 WatchSource:0}: Error finding container d598fb0ca1a1843b326c119112bdabb40fa3bf6ecefbf12e844c99e6d847c181: Status 404 returned error can't find the container with id d598fb0ca1a1843b326c119112bdabb40fa3bf6ecefbf12e844c99e6d847c181 Apr 23 16:41:28.471160 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:41:28.471146 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:41:28.658683 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:41:28.658594 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gnw5g" event={"ID":"47cef5f9-f168-4bee-ad96-bcf87f6d22e1","Type":"ContainerStarted","Data":"d598fb0ca1a1843b326c119112bdabb40fa3bf6ecefbf12e844c99e6d847c181"} Apr 23 16:41:33.676211 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:41:33.676176 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gnw5g" event={"ID":"47cef5f9-f168-4bee-ad96-bcf87f6d22e1","Type":"ContainerStarted","Data":"ce045601979260581ea3c16908d32965e75876b411d51346881f7e662c66d69d"} Apr 23 16:41:33.692765 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:41:33.692712 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gnw5g" podStartSLOduration=1.4089754970000001 podStartE2EDuration="5.692679153s" podCreationTimestamp="2026-04-23 16:41:28 
+0000 UTC" firstStartedPulling="2026-04-23 16:41:28.471271857 +0000 UTC m=+371.552714405" lastFinishedPulling="2026-04-23 16:41:32.754975511 +0000 UTC m=+375.836418061" observedRunningTime="2026-04-23 16:41:33.691381082 +0000 UTC m=+376.772823652" watchObservedRunningTime="2026-04-23 16:41:33.692679153 +0000 UTC m=+376.774121751" Apr 23 16:43:31.791188 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:31.791156 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8g9d8"] Apr 23 16:43:31.794344 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:31.794325 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8g9d8" Apr 23 16:43:31.800427 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:31.800365 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 23 16:43:31.800973 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:31.800770 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 23 16:43:31.800973 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:31.800957 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-bp6xk\"" Apr 23 16:43:31.812975 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:31.812955 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8g9d8"] Apr 23 16:43:31.884787 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:31.884758 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f728c3a9-a2ad-4c2b-86f0-54a15514a8e9-tmp\") pod 
\"cert-manager-operator-controller-manager-54b9655956-8g9d8\" (UID: \"f728c3a9-a2ad-4c2b-86f0-54a15514a8e9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8g9d8"
Apr 23 16:43:31.884912 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:31.884823 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njxsq\" (UniqueName: \"kubernetes.io/projected/f728c3a9-a2ad-4c2b-86f0-54a15514a8e9-kube-api-access-njxsq\") pod \"cert-manager-operator-controller-manager-54b9655956-8g9d8\" (UID: \"f728c3a9-a2ad-4c2b-86f0-54a15514a8e9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8g9d8"
Apr 23 16:43:31.985959 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:31.985933 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-njxsq\" (UniqueName: \"kubernetes.io/projected/f728c3a9-a2ad-4c2b-86f0-54a15514a8e9-kube-api-access-njxsq\") pod \"cert-manager-operator-controller-manager-54b9655956-8g9d8\" (UID: \"f728c3a9-a2ad-4c2b-86f0-54a15514a8e9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8g9d8"
Apr 23 16:43:31.986053 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:31.985987 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f728c3a9-a2ad-4c2b-86f0-54a15514a8e9-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-8g9d8\" (UID: \"f728c3a9-a2ad-4c2b-86f0-54a15514a8e9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8g9d8"
Apr 23 16:43:31.986339 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:31.986324 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f728c3a9-a2ad-4c2b-86f0-54a15514a8e9-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-8g9d8\" (UID: \"f728c3a9-a2ad-4c2b-86f0-54a15514a8e9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8g9d8"
Apr 23 16:43:31.997036 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:31.997017 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-njxsq\" (UniqueName: \"kubernetes.io/projected/f728c3a9-a2ad-4c2b-86f0-54a15514a8e9-kube-api-access-njxsq\") pod \"cert-manager-operator-controller-manager-54b9655956-8g9d8\" (UID: \"f728c3a9-a2ad-4c2b-86f0-54a15514a8e9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8g9d8"
Apr 23 16:43:32.103424 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:32.103361 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8g9d8"
Apr 23 16:43:32.225656 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:32.225630 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8g9d8"]
Apr 23 16:43:32.228343 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:43:32.228316 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf728c3a9_a2ad_4c2b_86f0_54a15514a8e9.slice/crio-69fff05bd7d25e7711c52cafe69c866f68b70dacd6e1ac21b2a7811f3a880108 WatchSource:0}: Error finding container 69fff05bd7d25e7711c52cafe69c866f68b70dacd6e1ac21b2a7811f3a880108: Status 404 returned error can't find the container with id 69fff05bd7d25e7711c52cafe69c866f68b70dacd6e1ac21b2a7811f3a880108
Apr 23 16:43:33.028916 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:33.028871 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8g9d8" event={"ID":"f728c3a9-a2ad-4c2b-86f0-54a15514a8e9","Type":"ContainerStarted","Data":"69fff05bd7d25e7711c52cafe69c866f68b70dacd6e1ac21b2a7811f3a880108"}
Apr 23 16:43:35.037756 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:35.037712 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8g9d8" event={"ID":"f728c3a9-a2ad-4c2b-86f0-54a15514a8e9","Type":"ContainerStarted","Data":"1916465784d6b2ba81a4179f3e8e168fc5877a12f4ae57b0328122688ed862f0"}
Apr 23 16:43:35.060486 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:35.060438 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8g9d8" podStartSLOduration=1.473885667 podStartE2EDuration="4.060424534s" podCreationTimestamp="2026-04-23 16:43:31 +0000 UTC" firstStartedPulling="2026-04-23 16:43:32.231035591 +0000 UTC m=+495.312478140" lastFinishedPulling="2026-04-23 16:43:34.817574453 +0000 UTC m=+497.899017007" observedRunningTime="2026-04-23 16:43:35.057990182 +0000 UTC m=+498.139432786" watchObservedRunningTime="2026-04-23 16:43:35.060424534 +0000 UTC m=+498.141867102"
Apr 23 16:43:37.224274 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:37.224234 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-9nx9s"]
Apr 23 16:43:37.227908 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:37.227887 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-9nx9s"
Apr 23 16:43:37.232242 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:37.232220 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-gcvdn\""
Apr 23 16:43:37.233421 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:37.233247 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 23 16:43:37.233421 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:37.233338 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 23 16:43:37.242107 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:37.242076 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-9nx9s"]
Apr 23 16:43:37.334894 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:37.334863 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lfqp\" (UniqueName: \"kubernetes.io/projected/06933d24-a23f-43d1-81d4-073a0945a3c5-kube-api-access-8lfqp\") pod \"cert-manager-webhook-587ccfb98-9nx9s\" (UID: \"06933d24-a23f-43d1-81d4-073a0945a3c5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-9nx9s"
Apr 23 16:43:37.335026 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:37.334921 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/06933d24-a23f-43d1-81d4-073a0945a3c5-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-9nx9s\" (UID: \"06933d24-a23f-43d1-81d4-073a0945a3c5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-9nx9s"
Apr 23 16:43:37.436074 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:37.436044 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8lfqp\" (UniqueName: \"kubernetes.io/projected/06933d24-a23f-43d1-81d4-073a0945a3c5-kube-api-access-8lfqp\") pod \"cert-manager-webhook-587ccfb98-9nx9s\" (UID: \"06933d24-a23f-43d1-81d4-073a0945a3c5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-9nx9s"
Apr 23 16:43:37.436234 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:37.436101 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/06933d24-a23f-43d1-81d4-073a0945a3c5-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-9nx9s\" (UID: \"06933d24-a23f-43d1-81d4-073a0945a3c5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-9nx9s"
Apr 23 16:43:37.446233 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:37.446207 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lfqp\" (UniqueName: \"kubernetes.io/projected/06933d24-a23f-43d1-81d4-073a0945a3c5-kube-api-access-8lfqp\") pod \"cert-manager-webhook-587ccfb98-9nx9s\" (UID: \"06933d24-a23f-43d1-81d4-073a0945a3c5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-9nx9s"
Apr 23 16:43:37.446912 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:37.446894 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/06933d24-a23f-43d1-81d4-073a0945a3c5-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-9nx9s\" (UID: \"06933d24-a23f-43d1-81d4-073a0945a3c5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-9nx9s"
Apr 23 16:43:37.551424 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:37.551343 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-9nx9s"
Apr 23 16:43:37.671423 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:37.671397 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-9nx9s"]
Apr 23 16:43:37.673912 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:43:37.673879 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06933d24_a23f_43d1_81d4_073a0945a3c5.slice/crio-f4552e64ac1c7225b7fd69acca8c5ac262e4c6aa04a1c3b6a7d438482a84323b WatchSource:0}: Error finding container f4552e64ac1c7225b7fd69acca8c5ac262e4c6aa04a1c3b6a7d438482a84323b: Status 404 returned error can't find the container with id f4552e64ac1c7225b7fd69acca8c5ac262e4c6aa04a1c3b6a7d438482a84323b
Apr 23 16:43:38.050223 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:38.050128 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-9nx9s" event={"ID":"06933d24-a23f-43d1-81d4-073a0945a3c5","Type":"ContainerStarted","Data":"f4552e64ac1c7225b7fd69acca8c5ac262e4c6aa04a1c3b6a7d438482a84323b"}
Apr 23 16:43:41.061671 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:41.061637 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-9nx9s" event={"ID":"06933d24-a23f-43d1-81d4-073a0945a3c5","Type":"ContainerStarted","Data":"574cdce04abc6851a5be39461b15079f42ffc81f9146cd99857646244e8b8185"}
Apr 23 16:43:41.062042 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:41.061704 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-9nx9s"
Apr 23 16:43:41.082918 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:41.082866 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-9nx9s" podStartSLOduration=1.253749975 podStartE2EDuration="4.082852579s" podCreationTimestamp="2026-04-23 16:43:37 +0000 UTC" firstStartedPulling="2026-04-23 16:43:37.675751702 +0000 UTC m=+500.757194253" lastFinishedPulling="2026-04-23 16:43:40.504854296 +0000 UTC m=+503.586296857" observedRunningTime="2026-04-23 16:43:41.080374274 +0000 UTC m=+504.161816843" watchObservedRunningTime="2026-04-23 16:43:41.082852579 +0000 UTC m=+504.164295148"
Apr 23 16:43:43.261087 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:43.261051 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-2lrrd"]
Apr 23 16:43:43.264435 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:43.264419 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-2lrrd"
Apr 23 16:43:43.267254 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:43.267230 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-nq6jp\""
Apr 23 16:43:43.271602 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:43.271484 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-2lrrd"]
Apr 23 16:43:43.390388 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:43.390349 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/863cce29-3066-438b-a96e-eebc3b80faa2-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-2lrrd\" (UID: \"863cce29-3066-438b-a96e-eebc3b80faa2\") " pod="cert-manager/cert-manager-cainjector-68b757865b-2lrrd"
Apr 23 16:43:43.390555 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:43.390416 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mpjf\" (UniqueName: \"kubernetes.io/projected/863cce29-3066-438b-a96e-eebc3b80faa2-kube-api-access-2mpjf\") pod \"cert-manager-cainjector-68b757865b-2lrrd\" (UID: \"863cce29-3066-438b-a96e-eebc3b80faa2\") " pod="cert-manager/cert-manager-cainjector-68b757865b-2lrrd"
Apr 23 16:43:43.491369 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:43.491323 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/863cce29-3066-438b-a96e-eebc3b80faa2-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-2lrrd\" (UID: \"863cce29-3066-438b-a96e-eebc3b80faa2\") " pod="cert-manager/cert-manager-cainjector-68b757865b-2lrrd"
Apr 23 16:43:43.491519 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:43.491400 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mpjf\" (UniqueName: \"kubernetes.io/projected/863cce29-3066-438b-a96e-eebc3b80faa2-kube-api-access-2mpjf\") pod \"cert-manager-cainjector-68b757865b-2lrrd\" (UID: \"863cce29-3066-438b-a96e-eebc3b80faa2\") " pod="cert-manager/cert-manager-cainjector-68b757865b-2lrrd"
Apr 23 16:43:43.501942 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:43.501915 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/863cce29-3066-438b-a96e-eebc3b80faa2-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-2lrrd\" (UID: \"863cce29-3066-438b-a96e-eebc3b80faa2\") " pod="cert-manager/cert-manager-cainjector-68b757865b-2lrrd"
Apr 23 16:43:43.502065 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:43.502014 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mpjf\" (UniqueName: \"kubernetes.io/projected/863cce29-3066-438b-a96e-eebc3b80faa2-kube-api-access-2mpjf\") pod \"cert-manager-cainjector-68b757865b-2lrrd\" (UID: \"863cce29-3066-438b-a96e-eebc3b80faa2\") " pod="cert-manager/cert-manager-cainjector-68b757865b-2lrrd"
Apr 23 16:43:43.574463 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:43.574382 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-2lrrd"
Apr 23 16:43:43.696626 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:43.696601 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-2lrrd"]
Apr 23 16:43:43.698626 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:43:43.698584 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod863cce29_3066_438b_a96e_eebc3b80faa2.slice/crio-057bd81d14944b815654c8a99d60e18c7f7f959e6033c7032754ad9ce1d80c62 WatchSource:0}: Error finding container 057bd81d14944b815654c8a99d60e18c7f7f959e6033c7032754ad9ce1d80c62: Status 404 returned error can't find the container with id 057bd81d14944b815654c8a99d60e18c7f7f959e6033c7032754ad9ce1d80c62
Apr 23 16:43:44.075122 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:44.075088 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-2lrrd" event={"ID":"863cce29-3066-438b-a96e-eebc3b80faa2","Type":"ContainerStarted","Data":"790092ef0916dbd4276f82ef71b2fc9da525c437f8b804e98582728109ee18de"}
Apr 23 16:43:44.075122 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:44.075124 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-2lrrd" event={"ID":"863cce29-3066-438b-a96e-eebc3b80faa2","Type":"ContainerStarted","Data":"057bd81d14944b815654c8a99d60e18c7f7f959e6033c7032754ad9ce1d80c62"}
Apr 23 16:43:44.094067 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:44.094013 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-2lrrd" podStartSLOduration=1.093996793 podStartE2EDuration="1.093996793s" podCreationTimestamp="2026-04-23 16:43:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:43:44.092071001 +0000 UTC m=+507.173513568" watchObservedRunningTime="2026-04-23 16:43:44.093996793 +0000 UTC m=+507.175439361"
Apr 23 16:43:47.068399 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:43:47.068369 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-9nx9s"
Apr 23 16:44:10.086414 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.086304 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf"]
Apr 23 16:44:10.094219 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.094186 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf"
Apr 23 16:44:10.098674 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.098440 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 23 16:44:10.098674 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.098476 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 23 16:44:10.098674 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.098554 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:44:10.098951 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.098794 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 23 16:44:10.099024 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.099001 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 23 16:44:10.099222 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.099143 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-zl9f4\""
Apr 23 16:44:10.101831 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.101806 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf"]
Apr 23 16:44:10.221747 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.221687 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/3b9f39e9-424e-413a-be6b-cbccba14148c-manager-config\") pod \"lws-controller-manager-b454c4fb-vqlsf\" (UID: \"3b9f39e9-424e-413a-be6b-cbccba14148c\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf"
Apr 23 16:44:10.221925 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.221760 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm4n4\" (UniqueName: \"kubernetes.io/projected/3b9f39e9-424e-413a-be6b-cbccba14148c-kube-api-access-pm4n4\") pod \"lws-controller-manager-b454c4fb-vqlsf\" (UID: \"3b9f39e9-424e-413a-be6b-cbccba14148c\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf"
Apr 23 16:44:10.221925 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.221829 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b9f39e9-424e-413a-be6b-cbccba14148c-metrics-cert\") pod \"lws-controller-manager-b454c4fb-vqlsf\" (UID: \"3b9f39e9-424e-413a-be6b-cbccba14148c\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf"
Apr 23 16:44:10.221925 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.221888 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b9f39e9-424e-413a-be6b-cbccba14148c-cert\") pod \"lws-controller-manager-b454c4fb-vqlsf\" (UID: \"3b9f39e9-424e-413a-be6b-cbccba14148c\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf"
Apr 23 16:44:10.323129 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.323083 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b9f39e9-424e-413a-be6b-cbccba14148c-metrics-cert\") pod \"lws-controller-manager-b454c4fb-vqlsf\" (UID: \"3b9f39e9-424e-413a-be6b-cbccba14148c\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf"
Apr 23 16:44:10.323129 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.323136 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b9f39e9-424e-413a-be6b-cbccba14148c-cert\") pod \"lws-controller-manager-b454c4fb-vqlsf\" (UID: \"3b9f39e9-424e-413a-be6b-cbccba14148c\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf"
Apr 23 16:44:10.323346 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.323177 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/3b9f39e9-424e-413a-be6b-cbccba14148c-manager-config\") pod \"lws-controller-manager-b454c4fb-vqlsf\" (UID: \"3b9f39e9-424e-413a-be6b-cbccba14148c\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf"
Apr 23 16:44:10.323346 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.323194 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pm4n4\" (UniqueName: \"kubernetes.io/projected/3b9f39e9-424e-413a-be6b-cbccba14148c-kube-api-access-pm4n4\") pod \"lws-controller-manager-b454c4fb-vqlsf\" (UID: \"3b9f39e9-424e-413a-be6b-cbccba14148c\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf"
Apr 23 16:44:10.323956 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.323931 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/3b9f39e9-424e-413a-be6b-cbccba14148c-manager-config\") pod \"lws-controller-manager-b454c4fb-vqlsf\" (UID: \"3b9f39e9-424e-413a-be6b-cbccba14148c\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf"
Apr 23 16:44:10.325887 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.325868 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b9f39e9-424e-413a-be6b-cbccba14148c-cert\") pod \"lws-controller-manager-b454c4fb-vqlsf\" (UID: \"3b9f39e9-424e-413a-be6b-cbccba14148c\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf"
Apr 23 16:44:10.326014 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.325994 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b9f39e9-424e-413a-be6b-cbccba14148c-metrics-cert\") pod \"lws-controller-manager-b454c4fb-vqlsf\" (UID: \"3b9f39e9-424e-413a-be6b-cbccba14148c\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf"
Apr 23 16:44:10.336072 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.336051 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm4n4\" (UniqueName: \"kubernetes.io/projected/3b9f39e9-424e-413a-be6b-cbccba14148c-kube-api-access-pm4n4\") pod \"lws-controller-manager-b454c4fb-vqlsf\" (UID: \"3b9f39e9-424e-413a-be6b-cbccba14148c\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf"
Apr 23 16:44:10.406853 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.406770 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf"
Apr 23 16:44:10.547061 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:10.547036 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf"]
Apr 23 16:44:10.549262 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:44:10.549234 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b9f39e9_424e_413a_be6b_cbccba14148c.slice/crio-d5e40f393d4f32e91f8e2974b47684dd6ab6fd909664c969e3b51581879ee091 WatchSource:0}: Error finding container d5e40f393d4f32e91f8e2974b47684dd6ab6fd909664c969e3b51581879ee091: Status 404 returned error can't find the container with id d5e40f393d4f32e91f8e2974b47684dd6ab6fd909664c969e3b51581879ee091
Apr 23 16:44:11.164469 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:11.164431 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf" event={"ID":"3b9f39e9-424e-413a-be6b-cbccba14148c","Type":"ContainerStarted","Data":"d5e40f393d4f32e91f8e2974b47684dd6ab6fd909664c969e3b51581879ee091"}
Apr 23 16:44:14.176248 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:14.176210 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf" event={"ID":"3b9f39e9-424e-413a-be6b-cbccba14148c","Type":"ContainerStarted","Data":"986c1bb827c10f6a8224ffb09aad1efc3655fc24294209340306ce57518247fa"}
Apr 23 16:44:14.176720 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:14.176333 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf"
Apr 23 16:44:14.195526 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:14.195482 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf" podStartSLOduration=1.595623785 podStartE2EDuration="4.195466593s" podCreationTimestamp="2026-04-23 16:44:10 +0000 UTC" firstStartedPulling="2026-04-23 16:44:10.55110207 +0000 UTC m=+533.632544618" lastFinishedPulling="2026-04-23 16:44:13.150944879 +0000 UTC m=+536.232387426" observedRunningTime="2026-04-23 16:44:14.194272287 +0000 UTC m=+537.275714857" watchObservedRunningTime="2026-04-23 16:44:14.195466593 +0000 UTC m=+537.276909163"
Apr 23 16:44:25.181916 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:25.181878 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-b454c4fb-vqlsf"
Apr 23 16:44:55.831505 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:55.831460 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"]
Apr 23 16:44:55.837083 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:55.837060 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:55.840098 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:55.840072 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 23 16:44:55.840240 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:55.840176 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-bljkv\""
Apr 23 16:44:55.847484 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:55.847461 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"]
Apr 23 16:44:55.932571 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:55.932540 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/59b49f8a-980e-4e40-a688-5df2c05a7ba3-istio-token\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:55.932571 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:55.932571 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/59b49f8a-980e-4e40-a688-5df2c05a7ba3-workload-certs\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:55.932809 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:55.932594 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/59b49f8a-980e-4e40-a688-5df2c05a7ba3-workload-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:55.932809 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:55.932612 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/59b49f8a-980e-4e40-a688-5df2c05a7ba3-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:55.932809 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:55.932741 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/59b49f8a-980e-4e40-a688-5df2c05a7ba3-istio-envoy\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:55.932809 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:55.932769 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/59b49f8a-980e-4e40-a688-5df2c05a7ba3-istio-data\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:55.932809 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:55.932801 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/59b49f8a-980e-4e40-a688-5df2c05a7ba3-credential-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:55.932975 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:55.932818 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/59b49f8a-980e-4e40-a688-5df2c05a7ba3-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:55.932975 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:55.932832 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmhr8\" (UniqueName: \"kubernetes.io/projected/59b49f8a-980e-4e40-a688-5df2c05a7ba3-kube-api-access-jmhr8\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:56.033942 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.033910 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/59b49f8a-980e-4e40-a688-5df2c05a7ba3-istio-token\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:56.034078 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.033944 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/59b49f8a-980e-4e40-a688-5df2c05a7ba3-workload-certs\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:56.034078 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.033968 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/59b49f8a-980e-4e40-a688-5df2c05a7ba3-workload-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:56.034078 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.033993 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/59b49f8a-980e-4e40-a688-5df2c05a7ba3-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:56.034078 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.034041 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/59b49f8a-980e-4e40-a688-5df2c05a7ba3-istio-envoy\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:56.034078 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.034069 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/59b49f8a-980e-4e40-a688-5df2c05a7ba3-istio-data\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:56.034336 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.034105 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/59b49f8a-980e-4e40-a688-5df2c05a7ba3-credential-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:56.034336 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.034146 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/59b49f8a-980e-4e40-a688-5df2c05a7ba3-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:56.034336 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.034173 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmhr8\" (UniqueName: \"kubernetes.io/projected/59b49f8a-980e-4e40-a688-5df2c05a7ba3-kube-api-access-jmhr8\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:56.034482 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.034339 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/59b49f8a-980e-4e40-a688-5df2c05a7ba3-workload-certs\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:56.034538 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.034492 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/59b49f8a-980e-4e40-a688-5df2c05a7ba3-istio-data\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:56.034619 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.034596 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/59b49f8a-980e-4e40-a688-5df2c05a7ba3-credential-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:56.034664 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.034613 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/59b49f8a-980e-4e40-a688-5df2c05a7ba3-workload-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:56.035073 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.035053 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/59b49f8a-980e-4e40-a688-5df2c05a7ba3-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:56.036513 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.036493 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/59b49f8a-980e-4e40-a688-5df2c05a7ba3-istio-envoy\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:56.036637 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.036619 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/59b49f8a-980e-4e40-a688-5df2c05a7ba3-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:56.042063 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.042033 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/59b49f8a-980e-4e40-a688-5df2c05a7ba3-istio-token\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:56.042285 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.042267 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmhr8\" (UniqueName: \"kubernetes.io/projected/59b49f8a-980e-4e40-a688-5df2c05a7ba3-kube-api-access-jmhr8\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d\" (UID: \"59b49f8a-980e-4e40-a688-5df2c05a7ba3\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"
Apr 23 16:44:56.150171 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.150080 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d" Apr 23 16:44:56.273419 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.273388 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d"] Apr 23 16:44:56.276539 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:44:56.276510 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59b49f8a_980e_4e40_a688_5df2c05a7ba3.slice/crio-94116ce4b17f5bcb88aee722f6b5ffde7cb13cd75b9178e3e3dff72dce4080ce WatchSource:0}: Error finding container 94116ce4b17f5bcb88aee722f6b5ffde7cb13cd75b9178e3e3dff72dce4080ce: Status 404 returned error can't find the container with id 94116ce4b17f5bcb88aee722f6b5ffde7cb13cd75b9178e3e3dff72dce4080ce Apr 23 16:44:56.314721 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:56.314664 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d" event={"ID":"59b49f8a-980e-4e40-a688-5df2c05a7ba3","Type":"ContainerStarted","Data":"94116ce4b17f5bcb88aee722f6b5ffde7cb13cd75b9178e3e3dff72dce4080ce"} Apr 23 16:44:59.263012 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:59.262874 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 23 16:44:59.263282 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:59.263066 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 23 16:44:59.263282 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:59.263106 2571 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 23 16:44:59.326093 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:59.326062 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d" event={"ID":"59b49f8a-980e-4e40-a688-5df2c05a7ba3","Type":"ContainerStarted","Data":"99a30f0a99703ccb78fb6a9b6b7c9e310c4065f7105524732889baa9f1875fea"} Apr 23 16:44:59.353300 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:44:59.353235 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d" podStartSLOduration=1.36904588 podStartE2EDuration="4.353215673s" podCreationTimestamp="2026-04-23 16:44:55 +0000 UTC" firstStartedPulling="2026-04-23 16:44:56.278405235 +0000 UTC m=+579.359847782" lastFinishedPulling="2026-04-23 16:44:59.262575016 +0000 UTC m=+582.344017575" observedRunningTime="2026-04-23 16:44:59.350363739 +0000 UTC m=+582.431806307" watchObservedRunningTime="2026-04-23 16:44:59.353215673 +0000 UTC m=+582.434658243" Apr 23 16:45:00.150339 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:00.150247 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d" Apr 23 16:45:00.154837 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:00.154813 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d" Apr 23 16:45:00.329245 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:00.329218 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d" Apr 23 16:45:00.330148 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:00.330129 2571 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d" Apr 23 16:45:17.418619 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:17.418594 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8sjkd_036b63b0-d570-44cc-b606-bb46f38e6753/console-operator/1.log" Apr 23 16:45:17.419053 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:17.418786 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8sjkd_036b63b0-d570-44cc-b606-bb46f38e6753/console-operator/1.log" Apr 23 16:45:22.716516 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:22.716480 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-4x4v6"] Apr 23 16:45:22.724336 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:22.724313 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-4x4v6" Apr 23 16:45:22.727081 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:22.727014 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-6l8gb\"" Apr 23 16:45:22.727183 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:22.727112 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 23 16:45:22.727270 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:22.727248 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 23 16:45:22.728300 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:22.728276 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-4x4v6"] Apr 23 16:45:22.879822 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:22.879789 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcrcc\" (UniqueName: \"kubernetes.io/projected/43ae2150-33ea-4e3b-9b53-b4f92e8c2879-kube-api-access-mcrcc\") pod \"limitador-operator-controller-manager-c7fb4c8d5-4x4v6\" (UID: \"43ae2150-33ea-4e3b-9b53-b4f92e8c2879\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-4x4v6" Apr 23 16:45:22.981222 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:22.981138 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcrcc\" (UniqueName: \"kubernetes.io/projected/43ae2150-33ea-4e3b-9b53-b4f92e8c2879-kube-api-access-mcrcc\") pod \"limitador-operator-controller-manager-c7fb4c8d5-4x4v6\" (UID: \"43ae2150-33ea-4e3b-9b53-b4f92e8c2879\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-4x4v6" Apr 23 16:45:22.993447 ip-10-0-136-27 kubenswrapper[2571]: 
I0423 16:45:22.993417 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcrcc\" (UniqueName: \"kubernetes.io/projected/43ae2150-33ea-4e3b-9b53-b4f92e8c2879-kube-api-access-mcrcc\") pod \"limitador-operator-controller-manager-c7fb4c8d5-4x4v6\" (UID: \"43ae2150-33ea-4e3b-9b53-b4f92e8c2879\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-4x4v6" Apr 23 16:45:23.036115 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:23.036091 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-4x4v6" Apr 23 16:45:23.156396 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:23.156367 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-4x4v6"] Apr 23 16:45:23.158268 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:45:23.158238 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43ae2150_33ea_4e3b_9b53_b4f92e8c2879.slice/crio-ecaa317cbbe819acf1ae86e784b241d1f21fb2983ce4c166603236c8db9aa9a3 WatchSource:0}: Error finding container ecaa317cbbe819acf1ae86e784b241d1f21fb2983ce4c166603236c8db9aa9a3: Status 404 returned error can't find the container with id ecaa317cbbe819acf1ae86e784b241d1f21fb2983ce4c166603236c8db9aa9a3 Apr 23 16:45:23.407618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:23.407521 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-4x4v6" event={"ID":"43ae2150-33ea-4e3b-9b53-b4f92e8c2879","Type":"ContainerStarted","Data":"ecaa317cbbe819acf1ae86e784b241d1f21fb2983ce4c166603236c8db9aa9a3"} Apr 23 16:45:26.421136 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:26.421053 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-4x4v6" 
event={"ID":"43ae2150-33ea-4e3b-9b53-b4f92e8c2879","Type":"ContainerStarted","Data":"8c29452598b3d906c09226ef8fdbe07ccdb4e9be8f220a4f27089cc6df44f940"} Apr 23 16:45:26.421506 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:26.421160 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-4x4v6" Apr 23 16:45:26.439057 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:26.438987 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-4x4v6" podStartSLOduration=1.44001838 podStartE2EDuration="4.438970013s" podCreationTimestamp="2026-04-23 16:45:22 +0000 UTC" firstStartedPulling="2026-04-23 16:45:23.160454087 +0000 UTC m=+606.241896641" lastFinishedPulling="2026-04-23 16:45:26.159405727 +0000 UTC m=+609.240848274" observedRunningTime="2026-04-23 16:45:26.436932711 +0000 UTC m=+609.518375281" watchObservedRunningTime="2026-04-23 16:45:26.438970013 +0000 UTC m=+609.520412582" Apr 23 16:45:37.427240 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:45:37.427157 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-4x4v6" Apr 23 16:46:09.650825 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:09.650789 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-2sr6d"] Apr 23 16:46:09.653028 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:09.653007 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-2sr6d" Apr 23 16:46:09.656235 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:09.656212 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-qr67w\"" Apr 23 16:46:09.661979 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:09.661953 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-2sr6d"] Apr 23 16:46:09.787285 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:09.787254 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-vq8fm"] Apr 23 16:46:09.789609 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:09.789593 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-vq8fm" Apr 23 16:46:09.790783 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:09.790761 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqqwn\" (UniqueName: \"kubernetes.io/projected/119d93e9-69d3-48d0-979b-bf469c468b07-kube-api-access-wqqwn\") pod \"authorino-674b59b84c-2sr6d\" (UID: \"119d93e9-69d3-48d0-979b-bf469c468b07\") " pod="kuadrant-system/authorino-674b59b84c-2sr6d" Apr 23 16:46:09.797338 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:09.797312 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-vq8fm"] Apr 23 16:46:09.891998 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:09.891957 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqqwn\" (UniqueName: \"kubernetes.io/projected/119d93e9-69d3-48d0-979b-bf469c468b07-kube-api-access-wqqwn\") pod \"authorino-674b59b84c-2sr6d\" (UID: \"119d93e9-69d3-48d0-979b-bf469c468b07\") " pod="kuadrant-system/authorino-674b59b84c-2sr6d" Apr 23 16:46:09.892172 ip-10-0-136-27 kubenswrapper[2571]: I0423 
16:46:09.892013 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4vcv\" (UniqueName: \"kubernetes.io/projected/9bc56d86-6c74-4a1f-a7d2-49efe68a9c97-kube-api-access-w4vcv\") pod \"authorino-79cbc94b89-vq8fm\" (UID: \"9bc56d86-6c74-4a1f-a7d2-49efe68a9c97\") " pod="kuadrant-system/authorino-79cbc94b89-vq8fm" Apr 23 16:46:09.900729 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:09.900681 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqqwn\" (UniqueName: \"kubernetes.io/projected/119d93e9-69d3-48d0-979b-bf469c468b07-kube-api-access-wqqwn\") pod \"authorino-674b59b84c-2sr6d\" (UID: \"119d93e9-69d3-48d0-979b-bf469c468b07\") " pod="kuadrant-system/authorino-674b59b84c-2sr6d" Apr 23 16:46:09.963783 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:09.963750 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-2sr6d" Apr 23 16:46:09.992904 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:09.992870 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4vcv\" (UniqueName: \"kubernetes.io/projected/9bc56d86-6c74-4a1f-a7d2-49efe68a9c97-kube-api-access-w4vcv\") pod \"authorino-79cbc94b89-vq8fm\" (UID: \"9bc56d86-6c74-4a1f-a7d2-49efe68a9c97\") " pod="kuadrant-system/authorino-79cbc94b89-vq8fm" Apr 23 16:46:10.002531 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:10.002504 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4vcv\" (UniqueName: \"kubernetes.io/projected/9bc56d86-6c74-4a1f-a7d2-49efe68a9c97-kube-api-access-w4vcv\") pod \"authorino-79cbc94b89-vq8fm\" (UID: \"9bc56d86-6c74-4a1f-a7d2-49efe68a9c97\") " pod="kuadrant-system/authorino-79cbc94b89-vq8fm" Apr 23 16:46:10.089215 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:10.089191 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/authorino-674b59b84c-2sr6d"] Apr 23 16:46:10.091182 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:46:10.091157 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod119d93e9_69d3_48d0_979b_bf469c468b07.slice/crio-89c3e8b63acf1dffa91fbe71da8db6c96e6a7d85f74fc8fc993f16a6e834f1e2 WatchSource:0}: Error finding container 89c3e8b63acf1dffa91fbe71da8db6c96e6a7d85f74fc8fc993f16a6e834f1e2: Status 404 returned error can't find the container with id 89c3e8b63acf1dffa91fbe71da8db6c96e6a7d85f74fc8fc993f16a6e834f1e2 Apr 23 16:46:10.099511 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:10.099492 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-vq8fm" Apr 23 16:46:10.216283 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:10.216260 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-vq8fm"] Apr 23 16:46:10.217978 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:46:10.217945 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bc56d86_6c74_4a1f_a7d2_49efe68a9c97.slice/crio-f564d4601a5a7423fd104231b432c17edbde5845e3ceac28b99f2f44770e7a53 WatchSource:0}: Error finding container f564d4601a5a7423fd104231b432c17edbde5845e3ceac28b99f2f44770e7a53: Status 404 returned error can't find the container with id f564d4601a5a7423fd104231b432c17edbde5845e3ceac28b99f2f44770e7a53 Apr 23 16:46:10.568358 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:10.568253 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-vq8fm" event={"ID":"9bc56d86-6c74-4a1f-a7d2-49efe68a9c97","Type":"ContainerStarted","Data":"f564d4601a5a7423fd104231b432c17edbde5845e3ceac28b99f2f44770e7a53"} Apr 23 16:46:10.569294 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:10.569269 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-2sr6d" event={"ID":"119d93e9-69d3-48d0-979b-bf469c468b07","Type":"ContainerStarted","Data":"89c3e8b63acf1dffa91fbe71da8db6c96e6a7d85f74fc8fc993f16a6e834f1e2"} Apr 23 16:46:13.583381 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:13.583344 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-vq8fm" event={"ID":"9bc56d86-6c74-4a1f-a7d2-49efe68a9c97","Type":"ContainerStarted","Data":"fde34a1fbc34adc74a3a9836007358469b9a2cd997d477a21ef5d355b070c5e7"} Apr 23 16:46:13.584628 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:13.584604 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-2sr6d" event={"ID":"119d93e9-69d3-48d0-979b-bf469c468b07","Type":"ContainerStarted","Data":"2d6fbbdcab9bf86fc1ce0e737fe7b278d3c2c5b2cf46cac832587871987ae5e8"} Apr 23 16:46:13.598596 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:13.598544 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-vq8fm" podStartSLOduration=1.9302363489999999 podStartE2EDuration="4.598527604s" podCreationTimestamp="2026-04-23 16:46:09 +0000 UTC" firstStartedPulling="2026-04-23 16:46:10.219367763 +0000 UTC m=+653.300810311" lastFinishedPulling="2026-04-23 16:46:12.887659015 +0000 UTC m=+655.969101566" observedRunningTime="2026-04-23 16:46:13.598262949 +0000 UTC m=+656.679705530" watchObservedRunningTime="2026-04-23 16:46:13.598527604 +0000 UTC m=+656.679970169" Apr 23 16:46:13.614891 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:13.614836 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-2sr6d" podStartSLOduration=1.830274675 podStartE2EDuration="4.614818445s" podCreationTimestamp="2026-04-23 16:46:09 +0000 UTC" firstStartedPulling="2026-04-23 16:46:10.092359283 +0000 UTC m=+653.173801833" 
lastFinishedPulling="2026-04-23 16:46:12.876903053 +0000 UTC m=+655.958345603" observedRunningTime="2026-04-23 16:46:13.613986279 +0000 UTC m=+656.695428847" watchObservedRunningTime="2026-04-23 16:46:13.614818445 +0000 UTC m=+656.696261017" Apr 23 16:46:13.640595 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:13.640557 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-2sr6d"] Apr 23 16:46:15.591476 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:15.591431 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-2sr6d" podUID="119d93e9-69d3-48d0-979b-bf469c468b07" containerName="authorino" containerID="cri-o://2d6fbbdcab9bf86fc1ce0e737fe7b278d3c2c5b2cf46cac832587871987ae5e8" gracePeriod=30 Apr 23 16:46:15.842783 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:15.842720 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-2sr6d" Apr 23 16:46:15.948294 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:15.948257 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqqwn\" (UniqueName: \"kubernetes.io/projected/119d93e9-69d3-48d0-979b-bf469c468b07-kube-api-access-wqqwn\") pod \"119d93e9-69d3-48d0-979b-bf469c468b07\" (UID: \"119d93e9-69d3-48d0-979b-bf469c468b07\") " Apr 23 16:46:15.950569 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:15.950532 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/119d93e9-69d3-48d0-979b-bf469c468b07-kube-api-access-wqqwn" (OuterVolumeSpecName: "kube-api-access-wqqwn") pod "119d93e9-69d3-48d0-979b-bf469c468b07" (UID: "119d93e9-69d3-48d0-979b-bf469c468b07"). InnerVolumeSpecName "kube-api-access-wqqwn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:46:16.049408 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:16.049361 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wqqwn\" (UniqueName: \"kubernetes.io/projected/119d93e9-69d3-48d0-979b-bf469c468b07-kube-api-access-wqqwn\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:46:16.596550 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:16.596517 2571 generic.go:358] "Generic (PLEG): container finished" podID="119d93e9-69d3-48d0-979b-bf469c468b07" containerID="2d6fbbdcab9bf86fc1ce0e737fe7b278d3c2c5b2cf46cac832587871987ae5e8" exitCode=0 Apr 23 16:46:16.596996 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:16.596590 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-2sr6d" Apr 23 16:46:16.596996 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:16.596600 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-2sr6d" event={"ID":"119d93e9-69d3-48d0-979b-bf469c468b07","Type":"ContainerDied","Data":"2d6fbbdcab9bf86fc1ce0e737fe7b278d3c2c5b2cf46cac832587871987ae5e8"} Apr 23 16:46:16.596996 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:16.596638 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-2sr6d" event={"ID":"119d93e9-69d3-48d0-979b-bf469c468b07","Type":"ContainerDied","Data":"89c3e8b63acf1dffa91fbe71da8db6c96e6a7d85f74fc8fc993f16a6e834f1e2"} Apr 23 16:46:16.596996 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:16.596653 2571 scope.go:117] "RemoveContainer" containerID="2d6fbbdcab9bf86fc1ce0e737fe7b278d3c2c5b2cf46cac832587871987ae5e8" Apr 23 16:46:16.605763 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:16.605738 2571 scope.go:117] "RemoveContainer" containerID="2d6fbbdcab9bf86fc1ce0e737fe7b278d3c2c5b2cf46cac832587871987ae5e8" Apr 23 16:46:16.606086 ip-10-0-136-27 kubenswrapper[2571]: E0423 
16:46:16.606061 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d6fbbdcab9bf86fc1ce0e737fe7b278d3c2c5b2cf46cac832587871987ae5e8\": container with ID starting with 2d6fbbdcab9bf86fc1ce0e737fe7b278d3c2c5b2cf46cac832587871987ae5e8 not found: ID does not exist" containerID="2d6fbbdcab9bf86fc1ce0e737fe7b278d3c2c5b2cf46cac832587871987ae5e8" Apr 23 16:46:16.606263 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:16.606096 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d6fbbdcab9bf86fc1ce0e737fe7b278d3c2c5b2cf46cac832587871987ae5e8"} err="failed to get container status \"2d6fbbdcab9bf86fc1ce0e737fe7b278d3c2c5b2cf46cac832587871987ae5e8\": rpc error: code = NotFound desc = could not find container \"2d6fbbdcab9bf86fc1ce0e737fe7b278d3c2c5b2cf46cac832587871987ae5e8\": container with ID starting with 2d6fbbdcab9bf86fc1ce0e737fe7b278d3c2c5b2cf46cac832587871987ae5e8 not found: ID does not exist" Apr 23 16:46:16.620153 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:16.620117 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-2sr6d"] Apr 23 16:46:16.621723 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:16.621677 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-2sr6d"] Apr 23 16:46:17.511881 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:17.511846 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="119d93e9-69d3-48d0-979b-bf469c468b07" path="/var/lib/kubelet/pods/119d93e9-69d3-48d0-979b-bf469c468b07/volumes" Apr 23 16:46:33.568315 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:33.568281 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-2lhrl"] Apr 23 16:46:33.568761 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:33.568646 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="119d93e9-69d3-48d0-979b-bf469c468b07" containerName="authorino" Apr 23 16:46:33.568761 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:33.568658 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="119d93e9-69d3-48d0-979b-bf469c468b07" containerName="authorino" Apr 23 16:46:33.568761 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:33.568725 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="119d93e9-69d3-48d0-979b-bf469c468b07" containerName="authorino" Apr 23 16:46:33.572145 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:33.572130 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-2lhrl" Apr 23 16:46:33.574836 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:33.574809 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 23 16:46:33.578975 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:33.578939 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-2lhrl"] Apr 23 16:46:33.606882 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:33.606853 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwrrz\" (UniqueName: \"kubernetes.io/projected/31965c06-acb6-474c-8ce9-ca34c0c2f2f3-kube-api-access-qwrrz\") pod \"authorino-68bd676465-2lhrl\" (UID: \"31965c06-acb6-474c-8ce9-ca34c0c2f2f3\") " pod="kuadrant-system/authorino-68bd676465-2lhrl" Apr 23 16:46:33.607036 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:33.606945 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/31965c06-acb6-474c-8ce9-ca34c0c2f2f3-tls-cert\") pod \"authorino-68bd676465-2lhrl\" (UID: \"31965c06-acb6-474c-8ce9-ca34c0c2f2f3\") " pod="kuadrant-system/authorino-68bd676465-2lhrl" Apr 23 16:46:33.708308 ip-10-0-136-27 
kubenswrapper[2571]: I0423 16:46:33.708275 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwrrz\" (UniqueName: \"kubernetes.io/projected/31965c06-acb6-474c-8ce9-ca34c0c2f2f3-kube-api-access-qwrrz\") pod \"authorino-68bd676465-2lhrl\" (UID: \"31965c06-acb6-474c-8ce9-ca34c0c2f2f3\") " pod="kuadrant-system/authorino-68bd676465-2lhrl" Apr 23 16:46:33.708475 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:33.708373 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/31965c06-acb6-474c-8ce9-ca34c0c2f2f3-tls-cert\") pod \"authorino-68bd676465-2lhrl\" (UID: \"31965c06-acb6-474c-8ce9-ca34c0c2f2f3\") " pod="kuadrant-system/authorino-68bd676465-2lhrl" Apr 23 16:46:33.710968 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:33.710944 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/31965c06-acb6-474c-8ce9-ca34c0c2f2f3-tls-cert\") pod \"authorino-68bd676465-2lhrl\" (UID: \"31965c06-acb6-474c-8ce9-ca34c0c2f2f3\") " pod="kuadrant-system/authorino-68bd676465-2lhrl" Apr 23 16:46:33.715747 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:33.715729 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwrrz\" (UniqueName: \"kubernetes.io/projected/31965c06-acb6-474c-8ce9-ca34c0c2f2f3-kube-api-access-qwrrz\") pod \"authorino-68bd676465-2lhrl\" (UID: \"31965c06-acb6-474c-8ce9-ca34c0c2f2f3\") " pod="kuadrant-system/authorino-68bd676465-2lhrl" Apr 23 16:46:33.882732 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:33.882611 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-2lhrl" Apr 23 16:46:34.008359 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:34.008334 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-2lhrl"] Apr 23 16:46:34.010490 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:46:34.010459 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31965c06_acb6_474c_8ce9_ca34c0c2f2f3.slice/crio-b92822124fa6a6adfcc558b4b29fa7105be594906cf824885a1946d50c53095a WatchSource:0}: Error finding container b92822124fa6a6adfcc558b4b29fa7105be594906cf824885a1946d50c53095a: Status 404 returned error can't find the container with id b92822124fa6a6adfcc558b4b29fa7105be594906cf824885a1946d50c53095a Apr 23 16:46:34.011785 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:34.011766 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:46:34.666491 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:34.666454 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-2lhrl" event={"ID":"31965c06-acb6-474c-8ce9-ca34c0c2f2f3","Type":"ContainerStarted","Data":"aeb517fcc2184ee51ff8ed638ffb859b41650dcef5cc6fc660d36db6c8e427e5"} Apr 23 16:46:34.666491 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:34.666494 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-2lhrl" event={"ID":"31965c06-acb6-474c-8ce9-ca34c0c2f2f3","Type":"ContainerStarted","Data":"b92822124fa6a6adfcc558b4b29fa7105be594906cf824885a1946d50c53095a"} Apr 23 16:46:34.706717 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:34.706599 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-2lhrl" podStartSLOduration=1.26146326 podStartE2EDuration="1.706582737s" podCreationTimestamp="2026-04-23 16:46:33 +0000 UTC" 
firstStartedPulling="2026-04-23 16:46:34.011924416 +0000 UTC m=+677.093366962" lastFinishedPulling="2026-04-23 16:46:34.457043887 +0000 UTC m=+677.538486439" observedRunningTime="2026-04-23 16:46:34.681646353 +0000 UTC m=+677.763088934" watchObservedRunningTime="2026-04-23 16:46:34.706582737 +0000 UTC m=+677.788025306" Apr 23 16:46:34.707080 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:34.707060 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-vq8fm"] Apr 23 16:46:34.707292 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:34.707271 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-vq8fm" podUID="9bc56d86-6c74-4a1f-a7d2-49efe68a9c97" containerName="authorino" containerID="cri-o://fde34a1fbc34adc74a3a9836007358469b9a2cd997d477a21ef5d355b070c5e7" gracePeriod=30 Apr 23 16:46:34.947555 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:34.947532 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-vq8fm" Apr 23 16:46:35.019618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:35.019526 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4vcv\" (UniqueName: \"kubernetes.io/projected/9bc56d86-6c74-4a1f-a7d2-49efe68a9c97-kube-api-access-w4vcv\") pod \"9bc56d86-6c74-4a1f-a7d2-49efe68a9c97\" (UID: \"9bc56d86-6c74-4a1f-a7d2-49efe68a9c97\") " Apr 23 16:46:35.021776 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:35.021750 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bc56d86-6c74-4a1f-a7d2-49efe68a9c97-kube-api-access-w4vcv" (OuterVolumeSpecName: "kube-api-access-w4vcv") pod "9bc56d86-6c74-4a1f-a7d2-49efe68a9c97" (UID: "9bc56d86-6c74-4a1f-a7d2-49efe68a9c97"). InnerVolumeSpecName "kube-api-access-w4vcv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:46:35.121056 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:35.121020 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w4vcv\" (UniqueName: \"kubernetes.io/projected/9bc56d86-6c74-4a1f-a7d2-49efe68a9c97-kube-api-access-w4vcv\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:46:35.671299 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:35.671261 2571 generic.go:358] "Generic (PLEG): container finished" podID="9bc56d86-6c74-4a1f-a7d2-49efe68a9c97" containerID="fde34a1fbc34adc74a3a9836007358469b9a2cd997d477a21ef5d355b070c5e7" exitCode=0 Apr 23 16:46:35.671753 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:35.671309 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-vq8fm" Apr 23 16:46:35.671753 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:35.671350 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-vq8fm" event={"ID":"9bc56d86-6c74-4a1f-a7d2-49efe68a9c97","Type":"ContainerDied","Data":"fde34a1fbc34adc74a3a9836007358469b9a2cd997d477a21ef5d355b070c5e7"} Apr 23 16:46:35.671753 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:35.671391 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-vq8fm" event={"ID":"9bc56d86-6c74-4a1f-a7d2-49efe68a9c97","Type":"ContainerDied","Data":"f564d4601a5a7423fd104231b432c17edbde5845e3ceac28b99f2f44770e7a53"} Apr 23 16:46:35.671753 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:35.671408 2571 scope.go:117] "RemoveContainer" containerID="fde34a1fbc34adc74a3a9836007358469b9a2cd997d477a21ef5d355b070c5e7" Apr 23 16:46:35.679786 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:35.679769 2571 scope.go:117] "RemoveContainer" containerID="fde34a1fbc34adc74a3a9836007358469b9a2cd997d477a21ef5d355b070c5e7" Apr 23 16:46:35.680041 ip-10-0-136-27 kubenswrapper[2571]: E0423 
16:46:35.680023 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fde34a1fbc34adc74a3a9836007358469b9a2cd997d477a21ef5d355b070c5e7\": container with ID starting with fde34a1fbc34adc74a3a9836007358469b9a2cd997d477a21ef5d355b070c5e7 not found: ID does not exist" containerID="fde34a1fbc34adc74a3a9836007358469b9a2cd997d477a21ef5d355b070c5e7" Apr 23 16:46:35.680078 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:35.680052 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fde34a1fbc34adc74a3a9836007358469b9a2cd997d477a21ef5d355b070c5e7"} err="failed to get container status \"fde34a1fbc34adc74a3a9836007358469b9a2cd997d477a21ef5d355b070c5e7\": rpc error: code = NotFound desc = could not find container \"fde34a1fbc34adc74a3a9836007358469b9a2cd997d477a21ef5d355b070c5e7\": container with ID starting with fde34a1fbc34adc74a3a9836007358469b9a2cd997d477a21ef5d355b070c5e7 not found: ID does not exist" Apr 23 16:46:35.688942 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:35.688911 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-vq8fm"] Apr 23 16:46:35.689638 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:35.689620 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-vq8fm"] Apr 23 16:46:37.511386 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:46:37.511350 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bc56d86-6c74-4a1f-a7d2-49efe68a9c97" path="/var/lib/kubelet/pods/9bc56d86-6c74-4a1f-a7d2-49efe68a9c97/volumes" Apr 23 16:48:41.517741 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.517639 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7"] Apr 23 16:48:41.518342 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.518006 2571 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="9bc56d86-6c74-4a1f-a7d2-49efe68a9c97" containerName="authorino" Apr 23 16:48:41.518342 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.518016 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc56d86-6c74-4a1f-a7d2-49efe68a9c97" containerName="authorino" Apr 23 16:48:41.518342 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.518070 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="9bc56d86-6c74-4a1f-a7d2-49efe68a9c97" containerName="authorino" Apr 23 16:48:41.520191 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.520164 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.523219 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.523199 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 16:48:41.523538 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.523513 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 16:48:41.523648 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.523559 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-vfqws\"" Apr 23 16:48:41.524505 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.524485 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 23 16:48:41.534732 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.534712 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7"] Apr 23 16:48:41.625060 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.625024 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/44622e9d-844d-4a1a-b4cb-523054e59ce5-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.625060 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.625064 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/44622e9d-844d-4a1a-b4cb-523054e59ce5-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.625255 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.625096 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/44622e9d-844d-4a1a-b4cb-523054e59ce5-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.625255 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.625139 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/44622e9d-844d-4a1a-b4cb-523054e59ce5-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.625255 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.625227 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/44622e9d-844d-4a1a-b4cb-523054e59ce5-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.625255 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.625245 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/44622e9d-844d-4a1a-b4cb-523054e59ce5-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.625395 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.625280 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/44622e9d-844d-4a1a-b4cb-523054e59ce5-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.625395 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.625299 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzqpz\" (UniqueName: \"kubernetes.io/projected/44622e9d-844d-4a1a-b4cb-523054e59ce5-kube-api-access-nzqpz\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.625395 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.625321 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/44622e9d-844d-4a1a-b4cb-523054e59ce5-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.726433 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.726396 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/44622e9d-844d-4a1a-b4cb-523054e59ce5-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.726626 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.726440 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzqpz\" (UniqueName: \"kubernetes.io/projected/44622e9d-844d-4a1a-b4cb-523054e59ce5-kube-api-access-nzqpz\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.726626 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.726478 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/44622e9d-844d-4a1a-b4cb-523054e59ce5-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.726626 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.726564 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/44622e9d-844d-4a1a-b4cb-523054e59ce5-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" 
(UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.726626 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.726601 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/44622e9d-844d-4a1a-b4cb-523054e59ce5-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.726908 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.726673 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/44622e9d-844d-4a1a-b4cb-523054e59ce5-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.726908 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.726764 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/44622e9d-844d-4a1a-b4cb-523054e59ce5-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.726908 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.726830 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/44622e9d-844d-4a1a-b4cb-523054e59ce5-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.726908 
ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.726854 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/44622e9d-844d-4a1a-b4cb-523054e59ce5-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.727142 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.726990 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/44622e9d-844d-4a1a-b4cb-523054e59ce5-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.727142 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.727037 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/44622e9d-844d-4a1a-b4cb-523054e59ce5-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.727249 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.727218 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/44622e9d-844d-4a1a-b4cb-523054e59ce5-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.727415 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.727392 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/44622e9d-844d-4a1a-b4cb-523054e59ce5-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.727486 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.727395 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/44622e9d-844d-4a1a-b4cb-523054e59ce5-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.729072 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.729056 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/44622e9d-844d-4a1a-b4cb-523054e59ce5-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.729341 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.729324 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/44622e9d-844d-4a1a-b4cb-523054e59ce5-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.735365 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.735338 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/44622e9d-844d-4a1a-b4cb-523054e59ce5-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.735459 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.735365 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzqpz\" (UniqueName: \"kubernetes.io/projected/44622e9d-844d-4a1a-b4cb-523054e59ce5-kube-api-access-nzqpz\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wzxm7\" (UID: \"44622e9d-844d-4a1a-b4cb-523054e59ce5\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.834014 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.833940 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:41.957434 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.957408 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7"] Apr 23 16:48:41.959281 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:48:41.959254 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44622e9d_844d_4a1a_b4cb_523054e59ce5.slice/crio-6f8dba6a5a25af7f38bf5871aece5a266350226772c3df9dfcf7fb3cf1b24ff1 WatchSource:0}: Error finding container 6f8dba6a5a25af7f38bf5871aece5a266350226772c3df9dfcf7fb3cf1b24ff1: Status 404 returned error can't find the container with id 6f8dba6a5a25af7f38bf5871aece5a266350226772c3df9dfcf7fb3cf1b24ff1 Apr 23 16:48:41.961457 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.961422 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 23 16:48:41.961533 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.961503 2571 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 23 16:48:41.961592 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:41.961538 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 23 16:48:42.104150 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:42.104072 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" event={"ID":"44622e9d-844d-4a1a-b4cb-523054e59ce5","Type":"ContainerStarted","Data":"17dc526dac6fa200ad74d184a7262027fcac2e733fff45b762c03d94f28df29f"} Apr 23 16:48:42.104150 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:42.104117 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" event={"ID":"44622e9d-844d-4a1a-b4cb-523054e59ce5","Type":"ContainerStarted","Data":"6f8dba6a5a25af7f38bf5871aece5a266350226772c3df9dfcf7fb3cf1b24ff1"} Apr 23 16:48:42.124582 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:42.124534 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" podStartSLOduration=1.124516126 podStartE2EDuration="1.124516126s" podCreationTimestamp="2026-04-23 16:48:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:48:42.121279805 +0000 UTC m=+805.202722372" watchObservedRunningTime="2026-04-23 16:48:42.124516126 +0000 UTC m=+805.205958695" Apr 23 16:48:42.834083 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:42.834052 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 
16:48:43.839494 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:43.839464 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:44.112398 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:44.112317 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:44.113251 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:44.113232 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wzxm7" Apr 23 16:48:59.676536 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.676489 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"] Apr 23 16:48:59.679803 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.679787 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k" Apr 23 16:48:59.682442 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.682416 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-b4jmv\"" Apr 23 16:48:59.683343 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.683314 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 23 16:48:59.683343 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.683332 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-886m6\"" Apr 23 16:48:59.690636 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.690613 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"] Apr 23 16:48:59.776736 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.776688 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k" Apr 23 16:48:59.776878 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.776741 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k" Apr 23 16:48:59.776878 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.776797 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/28079ff7-8530-49a1-a01c-d51d759724c8-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k" Apr 23 16:48:59.776878 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.776814 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7jdf\" (UniqueName: \"kubernetes.io/projected/28079ff7-8530-49a1-a01c-d51d759724c8-kube-api-access-p7jdf\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k" Apr 23 16:48:59.776975 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.776898 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k" Apr 23 16:48:59.776975 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.776940 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k\" 
(UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:48:59.877323 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.877283 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:48:59.877498 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.877337 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:48:59.877498 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.877360 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:48:59.877498 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.877403 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/28079ff7-8530-49a1-a01c-d51d759724c8-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:48:59.877658 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.877523 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7jdf\" (UniqueName: \"kubernetes.io/projected/28079ff7-8530-49a1-a01c-d51d759724c8-kube-api-access-p7jdf\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:48:59.877658 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.877650 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:48:59.877797 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.877679 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:48:59.877797 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.877760 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:48:59.877891 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.877822 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:48:59.877994 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.877975 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:48:59.880013 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.879994 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/28079ff7-8530-49a1-a01c-d51d759724c8-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:48:59.885293 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.885269 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7jdf\" (UniqueName: \"kubernetes.io/projected/28079ff7-8530-49a1-a01c-d51d759724c8-kube-api-access-p7jdf\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:48:59.993235 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:48:59.993205 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:49:00.123068 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:00.123036 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"]
Apr 23 16:49:00.125561 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:49:00.125530 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28079ff7_8530_49a1_a01c_d51d759724c8.slice/crio-20b0202df1bb22b7e6e2e2e2fe5a639092ef7dff52f8f675976fcacf607f0291 WatchSource:0}: Error finding container 20b0202df1bb22b7e6e2e2e2fe5a639092ef7dff52f8f675976fcacf607f0291: Status 404 returned error can't find the container with id 20b0202df1bb22b7e6e2e2e2fe5a639092ef7dff52f8f675976fcacf607f0291
Apr 23 16:49:00.167715 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:00.167664 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k" event={"ID":"28079ff7-8530-49a1-a01c-d51d759724c8","Type":"ContainerStarted","Data":"20b0202df1bb22b7e6e2e2e2fe5a639092ef7dff52f8f675976fcacf607f0291"}
Apr 23 16:49:04.184649 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:04.184613 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k" event={"ID":"28079ff7-8530-49a1-a01c-d51d759724c8","Type":"ContainerStarted","Data":"61344b9551388a01da1550a907c5a4fba8b2372c8562e495765ee9616553962e"}
Apr 23 16:49:05.188886 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:05.188856 2571 generic.go:358] "Generic (PLEG): container finished" podID="28079ff7-8530-49a1-a01c-d51d759724c8" containerID="61344b9551388a01da1550a907c5a4fba8b2372c8562e495765ee9616553962e" exitCode=0
Apr 23 16:49:05.188886 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:05.188891 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k" event={"ID":"28079ff7-8530-49a1-a01c-d51d759724c8","Type":"ContainerDied","Data":"61344b9551388a01da1550a907c5a4fba8b2372c8562e495765ee9616553962e"}
Apr 23 16:49:07.199187 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:07.199154 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k" event={"ID":"28079ff7-8530-49a1-a01c-d51d759724c8","Type":"ContainerStarted","Data":"7b009d2c9b6ca7c74e4401ae25e240afcb8a57b080dec1aa4b913b20a5b9da41"}
Apr 23 16:49:36.322391 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:36.322353 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k" event={"ID":"28079ff7-8530-49a1-a01c-d51d759724c8","Type":"ContainerStarted","Data":"47a2bad79b23f487d2868e29687d531593ed6e8bf5533c58aa84064e8d9aeeb1"}
Apr 23 16:49:36.322945 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:36.322722 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:49:36.325423 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:36.325403 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:49:36.346841 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:36.346786 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k" podStartSLOduration=2.031704104 podStartE2EDuration="37.346771204s" podCreationTimestamp="2026-04-23 16:48:59 +0000 UTC" firstStartedPulling="2026-04-23 16:49:00.127654881 +0000 UTC m=+823.209097427" lastFinishedPulling="2026-04-23 16:49:35.442721976 +0000 UTC m=+858.524164527" observedRunningTime="2026-04-23 16:49:36.343839267 +0000 UTC m=+859.425281845" watchObservedRunningTime="2026-04-23 16:49:36.346771204 +0000 UTC m=+859.428213773"
Apr 23 16:49:39.993949 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:39.993915 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:49:39.994358 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:39.993965 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:49:49.995716 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:49.995670 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:49:49.996940 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:49.996921 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:49:51.010568 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:51.010537 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"]
Apr 23 16:49:51.377258 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:51.377154 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k" podUID="28079ff7-8530-49a1-a01c-d51d759724c8" containerName="main" containerID="cri-o://7b009d2c9b6ca7c74e4401ae25e240afcb8a57b080dec1aa4b913b20a5b9da41" gracePeriod=30
Apr 23 16:49:51.377258 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:51.377194 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k" podUID="28079ff7-8530-49a1-a01c-d51d759724c8" containerName="tokenizer" containerID="cri-o://47a2bad79b23f487d2868e29687d531593ed6e8bf5533c58aa84064e8d9aeeb1" gracePeriod=30
Apr 23 16:49:52.382674 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.382638 2571 generic.go:358] "Generic (PLEG): container finished" podID="28079ff7-8530-49a1-a01c-d51d759724c8" containerID="7b009d2c9b6ca7c74e4401ae25e240afcb8a57b080dec1aa4b913b20a5b9da41" exitCode=0
Apr 23 16:49:52.383100 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.382714 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k" event={"ID":"28079ff7-8530-49a1-a01c-d51d759724c8","Type":"ContainerDied","Data":"7b009d2c9b6ca7c74e4401ae25e240afcb8a57b080dec1aa4b913b20a5b9da41"}
Apr 23 16:49:52.549204 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.549182 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:49:52.713503 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.713464 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-tokenizer-uds\") pod \"28079ff7-8530-49a1-a01c-d51d759724c8\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") "
Apr 23 16:49:52.713743 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.713584 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-kserve-provision-location\") pod \"28079ff7-8530-49a1-a01c-d51d759724c8\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") "
Apr 23 16:49:52.713743 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.713646 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/28079ff7-8530-49a1-a01c-d51d759724c8-tls-certs\") pod \"28079ff7-8530-49a1-a01c-d51d759724c8\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") "
Apr 23 16:49:52.713743 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.713681 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-tokenizer-tmp\") pod \"28079ff7-8530-49a1-a01c-d51d759724c8\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") "
Apr 23 16:49:52.713913 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.713751 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-tokenizer-cache\") pod \"28079ff7-8530-49a1-a01c-d51d759724c8\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") "
Apr 23 16:49:52.713913 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.713787 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7jdf\" (UniqueName: \"kubernetes.io/projected/28079ff7-8530-49a1-a01c-d51d759724c8-kube-api-access-p7jdf\") pod \"28079ff7-8530-49a1-a01c-d51d759724c8\" (UID: \"28079ff7-8530-49a1-a01c-d51d759724c8\") "
Apr 23 16:49:52.713913 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.713798 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "28079ff7-8530-49a1-a01c-d51d759724c8" (UID: "28079ff7-8530-49a1-a01c-d51d759724c8"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:49:52.714052 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.714022 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "28079ff7-8530-49a1-a01c-d51d759724c8" (UID: "28079ff7-8530-49a1-a01c-d51d759724c8"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:49:52.714099 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.714070 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "28079ff7-8530-49a1-a01c-d51d759724c8" (UID: "28079ff7-8530-49a1-a01c-d51d759724c8"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:49:52.714164 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.714148 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-tokenizer-tmp\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\""
Apr 23 16:49:52.714221 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.714167 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-tokenizer-cache\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\""
Apr 23 16:49:52.714221 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.714176 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-tokenizer-uds\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\""
Apr 23 16:49:52.714413 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.714381 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "28079ff7-8530-49a1-a01c-d51d759724c8" (UID: "28079ff7-8530-49a1-a01c-d51d759724c8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:49:52.716229 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.716207 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28079ff7-8530-49a1-a01c-d51d759724c8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "28079ff7-8530-49a1-a01c-d51d759724c8" (UID: "28079ff7-8530-49a1-a01c-d51d759724c8"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:49:52.716313 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.716286 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28079ff7-8530-49a1-a01c-d51d759724c8-kube-api-access-p7jdf" (OuterVolumeSpecName: "kube-api-access-p7jdf") pod "28079ff7-8530-49a1-a01c-d51d759724c8" (UID: "28079ff7-8530-49a1-a01c-d51d759724c8"). InnerVolumeSpecName "kube-api-access-p7jdf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:49:52.815303 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.815274 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/28079ff7-8530-49a1-a01c-d51d759724c8-kserve-provision-location\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\""
Apr 23 16:49:52.815303 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.815302 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/28079ff7-8530-49a1-a01c-d51d759724c8-tls-certs\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\""
Apr 23 16:49:52.815459 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:52.815314 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p7jdf\" (UniqueName: \"kubernetes.io/projected/28079ff7-8530-49a1-a01c-d51d759724c8-kube-api-access-p7jdf\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\""
Apr 23 16:49:53.388302 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:53.388270 2571 generic.go:358] "Generic (PLEG): container finished" podID="28079ff7-8530-49a1-a01c-d51d759724c8" containerID="47a2bad79b23f487d2868e29687d531593ed6e8bf5533c58aa84064e8d9aeeb1" exitCode=0
Apr 23 16:49:53.388741 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:53.388323 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k" event={"ID":"28079ff7-8530-49a1-a01c-d51d759724c8","Type":"ContainerDied","Data":"47a2bad79b23f487d2868e29687d531593ed6e8bf5533c58aa84064e8d9aeeb1"}
Apr 23 16:49:53.388741 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:53.388345 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"
Apr 23 16:49:53.388741 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:53.388359 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k" event={"ID":"28079ff7-8530-49a1-a01c-d51d759724c8","Type":"ContainerDied","Data":"20b0202df1bb22b7e6e2e2e2fe5a639092ef7dff52f8f675976fcacf607f0291"}
Apr 23 16:49:53.388741 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:53.388375 2571 scope.go:117] "RemoveContainer" containerID="47a2bad79b23f487d2868e29687d531593ed6e8bf5533c58aa84064e8d9aeeb1"
Apr 23 16:49:53.397906 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:53.397889 2571 scope.go:117] "RemoveContainer" containerID="7b009d2c9b6ca7c74e4401ae25e240afcb8a57b080dec1aa4b913b20a5b9da41"
Apr 23 16:49:53.405302 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:53.405288 2571 scope.go:117] "RemoveContainer" containerID="61344b9551388a01da1550a907c5a4fba8b2372c8562e495765ee9616553962e"
Apr 23 16:49:53.410873 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:53.410848 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"]
Apr 23 16:49:53.413445 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:53.413426 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59489n922k"]
Apr 23 16:49:53.414143 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:53.414128 2571 scope.go:117] "RemoveContainer" containerID="47a2bad79b23f487d2868e29687d531593ed6e8bf5533c58aa84064e8d9aeeb1"
Apr 23 16:49:53.414418 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:49:53.414400 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47a2bad79b23f487d2868e29687d531593ed6e8bf5533c58aa84064e8d9aeeb1\": container with ID starting with 47a2bad79b23f487d2868e29687d531593ed6e8bf5533c58aa84064e8d9aeeb1 not found: ID does not exist" containerID="47a2bad79b23f487d2868e29687d531593ed6e8bf5533c58aa84064e8d9aeeb1"
Apr 23 16:49:53.414462 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:53.414426 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47a2bad79b23f487d2868e29687d531593ed6e8bf5533c58aa84064e8d9aeeb1"} err="failed to get container status \"47a2bad79b23f487d2868e29687d531593ed6e8bf5533c58aa84064e8d9aeeb1\": rpc error: code = NotFound desc = could not find container \"47a2bad79b23f487d2868e29687d531593ed6e8bf5533c58aa84064e8d9aeeb1\": container with ID starting with 47a2bad79b23f487d2868e29687d531593ed6e8bf5533c58aa84064e8d9aeeb1 not found: ID does not exist"
Apr 23 16:49:53.414462 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:53.414445 2571 scope.go:117] "RemoveContainer" containerID="7b009d2c9b6ca7c74e4401ae25e240afcb8a57b080dec1aa4b913b20a5b9da41"
Apr 23 16:49:53.414663 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:49:53.414648 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b009d2c9b6ca7c74e4401ae25e240afcb8a57b080dec1aa4b913b20a5b9da41\": container with ID starting with 7b009d2c9b6ca7c74e4401ae25e240afcb8a57b080dec1aa4b913b20a5b9da41 not found: ID does not exist" containerID="7b009d2c9b6ca7c74e4401ae25e240afcb8a57b080dec1aa4b913b20a5b9da41"
Apr 23 16:49:53.414727 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:53.414668 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b009d2c9b6ca7c74e4401ae25e240afcb8a57b080dec1aa4b913b20a5b9da41"} err="failed to get container status \"7b009d2c9b6ca7c74e4401ae25e240afcb8a57b080dec1aa4b913b20a5b9da41\": rpc error: code = NotFound desc = could not find container \"7b009d2c9b6ca7c74e4401ae25e240afcb8a57b080dec1aa4b913b20a5b9da41\": container with ID starting with 7b009d2c9b6ca7c74e4401ae25e240afcb8a57b080dec1aa4b913b20a5b9da41 not found: ID does not exist"
Apr 23 16:49:53.414727 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:53.414680 2571 scope.go:117] "RemoveContainer" containerID="61344b9551388a01da1550a907c5a4fba8b2372c8562e495765ee9616553962e"
Apr 23 16:49:53.414954 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:49:53.414935 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61344b9551388a01da1550a907c5a4fba8b2372c8562e495765ee9616553962e\": container with ID starting with 61344b9551388a01da1550a907c5a4fba8b2372c8562e495765ee9616553962e not found: ID does not exist" containerID="61344b9551388a01da1550a907c5a4fba8b2372c8562e495765ee9616553962e"
Apr 23 16:49:53.415054 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:53.415028 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61344b9551388a01da1550a907c5a4fba8b2372c8562e495765ee9616553962e"} err="failed to get container status \"61344b9551388a01da1550a907c5a4fba8b2372c8562e495765ee9616553962e\": rpc error: code = NotFound desc = could not find container \"61344b9551388a01da1550a907c5a4fba8b2372c8562e495765ee9616553962e\": container with ID starting with 61344b9551388a01da1550a907c5a4fba8b2372c8562e495765ee9616553962e not found: ID does not exist"
Apr 23 16:49:53.518721 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:49:53.518656 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28079ff7-8530-49a1-a01c-d51d759724c8" path="/var/lib/kubelet/pods/28079ff7-8530-49a1-a01c-d51d759724c8/volumes"
Apr 23 16:50:11.458595 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.458508 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"]
Apr 23 16:50:11.459179 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.459061 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28079ff7-8530-49a1-a01c-d51d759724c8" containerName="tokenizer"
Apr 23 16:50:11.459179 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.459079 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="28079ff7-8530-49a1-a01c-d51d759724c8" containerName="tokenizer"
Apr 23 16:50:11.459179 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.459124 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28079ff7-8530-49a1-a01c-d51d759724c8" containerName="storage-initializer"
Apr 23 16:50:11.459179 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.459133 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="28079ff7-8530-49a1-a01c-d51d759724c8" containerName="storage-initializer"
Apr 23 16:50:11.459179 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.459143 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28079ff7-8530-49a1-a01c-d51d759724c8" containerName="main"
Apr 23 16:50:11.459179 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.459151 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="28079ff7-8530-49a1-a01c-d51d759724c8" containerName="main"
Apr 23 16:50:11.459502 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.459231 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="28079ff7-8530-49a1-a01c-d51d759724c8" containerName="main"
Apr 23 16:50:11.459502 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.459243 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="28079ff7-8530-49a1-a01c-d51d759724c8" containerName="tokenizer"
Apr 23 16:50:11.462545 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.462522 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"
Apr 23 16:50:11.466371 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.466347 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-b4jmv\""
Apr 23 16:50:11.466478 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.466351 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 23 16:50:11.469529 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.469506 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"]
Apr 23 16:50:11.479088 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.479057 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwxgg\" (UniqueName: \"kubernetes.io/projected/6b351b34-680d-4b12-863f-78b94ef19798-kube-api-access-vwxgg\") pod \"scheduler-ha-replicas-test-kserve-67984fd5c9-klff7\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"
Apr 23 16:50:11.479301 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.479280 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b351b34-680d-4b12-863f-78b94ef19798-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-67984fd5c9-klff7\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"
Apr 23 16:50:11.479453 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.479429 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-67984fd5c9-klff7\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"
Apr 23 16:50:11.479589 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.479484 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-dshm\") pod \"scheduler-ha-replicas-test-kserve-67984fd5c9-klff7\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"
Apr 23 16:50:11.479589 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.479524 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-model-cache\") pod \"scheduler-ha-replicas-test-kserve-67984fd5c9-klff7\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"
Apr 23 16:50:11.479589 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.479583 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-home\") pod \"scheduler-ha-replicas-test-kserve-67984fd5c9-klff7\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"
Apr 23 16:50:11.581137 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.581108 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-67984fd5c9-klff7\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"
Apr 23 16:50:11.581311 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.581150 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-dshm\") pod \"scheduler-ha-replicas-test-kserve-67984fd5c9-klff7\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"
Apr 23 16:50:11.581311 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.581174 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-model-cache\") pod \"scheduler-ha-replicas-test-kserve-67984fd5c9-klff7\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"
Apr 23 16:50:11.581415 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.581325 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-home\") pod \"scheduler-ha-replicas-test-kserve-67984fd5c9-klff7\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"
Apr 23 16:50:11.581495 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.581463 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwxgg\" (UniqueName: \"kubernetes.io/projected/6b351b34-680d-4b12-863f-78b94ef19798-kube-api-access-vwxgg\") pod \"scheduler-ha-replicas-test-kserve-67984fd5c9-klff7\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"
Apr 23 16:50:11.581556 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.581514 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-67984fd5c9-klff7\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"
Apr 23 16:50:11.581635 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.581521 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b351b34-680d-4b12-863f-78b94ef19798-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-67984fd5c9-klff7\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"
Apr 23 16:50:11.581690 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.581661 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-home\") pod \"scheduler-ha-replicas-test-kserve-67984fd5c9-klff7\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"
Apr 23 16:50:11.581690 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.581614 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-model-cache\") pod \"scheduler-ha-replicas-test-kserve-67984fd5c9-klff7\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"
Apr 23 16:50:11.583526 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.583504 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-dshm\") pod \"scheduler-ha-replicas-test-kserve-67984fd5c9-klff7\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"
Apr 23 16:50:11.583890 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.583875 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b351b34-680d-4b12-863f-78b94ef19798-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-67984fd5c9-klff7\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"
Apr 23 16:50:11.590936 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.590917 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwxgg\" (UniqueName: \"kubernetes.io/projected/6b351b34-680d-4b12-863f-78b94ef19798-kube-api-access-vwxgg\") pod \"scheduler-ha-replicas-test-kserve-67984fd5c9-klff7\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"
Apr 23 16:50:11.780665 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.780589 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82"]
Apr 23 16:50:11.784542 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.784518 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82"
Apr 23 16:50:11.787149 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.787124 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-bz9j9\""
Apr 23 16:50:11.790931 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.790909 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"
Apr 23 16:50:11.798159 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.798133 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82"]
Apr 23 16:50:11.884359 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.884311 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82"
Apr 23 16:50:11.884543 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.884514 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82"
Apr 23 16:50:11.884681 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.884661 2571 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:11.884782 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.884732 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:11.884782 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.884763 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnjx4\" (UniqueName: \"kubernetes.io/projected/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-kube-api-access-bnjx4\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:11.884892 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.884844 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:11.919275 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.919203 2571 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"] Apr 23 16:50:11.921562 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:50:11.921533 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b351b34_680d_4b12_863f_78b94ef19798.slice/crio-6761111d6e90ab8a96498292e7bc3ab7f07ce64c822e230ce7b35062903170a6 WatchSource:0}: Error finding container 6761111d6e90ab8a96498292e7bc3ab7f07ce64c822e230ce7b35062903170a6: Status 404 returned error can't find the container with id 6761111d6e90ab8a96498292e7bc3ab7f07ce64c822e230ce7b35062903170a6 Apr 23 16:50:11.986131 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.986105 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:11.986212 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.986156 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:11.986212 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.986192 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tokenizer-uds\") pod 
\"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:11.986292 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.986224 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnjx4\" (UniqueName: \"kubernetes.io/projected/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-kube-api-access-bnjx4\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:11.986292 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.986275 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:11.986391 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.986327 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:11.986573 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.986535 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-kserve-provision-location\") pod 
\"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:11.986790 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.986756 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:11.986790 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.986770 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:11.986953 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.986776 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:11.988734 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:11.988713 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82\" (UID: 
\"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:12.015226 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:12.015205 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnjx4\" (UniqueName: \"kubernetes.io/projected/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-kube-api-access-bnjx4\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:12.096545 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:12.096454 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:12.232040 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:12.232011 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82"] Apr 23 16:50:12.233771 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:50:12.233727 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8dc3075_18c7_4ecf_ba68_3f9759d5f02c.slice/crio-33b0807a77bd71aa6b8640a6e5a941a814f8334ce0ffd6f18421e1c4c2f3d5b4 WatchSource:0}: Error finding container 33b0807a77bd71aa6b8640a6e5a941a814f8334ce0ffd6f18421e1c4c2f3d5b4: Status 404 returned error can't find the container with id 33b0807a77bd71aa6b8640a6e5a941a814f8334ce0ffd6f18421e1c4c2f3d5b4 Apr 23 16:50:12.459799 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:12.459756 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7" 
event={"ID":"6b351b34-680d-4b12-863f-78b94ef19798","Type":"ContainerStarted","Data":"4cc73b9b83c8e40d1bd0517678a5227af5512facf01bc6e3a46ec9bbae2179a9"} Apr 23 16:50:12.459799 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:12.459803 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7" event={"ID":"6b351b34-680d-4b12-863f-78b94ef19798","Type":"ContainerStarted","Data":"6761111d6e90ab8a96498292e7bc3ab7f07ce64c822e230ce7b35062903170a6"} Apr 23 16:50:12.461292 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:12.461259 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" event={"ID":"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c","Type":"ContainerStarted","Data":"54c972c6764c4174935e62d06586d54d73d4c38858257516dde45ef8d95832c1"} Apr 23 16:50:12.461426 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:12.461299 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" event={"ID":"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c","Type":"ContainerStarted","Data":"33b0807a77bd71aa6b8640a6e5a941a814f8334ce0ffd6f18421e1c4c2f3d5b4"} Apr 23 16:50:13.467404 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:13.467368 2571 generic.go:358] "Generic (PLEG): container finished" podID="c8dc3075-18c7-4ecf-ba68-3f9759d5f02c" containerID="54c972c6764c4174935e62d06586d54d73d4c38858257516dde45ef8d95832c1" exitCode=0 Apr 23 16:50:13.467825 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:13.467458 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" event={"ID":"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c","Type":"ContainerDied","Data":"54c972c6764c4174935e62d06586d54d73d4c38858257516dde45ef8d95832c1"} Apr 23 16:50:14.478499 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:14.478457 
2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" event={"ID":"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c","Type":"ContainerStarted","Data":"9452796723b8d14f7f80bdd8f668252c6fbc6dd4809ccc94aa6326cd5f1757a5"} Apr 23 16:50:14.478925 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:14.478508 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" event={"ID":"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c","Type":"ContainerStarted","Data":"f7c72f5ff91167be0193501ce625af11a1ec7fd330e48ff98f646c38d3d339f0"} Apr 23 16:50:14.478925 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:14.478616 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:14.501957 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:14.501887 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" podStartSLOduration=3.501865173 podStartE2EDuration="3.501865173s" podCreationTimestamp="2026-04-23 16:50:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:50:14.498038905 +0000 UTC m=+897.579481478" watchObservedRunningTime="2026-04-23 16:50:14.501865173 +0000 UTC m=+897.583307743" Apr 23 16:50:16.488024 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:16.487987 2571 generic.go:358] "Generic (PLEG): container finished" podID="6b351b34-680d-4b12-863f-78b94ef19798" containerID="4cc73b9b83c8e40d1bd0517678a5227af5512facf01bc6e3a46ec9bbae2179a9" exitCode=0 Apr 23 16:50:16.488429 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:16.488056 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7" event={"ID":"6b351b34-680d-4b12-863f-78b94ef19798","Type":"ContainerDied","Data":"4cc73b9b83c8e40d1bd0517678a5227af5512facf01bc6e3a46ec9bbae2179a9"} Apr 23 16:50:17.449066 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:17.449030 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8sjkd_036b63b0-d570-44cc-b606-bb46f38e6753/console-operator/1.log" Apr 23 16:50:17.451884 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:17.451863 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8sjkd_036b63b0-d570-44cc-b606-bb46f38e6753/console-operator/1.log" Apr 23 16:50:18.497645 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:18.497615 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7" event={"ID":"6b351b34-680d-4b12-863f-78b94ef19798","Type":"ContainerStarted","Data":"28da45cbeebe1dcb351df37861f3e8db563b6832ec486fcc78d4437456cc1938"} Apr 23 16:50:18.518834 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:18.518777 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7" podStartSLOduration=6.057416302 podStartE2EDuration="7.518757899s" podCreationTimestamp="2026-04-23 16:50:11 +0000 UTC" firstStartedPulling="2026-04-23 16:50:16.489297701 +0000 UTC m=+899.570740247" lastFinishedPulling="2026-04-23 16:50:17.950639294 +0000 UTC m=+901.032081844" observedRunningTime="2026-04-23 16:50:18.516045821 +0000 UTC m=+901.597488431" watchObservedRunningTime="2026-04-23 16:50:18.518757899 +0000 UTC m=+901.600200473" Apr 23 16:50:21.791149 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:21.791109 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7" Apr 23 16:50:21.791492 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:21.791162 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7" Apr 23 16:50:21.803941 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:21.803900 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7" Apr 23 16:50:22.097734 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:22.097616 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:22.097891 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:22.097767 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:22.100485 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:22.100462 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:22.512289 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:22.512261 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:50:22.522441 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:22.522416 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7" Apr 23 16:50:28.475817 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.475775 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l"] Apr 23 
16:50:28.518533 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.518494 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l"] Apr 23 16:50:28.518737 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.518712 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:28.521439 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.521416 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-8jnhr\"" Apr 23 16:50:28.521570 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.521560 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 23 16:50:28.646085 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.646041 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96a37355-39be-42da-99f2-891a94f08962-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:28.646276 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.646112 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 
16:50:28.646276 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.646155 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:28.646276 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.646192 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb59n\" (UniqueName: \"kubernetes.io/projected/96a37355-39be-42da-99f2-891a94f08962-kube-api-access-sb59n\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:28.646398 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.646312 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:28.646587 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.646567 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") " 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:28.747824 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.747746 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:28.747824 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.747783 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sb59n\" (UniqueName: \"kubernetes.io/projected/96a37355-39be-42da-99f2-891a94f08962-kube-api-access-sb59n\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:28.747824 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.747817 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:28.748056 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.747888 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") " 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:28.748056 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.747940 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96a37355-39be-42da-99f2-891a94f08962-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:28.748056 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.747979 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:28.748194 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.748170 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:28.748262 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.748244 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") " 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:28.748297 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.748282 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:28.748340 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.748326 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:28.750624 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.750600 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96a37355-39be-42da-99f2-891a94f08962-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:28.764428 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.764404 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb59n\" (UniqueName: \"kubernetes.io/projected/96a37355-39be-42da-99f2-891a94f08962-kube-api-access-sb59n\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") " 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:28.827578 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.827546 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:28.954318 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:28.954294 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l"] Apr 23 16:50:28.955756 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:50:28.955728 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a37355_39be_42da_99f2_891a94f08962.slice/crio-649f1f0fb8a8602f64958198afbc88e8f935b2156fe565d13452f03361d87f16 WatchSource:0}: Error finding container 649f1f0fb8a8602f64958198afbc88e8f935b2156fe565d13452f03361d87f16: Status 404 returned error can't find the container with id 649f1f0fb8a8602f64958198afbc88e8f935b2156fe565d13452f03361d87f16 Apr 23 16:50:29.541342 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:29.541305 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" event={"ID":"96a37355-39be-42da-99f2-891a94f08962","Type":"ContainerStarted","Data":"230b87c4f0f2f0bda73c0c2735864336118228d5220199e4ea0eb8d9290a865a"} Apr 23 16:50:29.541342 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:29.541347 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" event={"ID":"96a37355-39be-42da-99f2-891a94f08962","Type":"ContainerStarted","Data":"649f1f0fb8a8602f64958198afbc88e8f935b2156fe565d13452f03361d87f16"} Apr 23 16:50:30.546751 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:30.546712 2571 generic.go:358] "Generic (PLEG): 
container finished" podID="96a37355-39be-42da-99f2-891a94f08962" containerID="230b87c4f0f2f0bda73c0c2735864336118228d5220199e4ea0eb8d9290a865a" exitCode=0 Apr 23 16:50:30.547135 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:30.546797 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" event={"ID":"96a37355-39be-42da-99f2-891a94f08962","Type":"ContainerDied","Data":"230b87c4f0f2f0bda73c0c2735864336118228d5220199e4ea0eb8d9290a865a"} Apr 23 16:50:31.552618 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:31.552579 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" event={"ID":"96a37355-39be-42da-99f2-891a94f08962","Type":"ContainerStarted","Data":"b0b27b4103b2183b5e6f7d326285ebf33a1490c21894ea502a9f7eec3e4b2924"} Apr 23 16:50:31.553033 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:31.552626 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" event={"ID":"96a37355-39be-42da-99f2-891a94f08962","Type":"ContainerStarted","Data":"69c2d0658c6f227f8709b8bfd262854eddfd000ac72e662536732df9bc5c72b3"} Apr 23 16:50:31.553033 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:31.552741 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:31.575762 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:31.575720 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" podStartSLOduration=3.575707742 podStartE2EDuration="3.575707742s" podCreationTimestamp="2026-04-23 16:50:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-23 16:50:31.573034776 +0000 UTC m=+914.654477348" watchObservedRunningTime="2026-04-23 16:50:31.575707742 +0000 UTC m=+914.657150317" Apr 23 16:50:38.828776 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:38.828733 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:38.828776 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:38.828785 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:38.831593 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:38.831561 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:39.581851 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:39.581818 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:50:44.518584 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:50:44.518558 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:51:00.585107 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:00.585079 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" Apr 23 16:51:01.252434 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.252388 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82"] Apr 23 16:51:01.252828 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.252778 2571 kuberuntime_container.go:864] "Killing container with 
a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" podUID="c8dc3075-18c7-4ecf-ba68-3f9759d5f02c" containerName="main" containerID="cri-o://f7c72f5ff91167be0193501ce625af11a1ec7fd330e48ff98f646c38d3d339f0" gracePeriod=30 Apr 23 16:51:01.252957 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.252839 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" podUID="c8dc3075-18c7-4ecf-ba68-3f9759d5f02c" containerName="tokenizer" containerID="cri-o://9452796723b8d14f7f80bdd8f668252c6fbc6dd4809ccc94aa6326cd5f1757a5" gracePeriod=30 Apr 23 16:51:01.254882 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.254855 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"] Apr 23 16:51:01.255169 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.255147 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7" podUID="6b351b34-680d-4b12-863f-78b94ef19798" containerName="main" containerID="cri-o://28da45cbeebe1dcb351df37861f3e8db563b6832ec486fcc78d4437456cc1938" gracePeriod=30 Apr 23 16:51:01.511689 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.511606 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7" Apr 23 16:51:01.565066 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.565022 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-home\") pod \"6b351b34-680d-4b12-863f-78b94ef19798\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " Apr 23 16:51:01.565066 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.565070 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-dshm\") pod \"6b351b34-680d-4b12-863f-78b94ef19798\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " Apr 23 16:51:01.565315 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.565112 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwxgg\" (UniqueName: \"kubernetes.io/projected/6b351b34-680d-4b12-863f-78b94ef19798-kube-api-access-vwxgg\") pod \"6b351b34-680d-4b12-863f-78b94ef19798\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " Apr 23 16:51:01.565315 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.565135 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b351b34-680d-4b12-863f-78b94ef19798-tls-certs\") pod \"6b351b34-680d-4b12-863f-78b94ef19798\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " Apr 23 16:51:01.565315 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.565160 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-kserve-provision-location\") pod \"6b351b34-680d-4b12-863f-78b94ef19798\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " Apr 23 16:51:01.565315 ip-10-0-136-27 
kubenswrapper[2571]: I0423 16:51:01.565210 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-model-cache\") pod \"6b351b34-680d-4b12-863f-78b94ef19798\" (UID: \"6b351b34-680d-4b12-863f-78b94ef19798\") " Apr 23 16:51:01.565315 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.565283 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-home" (OuterVolumeSpecName: "home") pod "6b351b34-680d-4b12-863f-78b94ef19798" (UID: "6b351b34-680d-4b12-863f-78b94ef19798"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:01.565631 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.565609 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-home\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:51:01.565785 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.565733 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-model-cache" (OuterVolumeSpecName: "model-cache") pod "6b351b34-680d-4b12-863f-78b94ef19798" (UID: "6b351b34-680d-4b12-863f-78b94ef19798"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:01.567639 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.567606 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b351b34-680d-4b12-863f-78b94ef19798-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6b351b34-680d-4b12-863f-78b94ef19798" (UID: "6b351b34-680d-4b12-863f-78b94ef19798"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:51:01.567933 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.567912 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-dshm" (OuterVolumeSpecName: "dshm") pod "6b351b34-680d-4b12-863f-78b94ef19798" (UID: "6b351b34-680d-4b12-863f-78b94ef19798"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:01.568021 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.567923 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b351b34-680d-4b12-863f-78b94ef19798-kube-api-access-vwxgg" (OuterVolumeSpecName: "kube-api-access-vwxgg") pod "6b351b34-680d-4b12-863f-78b94ef19798" (UID: "6b351b34-680d-4b12-863f-78b94ef19798"). InnerVolumeSpecName "kube-api-access-vwxgg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:51:01.621472 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.621413 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6b351b34-680d-4b12-863f-78b94ef19798" (UID: "6b351b34-680d-4b12-863f-78b94ef19798"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:01.663216 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.663185 2571 generic.go:358] "Generic (PLEG): container finished" podID="6b351b34-680d-4b12-863f-78b94ef19798" containerID="28da45cbeebe1dcb351df37861f3e8db563b6832ec486fcc78d4437456cc1938" exitCode=0 Apr 23 16:51:01.663407 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.663266 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7" Apr 23 16:51:01.663407 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.663275 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7" event={"ID":"6b351b34-680d-4b12-863f-78b94ef19798","Type":"ContainerDied","Data":"28da45cbeebe1dcb351df37861f3e8db563b6832ec486fcc78d4437456cc1938"} Apr 23 16:51:01.663407 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.663314 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7" event={"ID":"6b351b34-680d-4b12-863f-78b94ef19798","Type":"ContainerDied","Data":"6761111d6e90ab8a96498292e7bc3ab7f07ce64c822e230ce7b35062903170a6"} Apr 23 16:51:01.663407 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.663333 2571 scope.go:117] "RemoveContainer" containerID="28da45cbeebe1dcb351df37861f3e8db563b6832ec486fcc78d4437456cc1938" Apr 23 16:51:01.665382 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.665324 2571 generic.go:358] "Generic (PLEG): container finished" podID="c8dc3075-18c7-4ecf-ba68-3f9759d5f02c" containerID="f7c72f5ff91167be0193501ce625af11a1ec7fd330e48ff98f646c38d3d339f0" exitCode=0 Apr 23 16:51:01.665382 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.665372 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" event={"ID":"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c","Type":"ContainerDied","Data":"f7c72f5ff91167be0193501ce625af11a1ec7fd330e48ff98f646c38d3d339f0"} Apr 23 16:51:01.666102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.666085 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-dshm\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:51:01.666219 ip-10-0-136-27 
kubenswrapper[2571]: I0423 16:51:01.666107 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vwxgg\" (UniqueName: \"kubernetes.io/projected/6b351b34-680d-4b12-863f-78b94ef19798-kube-api-access-vwxgg\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:51:01.666219 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.666121 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b351b34-680d-4b12-863f-78b94ef19798-tls-certs\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:51:01.666219 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.666132 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-kserve-provision-location\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:51:01.666219 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.666140 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b351b34-680d-4b12-863f-78b94ef19798-model-cache\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:51:01.673557 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.673542 2571 scope.go:117] "RemoveContainer" containerID="4cc73b9b83c8e40d1bd0517678a5227af5512facf01bc6e3a46ec9bbae2179a9" Apr 23 16:51:01.684050 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.684018 2571 scope.go:117] "RemoveContainer" containerID="28da45cbeebe1dcb351df37861f3e8db563b6832ec486fcc78d4437456cc1938" Apr 23 16:51:01.684340 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:51:01.684321 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28da45cbeebe1dcb351df37861f3e8db563b6832ec486fcc78d4437456cc1938\": container with ID starting with 28da45cbeebe1dcb351df37861f3e8db563b6832ec486fcc78d4437456cc1938 not found: 
ID does not exist" containerID="28da45cbeebe1dcb351df37861f3e8db563b6832ec486fcc78d4437456cc1938" Apr 23 16:51:01.684387 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.684351 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28da45cbeebe1dcb351df37861f3e8db563b6832ec486fcc78d4437456cc1938"} err="failed to get container status \"28da45cbeebe1dcb351df37861f3e8db563b6832ec486fcc78d4437456cc1938\": rpc error: code = NotFound desc = could not find container \"28da45cbeebe1dcb351df37861f3e8db563b6832ec486fcc78d4437456cc1938\": container with ID starting with 28da45cbeebe1dcb351df37861f3e8db563b6832ec486fcc78d4437456cc1938 not found: ID does not exist" Apr 23 16:51:01.684387 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.684372 2571 scope.go:117] "RemoveContainer" containerID="4cc73b9b83c8e40d1bd0517678a5227af5512facf01bc6e3a46ec9bbae2179a9" Apr 23 16:51:01.684652 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:51:01.684617 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cc73b9b83c8e40d1bd0517678a5227af5512facf01bc6e3a46ec9bbae2179a9\": container with ID starting with 4cc73b9b83c8e40d1bd0517678a5227af5512facf01bc6e3a46ec9bbae2179a9 not found: ID does not exist" containerID="4cc73b9b83c8e40d1bd0517678a5227af5512facf01bc6e3a46ec9bbae2179a9" Apr 23 16:51:01.684746 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.684650 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cc73b9b83c8e40d1bd0517678a5227af5512facf01bc6e3a46ec9bbae2179a9"} err="failed to get container status \"4cc73b9b83c8e40d1bd0517678a5227af5512facf01bc6e3a46ec9bbae2179a9\": rpc error: code = NotFound desc = could not find container \"4cc73b9b83c8e40d1bd0517678a5227af5512facf01bc6e3a46ec9bbae2179a9\": container with ID starting with 4cc73b9b83c8e40d1bd0517678a5227af5512facf01bc6e3a46ec9bbae2179a9 not found: ID does not exist" 
Apr 23 16:51:01.688328 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.688301 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"] Apr 23 16:51:01.691787 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:01.691755 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67984fd5c9-klff7"] Apr 23 16:51:02.497237 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.497213 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:51:02.574435 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.574398 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tokenizer-cache\") pod \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " Apr 23 16:51:02.574435 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.574438 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tls-certs\") pod \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " Apr 23 16:51:02.574681 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.574459 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnjx4\" (UniqueName: \"kubernetes.io/projected/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-kube-api-access-bnjx4\") pod \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " Apr 23 16:51:02.574681 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.574535 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-kserve-provision-location\") pod \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " Apr 23 16:51:02.574681 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.574572 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tokenizer-uds\") pod \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " Apr 23 16:51:02.574681 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.574629 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tokenizer-tmp\") pod \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\" (UID: \"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c\") " Apr 23 16:51:02.574925 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.574761 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "c8dc3075-18c7-4ecf-ba68-3f9759d5f02c" (UID: "c8dc3075-18c7-4ecf-ba68-3f9759d5f02c"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:02.574925 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.574885 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c8dc3075-18c7-4ecf-ba68-3f9759d5f02c" (UID: "c8dc3075-18c7-4ecf-ba68-3f9759d5f02c"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:02.574925 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.574913 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tokenizer-cache\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:51:02.575028 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.574990 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "c8dc3075-18c7-4ecf-ba68-3f9759d5f02c" (UID: "c8dc3075-18c7-4ecf-ba68-3f9759d5f02c"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:02.575274 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.575255 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c8dc3075-18c7-4ecf-ba68-3f9759d5f02c" (UID: "c8dc3075-18c7-4ecf-ba68-3f9759d5f02c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:02.577203 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.577183 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c8dc3075-18c7-4ecf-ba68-3f9759d5f02c" (UID: "c8dc3075-18c7-4ecf-ba68-3f9759d5f02c"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:51:02.577287 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.577199 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-kube-api-access-bnjx4" (OuterVolumeSpecName: "kube-api-access-bnjx4") pod "c8dc3075-18c7-4ecf-ba68-3f9759d5f02c" (UID: "c8dc3075-18c7-4ecf-ba68-3f9759d5f02c"). InnerVolumeSpecName "kube-api-access-bnjx4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:51:02.670584 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.670554 2571 generic.go:358] "Generic (PLEG): container finished" podID="c8dc3075-18c7-4ecf-ba68-3f9759d5f02c" containerID="9452796723b8d14f7f80bdd8f668252c6fbc6dd4809ccc94aa6326cd5f1757a5" exitCode=0 Apr 23 16:51:02.671011 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.670637 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" Apr 23 16:51:02.671011 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.670639 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" event={"ID":"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c","Type":"ContainerDied","Data":"9452796723b8d14f7f80bdd8f668252c6fbc6dd4809ccc94aa6326cd5f1757a5"} Apr 23 16:51:02.671011 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.670758 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82" event={"ID":"c8dc3075-18c7-4ecf-ba68-3f9759d5f02c","Type":"ContainerDied","Data":"33b0807a77bd71aa6b8640a6e5a941a814f8334ce0ffd6f18421e1c4c2f3d5b4"} Apr 23 16:51:02.671011 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.670786 2571 scope.go:117] "RemoveContainer" 
containerID="9452796723b8d14f7f80bdd8f668252c6fbc6dd4809ccc94aa6326cd5f1757a5" Apr 23 16:51:02.675492 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.675468 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tls-certs\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:51:02.675612 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.675497 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bnjx4\" (UniqueName: \"kubernetes.io/projected/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-kube-api-access-bnjx4\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:51:02.675612 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.675513 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-kserve-provision-location\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:51:02.675612 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.675537 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tokenizer-uds\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:51:02.675612 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.675548 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c-tokenizer-tmp\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:51:02.679457 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.679439 2571 scope.go:117] "RemoveContainer" containerID="f7c72f5ff91167be0193501ce625af11a1ec7fd330e48ff98f646c38d3d339f0" Apr 23 16:51:02.689542 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.687557 2571 scope.go:117] "RemoveContainer" 
containerID="54c972c6764c4174935e62d06586d54d73d4c38858257516dde45ef8d95832c1" Apr 23 16:51:02.692492 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.692472 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82"] Apr 23 16:51:02.698380 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.698349 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-54c755f4bz82"] Apr 23 16:51:02.698538 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.698517 2571 scope.go:117] "RemoveContainer" containerID="9452796723b8d14f7f80bdd8f668252c6fbc6dd4809ccc94aa6326cd5f1757a5" Apr 23 16:51:02.698811 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:51:02.698791 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9452796723b8d14f7f80bdd8f668252c6fbc6dd4809ccc94aa6326cd5f1757a5\": container with ID starting with 9452796723b8d14f7f80bdd8f668252c6fbc6dd4809ccc94aa6326cd5f1757a5 not found: ID does not exist" containerID="9452796723b8d14f7f80bdd8f668252c6fbc6dd4809ccc94aa6326cd5f1757a5" Apr 23 16:51:02.698898 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.698819 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9452796723b8d14f7f80bdd8f668252c6fbc6dd4809ccc94aa6326cd5f1757a5"} err="failed to get container status \"9452796723b8d14f7f80bdd8f668252c6fbc6dd4809ccc94aa6326cd5f1757a5\": rpc error: code = NotFound desc = could not find container \"9452796723b8d14f7f80bdd8f668252c6fbc6dd4809ccc94aa6326cd5f1757a5\": container with ID starting with 9452796723b8d14f7f80bdd8f668252c6fbc6dd4809ccc94aa6326cd5f1757a5 not found: ID does not exist" Apr 23 16:51:02.698898 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.698836 2571 scope.go:117] "RemoveContainer" 
containerID="f7c72f5ff91167be0193501ce625af11a1ec7fd330e48ff98f646c38d3d339f0" Apr 23 16:51:02.699069 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:51:02.699052 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7c72f5ff91167be0193501ce625af11a1ec7fd330e48ff98f646c38d3d339f0\": container with ID starting with f7c72f5ff91167be0193501ce625af11a1ec7fd330e48ff98f646c38d3d339f0 not found: ID does not exist" containerID="f7c72f5ff91167be0193501ce625af11a1ec7fd330e48ff98f646c38d3d339f0" Apr 23 16:51:02.699111 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.699077 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7c72f5ff91167be0193501ce625af11a1ec7fd330e48ff98f646c38d3d339f0"} err="failed to get container status \"f7c72f5ff91167be0193501ce625af11a1ec7fd330e48ff98f646c38d3d339f0\": rpc error: code = NotFound desc = could not find container \"f7c72f5ff91167be0193501ce625af11a1ec7fd330e48ff98f646c38d3d339f0\": container with ID starting with f7c72f5ff91167be0193501ce625af11a1ec7fd330e48ff98f646c38d3d339f0 not found: ID does not exist" Apr 23 16:51:02.699111 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.699095 2571 scope.go:117] "RemoveContainer" containerID="54c972c6764c4174935e62d06586d54d73d4c38858257516dde45ef8d95832c1" Apr 23 16:51:02.699304 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:51:02.699288 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54c972c6764c4174935e62d06586d54d73d4c38858257516dde45ef8d95832c1\": container with ID starting with 54c972c6764c4174935e62d06586d54d73d4c38858257516dde45ef8d95832c1 not found: ID does not exist" containerID="54c972c6764c4174935e62d06586d54d73d4c38858257516dde45ef8d95832c1" Apr 23 16:51:02.699342 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:02.699308 2571 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"54c972c6764c4174935e62d06586d54d73d4c38858257516dde45ef8d95832c1"} err="failed to get container status \"54c972c6764c4174935e62d06586d54d73d4c38858257516dde45ef8d95832c1\": rpc error: code = NotFound desc = could not find container \"54c972c6764c4174935e62d06586d54d73d4c38858257516dde45ef8d95832c1\": container with ID starting with 54c972c6764c4174935e62d06586d54d73d4c38858257516dde45ef8d95832c1 not found: ID does not exist" Apr 23 16:51:03.512584 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:03.512547 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b351b34-680d-4b12-863f-78b94ef19798" path="/var/lib/kubelet/pods/6b351b34-680d-4b12-863f-78b94ef19798/volumes" Apr 23 16:51:03.513043 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:03.513027 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8dc3075-18c7-4ecf-ba68-3f9759d5f02c" path="/var/lib/kubelet/pods/c8dc3075-18c7-4ecf-ba68-3f9759d5f02c/volumes" Apr 23 16:51:08.967417 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:08.967364 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2"] Apr 23 16:51:08.967968 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:08.967949 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b351b34-680d-4b12-863f-78b94ef19798" containerName="storage-initializer" Apr 23 16:51:08.968025 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:08.967976 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b351b34-680d-4b12-863f-78b94ef19798" containerName="storage-initializer" Apr 23 16:51:08.968025 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:08.968000 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8dc3075-18c7-4ecf-ba68-3f9759d5f02c" containerName="tokenizer" Apr 23 16:51:08.968025 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:08.968006 2571 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c8dc3075-18c7-4ecf-ba68-3f9759d5f02c" containerName="tokenizer" Apr 23 16:51:08.968025 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:08.968017 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b351b34-680d-4b12-863f-78b94ef19798" containerName="main" Apr 23 16:51:08.968025 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:08.968023 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b351b34-680d-4b12-863f-78b94ef19798" containerName="main" Apr 23 16:51:08.968184 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:08.968033 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8dc3075-18c7-4ecf-ba68-3f9759d5f02c" containerName="main" Apr 23 16:51:08.968184 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:08.968038 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8dc3075-18c7-4ecf-ba68-3f9759d5f02c" containerName="main" Apr 23 16:51:08.968184 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:08.968054 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8dc3075-18c7-4ecf-ba68-3f9759d5f02c" containerName="storage-initializer" Apr 23 16:51:08.968184 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:08.968060 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8dc3075-18c7-4ecf-ba68-3f9759d5f02c" containerName="storage-initializer" Apr 23 16:51:08.968184 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:08.968127 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8dc3075-18c7-4ecf-ba68-3f9759d5f02c" containerName="tokenizer" Apr 23 16:51:08.968184 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:08.968134 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b351b34-680d-4b12-863f-78b94ef19798" containerName="main" Apr 23 16:51:08.968184 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:08.968143 2571 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="c8dc3075-18c7-4ecf-ba68-3f9759d5f02c" containerName="main" Apr 23 16:51:08.971021 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:08.971002 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:08.973736 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:08.973686 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 23 16:51:08.982413 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:08.982387 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2"] Apr 23 16:51:09.039653 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.039619 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-home\") pod \"precise-prefix-cache-test-kserve-589449d9f5-26xb2\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:09.039653 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.039655 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-model-cache\") pod \"precise-prefix-cache-test-kserve-589449d9f5-26xb2\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:09.039906 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.039682 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6f2498a0-c2b1-430b-9602-9c054e3fab9f-tls-certs\") pod 
\"precise-prefix-cache-test-kserve-589449d9f5-26xb2\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:09.039906 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.039730 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-589449d9f5-26xb2\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:09.039906 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.039797 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-dshm\") pod \"precise-prefix-cache-test-kserve-589449d9f5-26xb2\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:09.039906 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.039840 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v68fc\" (UniqueName: \"kubernetes.io/projected/6f2498a0-c2b1-430b-9602-9c054e3fab9f-kube-api-access-v68fc\") pod \"precise-prefix-cache-test-kserve-589449d9f5-26xb2\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:09.140971 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.140931 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-home\") pod \"precise-prefix-cache-test-kserve-589449d9f5-26xb2\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:09.140971 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.140977 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-model-cache\") pod \"precise-prefix-cache-test-kserve-589449d9f5-26xb2\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:09.141225 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.140996 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6f2498a0-c2b1-430b-9602-9c054e3fab9f-tls-certs\") pod \"precise-prefix-cache-test-kserve-589449d9f5-26xb2\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:09.141225 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.141016 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-589449d9f5-26xb2\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:09.141225 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.141052 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-dshm\") pod \"precise-prefix-cache-test-kserve-589449d9f5-26xb2\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:09.141225 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.141082 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v68fc\" (UniqueName: \"kubernetes.io/projected/6f2498a0-c2b1-430b-9602-9c054e3fab9f-kube-api-access-v68fc\") pod \"precise-prefix-cache-test-kserve-589449d9f5-26xb2\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:09.141425 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.141314 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-home\") pod \"precise-prefix-cache-test-kserve-589449d9f5-26xb2\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:09.141425 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.141334 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-589449d9f5-26xb2\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:09.141425 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.141398 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-model-cache\") pod \"precise-prefix-cache-test-kserve-589449d9f5-26xb2\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:09.143543 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.143518 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-dshm\") pod 
\"precise-prefix-cache-test-kserve-589449d9f5-26xb2\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:09.143794 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.143777 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6f2498a0-c2b1-430b-9602-9c054e3fab9f-tls-certs\") pod \"precise-prefix-cache-test-kserve-589449d9f5-26xb2\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:09.149484 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.149462 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v68fc\" (UniqueName: \"kubernetes.io/projected/6f2498a0-c2b1-430b-9602-9c054e3fab9f-kube-api-access-v68fc\") pod \"precise-prefix-cache-test-kserve-589449d9f5-26xb2\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:09.282591 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.282513 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:09.427125 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.427097 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2"] Apr 23 16:51:09.429281 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:51:09.429247 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f2498a0_c2b1_430b_9602_9c054e3fab9f.slice/crio-b45bb940db25f830cdec69635d499300b7f02fe0a9c576ab321d202f2e8d64d4 WatchSource:0}: Error finding container b45bb940db25f830cdec69635d499300b7f02fe0a9c576ab321d202f2e8d64d4: Status 404 returned error can't find the container with id b45bb940db25f830cdec69635d499300b7f02fe0a9c576ab321d202f2e8d64d4 Apr 23 16:51:09.699833 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.699795 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" event={"ID":"6f2498a0-c2b1-430b-9602-9c054e3fab9f","Type":"ContainerStarted","Data":"6e7e100cb4b063aaf71d5c8b168b5c0ee9661e93774cadf6c682713ad7e7052c"} Apr 23 16:51:09.699833 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:09.699832 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" event={"ID":"6f2498a0-c2b1-430b-9602-9c054e3fab9f","Type":"ContainerStarted","Data":"b45bb940db25f830cdec69635d499300b7f02fe0a9c576ab321d202f2e8d64d4"} Apr 23 16:51:13.717536 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:13.717499 2571 generic.go:358] "Generic (PLEG): container finished" podID="6f2498a0-c2b1-430b-9602-9c054e3fab9f" containerID="6e7e100cb4b063aaf71d5c8b168b5c0ee9661e93774cadf6c682713ad7e7052c" exitCode=0 Apr 23 16:51:13.717896 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:13.717570 2571 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" event={"ID":"6f2498a0-c2b1-430b-9602-9c054e3fab9f","Type":"ContainerDied","Data":"6e7e100cb4b063aaf71d5c8b168b5c0ee9661e93774cadf6c682713ad7e7052c"} Apr 23 16:51:14.722891 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:14.722859 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" event={"ID":"6f2498a0-c2b1-430b-9602-9c054e3fab9f","Type":"ContainerStarted","Data":"38e9aa4057d9f379d571b22661193b90f9e28b55228181d44e19b2cb7aa9dd3b"} Apr 23 16:51:14.743761 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:14.743714 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" podStartSLOduration=6.743679796 podStartE2EDuration="6.743679796s" podCreationTimestamp="2026-04-23 16:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:51:14.742219702 +0000 UTC m=+957.823662263" watchObservedRunningTime="2026-04-23 16:51:14.743679796 +0000 UTC m=+957.825122365" Apr 23 16:51:19.283038 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:19.283000 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:19.283038 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:19.283042 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:19.295632 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:19.295612 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:19.751107 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:19.751076 2571 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:51.191141 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.191107 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2"] Apr 23 16:51:51.191724 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.191479 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" podUID="6f2498a0-c2b1-430b-9602-9c054e3fab9f" containerName="main" containerID="cri-o://38e9aa4057d9f379d571b22661193b90f9e28b55228181d44e19b2cb7aa9dd3b" gracePeriod=30 Apr 23 16:51:51.440225 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.440204 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:51.522674 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.522645 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-model-cache\") pod \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " Apr 23 16:51:51.522839 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.522752 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-home\") pod \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " Apr 23 16:51:51.522839 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.522798 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-dshm\") pod 
\"6f2498a0-c2b1-430b-9602-9c054e3fab9f\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " Apr 23 16:51:51.522915 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.522857 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-kserve-provision-location\") pod \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " Apr 23 16:51:51.522915 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.522885 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v68fc\" (UniqueName: \"kubernetes.io/projected/6f2498a0-c2b1-430b-9602-9c054e3fab9f-kube-api-access-v68fc\") pod \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " Apr 23 16:51:51.522915 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.522881 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-model-cache" (OuterVolumeSpecName: "model-cache") pod "6f2498a0-c2b1-430b-9602-9c054e3fab9f" (UID: "6f2498a0-c2b1-430b-9602-9c054e3fab9f"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:51.523074 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.522938 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6f2498a0-c2b1-430b-9602-9c054e3fab9f-tls-certs\") pod \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\" (UID: \"6f2498a0-c2b1-430b-9602-9c054e3fab9f\") " Apr 23 16:51:51.523074 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.523006 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-home" (OuterVolumeSpecName: "home") pod "6f2498a0-c2b1-430b-9602-9c054e3fab9f" (UID: "6f2498a0-c2b1-430b-9602-9c054e3fab9f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:51.523318 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.523299 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-model-cache\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:51:51.523385 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.523320 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-home\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:51:51.525092 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.525062 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f2498a0-c2b1-430b-9602-9c054e3fab9f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6f2498a0-c2b1-430b-9602-9c054e3fab9f" (UID: "6f2498a0-c2b1-430b-9602-9c054e3fab9f"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:51:51.525218 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.525102 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-dshm" (OuterVolumeSpecName: "dshm") pod "6f2498a0-c2b1-430b-9602-9c054e3fab9f" (UID: "6f2498a0-c2b1-430b-9602-9c054e3fab9f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:51.525275 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.525217 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f2498a0-c2b1-430b-9602-9c054e3fab9f-kube-api-access-v68fc" (OuterVolumeSpecName: "kube-api-access-v68fc") pod "6f2498a0-c2b1-430b-9602-9c054e3fab9f" (UID: "6f2498a0-c2b1-430b-9602-9c054e3fab9f"). InnerVolumeSpecName "kube-api-access-v68fc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:51:51.576797 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.576768 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6f2498a0-c2b1-430b-9602-9c054e3fab9f" (UID: "6f2498a0-c2b1-430b-9602-9c054e3fab9f"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:51.624837 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.624805 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-dshm\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:51:51.624837 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.624829 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f2498a0-c2b1-430b-9602-9c054e3fab9f-kserve-provision-location\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:51:51.624837 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.624840 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v68fc\" (UniqueName: \"kubernetes.io/projected/6f2498a0-c2b1-430b-9602-9c054e3fab9f-kube-api-access-v68fc\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:51:51.625135 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.624851 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6f2498a0-c2b1-430b-9602-9c054e3fab9f-tls-certs\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:51:51.852944 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.852855 2571 generic.go:358] "Generic (PLEG): container finished" podID="6f2498a0-c2b1-430b-9602-9c054e3fab9f" containerID="38e9aa4057d9f379d571b22661193b90f9e28b55228181d44e19b2cb7aa9dd3b" exitCode=0 Apr 23 16:51:51.852944 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.852901 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" event={"ID":"6f2498a0-c2b1-430b-9602-9c054e3fab9f","Type":"ContainerDied","Data":"38e9aa4057d9f379d571b22661193b90f9e28b55228181d44e19b2cb7aa9dd3b"} Apr 23 16:51:51.852944 ip-10-0-136-27 kubenswrapper[2571]: I0423 
16:51:51.852921 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" Apr 23 16:51:51.852944 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.852932 2571 scope.go:117] "RemoveContainer" containerID="38e9aa4057d9f379d571b22661193b90f9e28b55228181d44e19b2cb7aa9dd3b" Apr 23 16:51:51.853210 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.852922 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2" event={"ID":"6f2498a0-c2b1-430b-9602-9c054e3fab9f","Type":"ContainerDied","Data":"b45bb940db25f830cdec69635d499300b7f02fe0a9c576ab321d202f2e8d64d4"} Apr 23 16:51:51.861796 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.861780 2571 scope.go:117] "RemoveContainer" containerID="6e7e100cb4b063aaf71d5c8b168b5c0ee9661e93774cadf6c682713ad7e7052c" Apr 23 16:51:51.874060 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.874036 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2"] Apr 23 16:51:51.875925 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.875780 2571 scope.go:117] "RemoveContainer" containerID="38e9aa4057d9f379d571b22661193b90f9e28b55228181d44e19b2cb7aa9dd3b" Apr 23 16:51:51.876107 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:51:51.876087 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38e9aa4057d9f379d571b22661193b90f9e28b55228181d44e19b2cb7aa9dd3b\": container with ID starting with 38e9aa4057d9f379d571b22661193b90f9e28b55228181d44e19b2cb7aa9dd3b not found: ID does not exist" containerID="38e9aa4057d9f379d571b22661193b90f9e28b55228181d44e19b2cb7aa9dd3b" Apr 23 16:51:51.876162 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.876116 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"38e9aa4057d9f379d571b22661193b90f9e28b55228181d44e19b2cb7aa9dd3b"} err="failed to get container status \"38e9aa4057d9f379d571b22661193b90f9e28b55228181d44e19b2cb7aa9dd3b\": rpc error: code = NotFound desc = could not find container \"38e9aa4057d9f379d571b22661193b90f9e28b55228181d44e19b2cb7aa9dd3b\": container with ID starting with 38e9aa4057d9f379d571b22661193b90f9e28b55228181d44e19b2cb7aa9dd3b not found: ID does not exist" Apr 23 16:51:51.876162 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.876134 2571 scope.go:117] "RemoveContainer" containerID="6e7e100cb4b063aaf71d5c8b168b5c0ee9661e93774cadf6c682713ad7e7052c" Apr 23 16:51:51.876397 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:51:51.876380 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e7e100cb4b063aaf71d5c8b168b5c0ee9661e93774cadf6c682713ad7e7052c\": container with ID starting with 6e7e100cb4b063aaf71d5c8b168b5c0ee9661e93774cadf6c682713ad7e7052c not found: ID does not exist" containerID="6e7e100cb4b063aaf71d5c8b168b5c0ee9661e93774cadf6c682713ad7e7052c" Apr 23 16:51:51.876473 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.876400 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e7e100cb4b063aaf71d5c8b168b5c0ee9661e93774cadf6c682713ad7e7052c"} err="failed to get container status \"6e7e100cb4b063aaf71d5c8b168b5c0ee9661e93774cadf6c682713ad7e7052c\": rpc error: code = NotFound desc = could not find container \"6e7e100cb4b063aaf71d5c8b168b5c0ee9661e93774cadf6c682713ad7e7052c\": container with ID starting with 6e7e100cb4b063aaf71d5c8b168b5c0ee9661e93774cadf6c682713ad7e7052c not found: ID does not exist" Apr 23 16:51:51.877861 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:51.877840 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-589449d9f5-26xb2"] Apr 23 16:51:53.516550 
ip-10-0-136-27 kubenswrapper[2571]: I0423 16:51:53.516515 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f2498a0-c2b1-430b-9602-9c054e3fab9f" path="/var/lib/kubelet/pods/6f2498a0-c2b1-430b-9602-9c054e3fab9f/volumes" Apr 23 16:53:28.141088 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:28.141017 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l"] Apr 23 16:53:28.141594 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:28.141393 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" podUID="96a37355-39be-42da-99f2-891a94f08962" containerName="main" containerID="cri-o://69c2d0658c6f227f8709b8bfd262854eddfd000ac72e662536732df9bc5c72b3" gracePeriod=30 Apr 23 16:53:28.141594 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:28.141437 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" podUID="96a37355-39be-42da-99f2-891a94f08962" containerName="tokenizer" containerID="cri-o://b0b27b4103b2183b5e6f7d326285ebf33a1490c21894ea502a9f7eec3e4b2924" gracePeriod=30 Apr 23 16:53:29.190738 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.190690 2571 generic.go:358] "Generic (PLEG): container finished" podID="96a37355-39be-42da-99f2-891a94f08962" containerID="b0b27b4103b2183b5e6f7d326285ebf33a1490c21894ea502a9f7eec3e4b2924" exitCode=0 Apr 23 16:53:29.190738 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.190726 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" event={"ID":"96a37355-39be-42da-99f2-891a94f08962","Type":"ContainerDied","Data":"b0b27b4103b2183b5e6f7d326285ebf33a1490c21894ea502a9f7eec3e4b2924"} Apr 23 16:53:29.191111 ip-10-0-136-27 
kubenswrapper[2571]: I0423 16:53:29.190765 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" event={"ID":"96a37355-39be-42da-99f2-891a94f08962","Type":"ContainerDied","Data":"69c2d0658c6f227f8709b8bfd262854eddfd000ac72e662536732df9bc5c72b3"}
Apr 23 16:53:29.191111 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.190736 2571 generic.go:358] "Generic (PLEG): container finished" podID="96a37355-39be-42da-99f2-891a94f08962" containerID="69c2d0658c6f227f8709b8bfd262854eddfd000ac72e662536732df9bc5c72b3" exitCode=0
Apr 23 16:53:29.294150 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.294124 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l"
Apr 23 16:53:29.416621 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.416536 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-tokenizer-cache\") pod \"96a37355-39be-42da-99f2-891a94f08962\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") "
Apr 23 16:53:29.416621 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.416623 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-tokenizer-uds\") pod \"96a37355-39be-42da-99f2-891a94f08962\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") "
Apr 23 16:53:29.416890 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.416652 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-tokenizer-tmp\") pod \"96a37355-39be-42da-99f2-891a94f08962\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") "
Apr 23 16:53:29.416890 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.416679 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96a37355-39be-42da-99f2-891a94f08962-tls-certs\") pod \"96a37355-39be-42da-99f2-891a94f08962\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") "
Apr 23 16:53:29.416890 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.416721 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb59n\" (UniqueName: \"kubernetes.io/projected/96a37355-39be-42da-99f2-891a94f08962-kube-api-access-sb59n\") pod \"96a37355-39be-42da-99f2-891a94f08962\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") "
Apr 23 16:53:29.416890 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.416763 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-kserve-provision-location\") pod \"96a37355-39be-42da-99f2-891a94f08962\" (UID: \"96a37355-39be-42da-99f2-891a94f08962\") "
Apr 23 16:53:29.417073 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.416926 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "96a37355-39be-42da-99f2-891a94f08962" (UID: "96a37355-39be-42da-99f2-891a94f08962"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:53:29.417073 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.416923 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "96a37355-39be-42da-99f2-891a94f08962" (UID: "96a37355-39be-42da-99f2-891a94f08962"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:53:29.417073 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.417014 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "96a37355-39be-42da-99f2-891a94f08962" (UID: "96a37355-39be-42da-99f2-891a94f08962"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:53:29.417073 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.417052 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-tokenizer-cache\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\""
Apr 23 16:53:29.417218 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.417089 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-tokenizer-uds\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\""
Apr 23 16:53:29.417532 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.417511 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "96a37355-39be-42da-99f2-891a94f08962" (UID: "96a37355-39be-42da-99f2-891a94f08962"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:53:29.418989 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.418965 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96a37355-39be-42da-99f2-891a94f08962-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "96a37355-39be-42da-99f2-891a94f08962" (UID: "96a37355-39be-42da-99f2-891a94f08962"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:53:29.419102 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.419016 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96a37355-39be-42da-99f2-891a94f08962-kube-api-access-sb59n" (OuterVolumeSpecName: "kube-api-access-sb59n") pod "96a37355-39be-42da-99f2-891a94f08962" (UID: "96a37355-39be-42da-99f2-891a94f08962"). InnerVolumeSpecName "kube-api-access-sb59n". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:53:29.517776 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.517743 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-kserve-provision-location\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\""
Apr 23 16:53:29.517776 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.517771 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/96a37355-39be-42da-99f2-891a94f08962-tokenizer-tmp\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\""
Apr 23 16:53:29.518036 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.517787 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96a37355-39be-42da-99f2-891a94f08962-tls-certs\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\""
Apr 23 16:53:29.518036 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:29.517798 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sb59n\" (UniqueName: \"kubernetes.io/projected/96a37355-39be-42da-99f2-891a94f08962-kube-api-access-sb59n\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\""
Apr 23 16:53:30.196349 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:30.196311 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l" event={"ID":"96a37355-39be-42da-99f2-891a94f08962","Type":"ContainerDied","Data":"649f1f0fb8a8602f64958198afbc88e8f935b2156fe565d13452f03361d87f16"}
Apr 23 16:53:30.196349 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:30.196342 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l"
Apr 23 16:53:30.196873 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:30.196364 2571 scope.go:117] "RemoveContainer" containerID="b0b27b4103b2183b5e6f7d326285ebf33a1490c21894ea502a9f7eec3e4b2924"
Apr 23 16:53:30.204876 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:30.204860 2571 scope.go:117] "RemoveContainer" containerID="69c2d0658c6f227f8709b8bfd262854eddfd000ac72e662536732df9bc5c72b3"
Apr 23 16:53:30.212357 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:30.212332 2571 scope.go:117] "RemoveContainer" containerID="230b87c4f0f2f0bda73c0c2735864336118228d5220199e4ea0eb8d9290a865a"
Apr 23 16:53:30.215641 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:30.215619 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l"]
Apr 23 16:53:30.220313 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:30.220284 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schervn5l"]
Apr 23 16:53:31.511533 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:31.511498 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96a37355-39be-42da-99f2-891a94f08962" path="/var/lib/kubelet/pods/96a37355-39be-42da-99f2-891a94f08962/volumes"
Apr 23 16:53:39.675888 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.675855 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"]
Apr 23 16:53:39.676309 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.676231 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96a37355-39be-42da-99f2-891a94f08962" containerName="storage-initializer"
Apr 23 16:53:39.676309 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.676244 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a37355-39be-42da-99f2-891a94f08962" containerName="storage-initializer"
Apr 23 16:53:39.676309 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.676251 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96a37355-39be-42da-99f2-891a94f08962" containerName="main"
Apr 23 16:53:39.676309 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.676256 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a37355-39be-42da-99f2-891a94f08962" containerName="main"
Apr 23 16:53:39.676309 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.676274 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96a37355-39be-42da-99f2-891a94f08962" containerName="tokenizer"
Apr 23 16:53:39.676309 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.676279 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a37355-39be-42da-99f2-891a94f08962" containerName="tokenizer"
Apr 23 16:53:39.676309 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.676285 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f2498a0-c2b1-430b-9602-9c054e3fab9f" containerName="main"
Apr 23 16:53:39.676309 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.676290 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f2498a0-c2b1-430b-9602-9c054e3fab9f" containerName="main"
Apr 23 16:53:39.676309 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.676298 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f2498a0-c2b1-430b-9602-9c054e3fab9f" containerName="storage-initializer"
Apr 23 16:53:39.676309 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.676303 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f2498a0-c2b1-430b-9602-9c054e3fab9f" containerName="storage-initializer"
Apr 23 16:53:39.676598 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.676354 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="96a37355-39be-42da-99f2-891a94f08962" containerName="tokenizer"
Apr 23 16:53:39.676598 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.676363 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="96a37355-39be-42da-99f2-891a94f08962" containerName="main"
Apr 23 16:53:39.676598 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.676372 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f2498a0-c2b1-430b-9602-9c054e3fab9f" containerName="main"
Apr 23 16:53:39.679767 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.679749 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:39.682501 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.682477 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\""
Apr 23 16:53:39.682631 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.682551 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-b4jmv\""
Apr 23 16:53:39.683473 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.683458 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-f6zb7\""
Apr 23 16:53:39.691783 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.691765 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"]
Apr 23 16:53:39.808918 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.808888 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:39.809074 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.808933 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53510e23-cfaa-49bc-96e6-107a5be846b0-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:39.809074 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.808975 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:39.809074 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.809012 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44f8g\" (UniqueName: \"kubernetes.io/projected/53510e23-cfaa-49bc-96e6-107a5be846b0-kube-api-access-44f8g\") pod \"custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:39.809074 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.809049 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:39.809207 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.809095 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:39.909996 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.909963 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:39.909996 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.910012 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44f8g\" (UniqueName: \"kubernetes.io/projected/53510e23-cfaa-49bc-96e6-107a5be846b0-kube-api-access-44f8g\") pod \"custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:39.910230 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.910189 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:39.910281 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.910248 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:39.910343 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.910330 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:39.910386 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.910378 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53510e23-cfaa-49bc-96e6-107a5be846b0-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:39.910423 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.910403 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:39.910611 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.910594 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:39.910646 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.910635 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:39.910679 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.910643 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:39.912943 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.912924 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53510e23-cfaa-49bc-96e6-107a5be846b0-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:39.918009 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.917987 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44f8g\" (UniqueName: \"kubernetes.io/projected/53510e23-cfaa-49bc-96e6-107a5be846b0-kube-api-access-44f8g\") pod \"custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:39.988954 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:39.988917 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:40.116518 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:40.116495 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"]
Apr 23 16:53:40.118788 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:53:40.118758 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53510e23_cfaa_49bc_96e6_107a5be846b0.slice/crio-4253be30469f4ecfea6b2ec47d6f0b6b378603ca9d2f001aa38876c4ed227db2 WatchSource:0}: Error finding container 4253be30469f4ecfea6b2ec47d6f0b6b378603ca9d2f001aa38876c4ed227db2: Status 404 returned error can't find the container with id 4253be30469f4ecfea6b2ec47d6f0b6b378603ca9d2f001aa38876c4ed227db2
Apr 23 16:53:40.121039 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:40.121014 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 16:53:40.232996 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:40.232959 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l" event={"ID":"53510e23-cfaa-49bc-96e6-107a5be846b0","Type":"ContainerStarted","Data":"945147a9eaab97479a226e6aec607658fab33ff2bed413e7f684dab920623bfe"}
Apr 23 16:53:40.232996 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:40.232996 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l" event={"ID":"53510e23-cfaa-49bc-96e6-107a5be846b0","Type":"ContainerStarted","Data":"4253be30469f4ecfea6b2ec47d6f0b6b378603ca9d2f001aa38876c4ed227db2"}
Apr 23 16:53:41.237619 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:41.237583 2571 generic.go:358] "Generic (PLEG): container finished" podID="53510e23-cfaa-49bc-96e6-107a5be846b0" containerID="945147a9eaab97479a226e6aec607658fab33ff2bed413e7f684dab920623bfe" exitCode=0
Apr 23 16:53:41.238086 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:41.237670 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l" event={"ID":"53510e23-cfaa-49bc-96e6-107a5be846b0","Type":"ContainerDied","Data":"945147a9eaab97479a226e6aec607658fab33ff2bed413e7f684dab920623bfe"}
Apr 23 16:53:42.244615 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:42.244582 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l" event={"ID":"53510e23-cfaa-49bc-96e6-107a5be846b0","Type":"ContainerStarted","Data":"b458a2ef0e8fab575ac3dfd7fd7facb865871834edbb1a8f88d1bd052da5e045"}
Apr 23 16:53:42.244615 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:42.244618 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l" event={"ID":"53510e23-cfaa-49bc-96e6-107a5be846b0","Type":"ContainerStarted","Data":"1746738c9aada7a6951b2346d02bcd512d1a6813166718c0117849274e229511"}
Apr 23 16:53:42.245072 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:42.244821 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:42.267277 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:42.267230 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l" podStartSLOduration=3.267216771 podStartE2EDuration="3.267216771s" podCreationTimestamp="2026-04-23 16:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:53:42.26518124 +0000 UTC m=+1105.346623810" watchObservedRunningTime="2026-04-23 16:53:42.267216771 +0000 UTC m=+1105.348659340"
Apr 23 16:53:49.989324 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:49.989287 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:49.989324 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:49.989329 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:49.992040 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:49.992018 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:53:50.276175 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:53:50.276099 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:54:11.285914 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:11.285882 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"
Apr 23 16:54:27.900191 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:27.900155 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"]
Apr 23 16:54:27.905650 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:27.905631 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"
Apr 23 16:54:27.908158 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:27.908140 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-bqdfs\""
Apr 23 16:54:27.908158 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:27.908148 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 23 16:54:27.919133 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:27.919081 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"]
Apr 23 16:54:27.959050 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:27.959017 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"
Apr 23 16:54:27.959166 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:27.959053 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"
Apr 23 16:54:27.959166 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:27.959083 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"
Apr 23 16:54:27.959166 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:27.959153 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"
Apr 23 16:54:27.959265 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:27.959180 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9efd275f-81a6-43c1-977a-544fe88021a9-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"
Apr 23 16:54:27.959265 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:27.959210 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpqwr\" (UniqueName: \"kubernetes.io/projected/9efd275f-81a6-43c1-977a-544fe88021a9-kube-api-access-kpqwr\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"
Apr 23 16:54:28.059717 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:28.059668 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"
Apr 23 16:54:28.059879 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:28.059732 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9efd275f-81a6-43c1-977a-544fe88021a9-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"
Apr 23 16:54:28.059879 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:28.059773 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpqwr\" (UniqueName: \"kubernetes.io/projected/9efd275f-81a6-43c1-977a-544fe88021a9-kube-api-access-kpqwr\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"
Apr 23 16:54:28.059879 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:28.059841 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"
Apr 23 16:54:28.059879 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:28.059873 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"
Apr 23 16:54:28.060115 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:28.059919 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"
Apr 23 16:54:28.060170 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:28.060141 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"
Apr 23 16:54:28.060215 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:28.060199 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"
Apr 23 16:54:28.060277 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:28.060223 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"
Apr 23 16:54:28.060326 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:28.060280 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"
Apr 23 16:54:28.062351 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:28.062334 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9efd275f-81a6-43c1-977a-544fe88021a9-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"
Apr 23 16:54:28.070488 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:28.070464 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpqwr\" (UniqueName: \"kubernetes.io/projected/9efd275f-81a6-43c1-977a-544fe88021a9-kube-api-access-kpqwr\") pod \"stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"
Apr 23 16:54:28.220339 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:28.220299 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"
Apr 23 16:54:28.348941 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:28.348915 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"]
Apr 23 16:54:28.350798 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:54:28.350759 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9efd275f_81a6_43c1_977a_544fe88021a9.slice/crio-e2e2108124503e229389b50862ba4eec20037ef542acada2e6bd119662e19860 WatchSource:0}: Error finding container e2e2108124503e229389b50862ba4eec20037ef542acada2e6bd119662e19860: Status 404 returned error can't find the container with id e2e2108124503e229389b50862ba4eec20037ef542acada2e6bd119662e19860
Apr 23 16:54:28.419731 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:28.419664 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b" event={"ID":"9efd275f-81a6-43c1-977a-544fe88021a9","Type":"ContainerStarted","Data":"e2e2108124503e229389b50862ba4eec20037ef542acada2e6bd119662e19860"}
Apr 23 16:54:29.424470 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:29.424390 2571 generic.go:358] "Generic (PLEG): container finished" podID="9efd275f-81a6-43c1-977a-544fe88021a9" containerID="13c949d76f36fc2bc4a562b9edf974bbca4c20e3db28febea8f43dcde22cbe91" exitCode=0
Apr 23 16:54:29.424470 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:29.424455 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b" event={"ID":"9efd275f-81a6-43c1-977a-544fe88021a9","Type":"ContainerDied","Data":"13c949d76f36fc2bc4a562b9edf974bbca4c20e3db28febea8f43dcde22cbe91"}
Apr 23 16:54:30.430443 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:30.430403 2571
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b" event={"ID":"9efd275f-81a6-43c1-977a-544fe88021a9","Type":"ContainerStarted","Data":"70857e70ac1985d96b64d79dcf343ce50cab72b0291eaf1228099f70a629eb56"} Apr 23 16:54:30.430443 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:30.430446 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b" event={"ID":"9efd275f-81a6-43c1-977a-544fe88021a9","Type":"ContainerStarted","Data":"3295fee232ca55021445205e69f9051595d48ce33d9c0c7bf39e9ff74a45a7f0"} Apr 23 16:54:30.430933 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:30.430510 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b" Apr 23 16:54:30.455654 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:30.455596 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b" podStartSLOduration=3.455579154 podStartE2EDuration="3.455579154s" podCreationTimestamp="2026-04-23 16:54:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:54:30.45220536 +0000 UTC m=+1153.533647929" watchObservedRunningTime="2026-04-23 16:54:30.455579154 +0000 UTC m=+1153.537021726" Apr 23 16:54:38.221472 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:38.221393 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b" Apr 23 16:54:38.221472 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:38.221425 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b" Apr 23 16:54:38.224290 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:38.224261 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b" Apr 23 16:54:38.459674 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:38.459647 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b" Apr 23 16:54:59.463273 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:54:59.463242 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b" Apr 23 16:55:17.478654 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:17.478624 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8sjkd_036b63b0-d570-44cc-b606-bb46f38e6753/console-operator/1.log" Apr 23 16:55:17.482067 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:17.482043 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8sjkd_036b63b0-d570-44cc-b606-bb46f38e6753/console-operator/1.log" Apr 23 16:55:26.270678 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:26.270645 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"] Apr 23 16:55:26.271334 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:26.271000 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l" podUID="53510e23-cfaa-49bc-96e6-107a5be846b0" containerName="main" containerID="cri-o://1746738c9aada7a6951b2346d02bcd512d1a6813166718c0117849274e229511" gracePeriod=30 Apr 23 16:55:26.271334 
ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:26.271023 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l" podUID="53510e23-cfaa-49bc-96e6-107a5be846b0" containerName="tokenizer" containerID="cri-o://b458a2ef0e8fab575ac3dfd7fd7facb865871834edbb1a8f88d1bd052da5e045" gracePeriod=30 Apr 23 16:55:26.633183 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:26.633095 2571 generic.go:358] "Generic (PLEG): container finished" podID="53510e23-cfaa-49bc-96e6-107a5be846b0" containerID="1746738c9aada7a6951b2346d02bcd512d1a6813166718c0117849274e229511" exitCode=0 Apr 23 16:55:26.633183 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:26.633169 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l" event={"ID":"53510e23-cfaa-49bc-96e6-107a5be846b0","Type":"ContainerDied","Data":"1746738c9aada7a6951b2346d02bcd512d1a6813166718c0117849274e229511"} Apr 23 16:55:27.437402 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.437374 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l" Apr 23 16:55:27.527631 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.527555 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-tokenizer-uds\") pod \"53510e23-cfaa-49bc-96e6-107a5be846b0\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " Apr 23 16:55:27.527631 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.527605 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-tokenizer-tmp\") pod \"53510e23-cfaa-49bc-96e6-107a5be846b0\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " Apr 23 16:55:27.527858 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.527635 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44f8g\" (UniqueName: \"kubernetes.io/projected/53510e23-cfaa-49bc-96e6-107a5be846b0-kube-api-access-44f8g\") pod \"53510e23-cfaa-49bc-96e6-107a5be846b0\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " Apr 23 16:55:27.527858 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.527738 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53510e23-cfaa-49bc-96e6-107a5be846b0-tls-certs\") pod \"53510e23-cfaa-49bc-96e6-107a5be846b0\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " Apr 23 16:55:27.527858 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.527765 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-kserve-provision-location\") pod \"53510e23-cfaa-49bc-96e6-107a5be846b0\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " Apr 23 
16:55:27.527858 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.527789 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "53510e23-cfaa-49bc-96e6-107a5be846b0" (UID: "53510e23-cfaa-49bc-96e6-107a5be846b0"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:55:27.528076 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.527912 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-tokenizer-cache\") pod \"53510e23-cfaa-49bc-96e6-107a5be846b0\" (UID: \"53510e23-cfaa-49bc-96e6-107a5be846b0\") " Apr 23 16:55:27.528076 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.527951 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "53510e23-cfaa-49bc-96e6-107a5be846b0" (UID: "53510e23-cfaa-49bc-96e6-107a5be846b0"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:55:27.528181 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.528141 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "53510e23-cfaa-49bc-96e6-107a5be846b0" (UID: "53510e23-cfaa-49bc-96e6-107a5be846b0"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:55:27.528285 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.528269 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-tokenizer-cache\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:55:27.528340 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.528286 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-tokenizer-uds\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:55:27.528340 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.528295 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-tokenizer-tmp\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:55:27.528552 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.528529 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "53510e23-cfaa-49bc-96e6-107a5be846b0" (UID: "53510e23-cfaa-49bc-96e6-107a5be846b0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:55:27.529981 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.529958 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53510e23-cfaa-49bc-96e6-107a5be846b0-kube-api-access-44f8g" (OuterVolumeSpecName: "kube-api-access-44f8g") pod "53510e23-cfaa-49bc-96e6-107a5be846b0" (UID: "53510e23-cfaa-49bc-96e6-107a5be846b0"). InnerVolumeSpecName "kube-api-access-44f8g". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:55:27.530126 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.529981 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53510e23-cfaa-49bc-96e6-107a5be846b0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "53510e23-cfaa-49bc-96e6-107a5be846b0" (UID: "53510e23-cfaa-49bc-96e6-107a5be846b0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:55:27.629392 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.629358 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-44f8g\" (UniqueName: \"kubernetes.io/projected/53510e23-cfaa-49bc-96e6-107a5be846b0-kube-api-access-44f8g\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:55:27.629392 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.629386 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53510e23-cfaa-49bc-96e6-107a5be846b0-tls-certs\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:55:27.629575 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.629402 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53510e23-cfaa-49bc-96e6-107a5be846b0-kserve-provision-location\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:55:27.640435 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.640402 2571 generic.go:358] "Generic (PLEG): container finished" podID="53510e23-cfaa-49bc-96e6-107a5be846b0" containerID="b458a2ef0e8fab575ac3dfd7fd7facb865871834edbb1a8f88d1bd052da5e045" exitCode=0 Apr 23 16:55:27.640567 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.640463 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l" 
event={"ID":"53510e23-cfaa-49bc-96e6-107a5be846b0","Type":"ContainerDied","Data":"b458a2ef0e8fab575ac3dfd7fd7facb865871834edbb1a8f88d1bd052da5e045"} Apr 23 16:55:27.640567 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.640496 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l" event={"ID":"53510e23-cfaa-49bc-96e6-107a5be846b0","Type":"ContainerDied","Data":"4253be30469f4ecfea6b2ec47d6f0b6b378603ca9d2f001aa38876c4ed227db2"} Apr 23 16:55:27.640567 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.640509 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l" Apr 23 16:55:27.640567 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.640518 2571 scope.go:117] "RemoveContainer" containerID="b458a2ef0e8fab575ac3dfd7fd7facb865871834edbb1a8f88d1bd052da5e045" Apr 23 16:55:27.649735 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.649718 2571 scope.go:117] "RemoveContainer" containerID="1746738c9aada7a6951b2346d02bcd512d1a6813166718c0117849274e229511" Apr 23 16:55:27.657283 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.657264 2571 scope.go:117] "RemoveContainer" containerID="945147a9eaab97479a226e6aec607658fab33ff2bed413e7f684dab920623bfe" Apr 23 16:55:27.664609 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.664587 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"] Apr 23 16:55:27.665864 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.665795 2571 scope.go:117] "RemoveContainer" containerID="b458a2ef0e8fab575ac3dfd7fd7facb865871834edbb1a8f88d1bd052da5e045" Apr 23 16:55:27.666237 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:55:27.666175 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b458a2ef0e8fab575ac3dfd7fd7facb865871834edbb1a8f88d1bd052da5e045\": container with ID starting with b458a2ef0e8fab575ac3dfd7fd7facb865871834edbb1a8f88d1bd052da5e045 not found: ID does not exist" containerID="b458a2ef0e8fab575ac3dfd7fd7facb865871834edbb1a8f88d1bd052da5e045" Apr 23 16:55:27.666237 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.666207 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b458a2ef0e8fab575ac3dfd7fd7facb865871834edbb1a8f88d1bd052da5e045"} err="failed to get container status \"b458a2ef0e8fab575ac3dfd7fd7facb865871834edbb1a8f88d1bd052da5e045\": rpc error: code = NotFound desc = could not find container \"b458a2ef0e8fab575ac3dfd7fd7facb865871834edbb1a8f88d1bd052da5e045\": container with ID starting with b458a2ef0e8fab575ac3dfd7fd7facb865871834edbb1a8f88d1bd052da5e045 not found: ID does not exist" Apr 23 16:55:27.666402 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.666244 2571 scope.go:117] "RemoveContainer" containerID="1746738c9aada7a6951b2346d02bcd512d1a6813166718c0117849274e229511" Apr 23 16:55:27.666485 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:55:27.666457 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1746738c9aada7a6951b2346d02bcd512d1a6813166718c0117849274e229511\": container with ID starting with 1746738c9aada7a6951b2346d02bcd512d1a6813166718c0117849274e229511 not found: ID does not exist" containerID="1746738c9aada7a6951b2346d02bcd512d1a6813166718c0117849274e229511" Apr 23 16:55:27.666595 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.666489 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1746738c9aada7a6951b2346d02bcd512d1a6813166718c0117849274e229511"} err="failed to get container status \"1746738c9aada7a6951b2346d02bcd512d1a6813166718c0117849274e229511\": rpc error: code = NotFound desc = could not find container 
\"1746738c9aada7a6951b2346d02bcd512d1a6813166718c0117849274e229511\": container with ID starting with 1746738c9aada7a6951b2346d02bcd512d1a6813166718c0117849274e229511 not found: ID does not exist" Apr 23 16:55:27.666595 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.666514 2571 scope.go:117] "RemoveContainer" containerID="945147a9eaab97479a226e6aec607658fab33ff2bed413e7f684dab920623bfe" Apr 23 16:55:27.666772 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:55:27.666733 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"945147a9eaab97479a226e6aec607658fab33ff2bed413e7f684dab920623bfe\": container with ID starting with 945147a9eaab97479a226e6aec607658fab33ff2bed413e7f684dab920623bfe not found: ID does not exist" containerID="945147a9eaab97479a226e6aec607658fab33ff2bed413e7f684dab920623bfe" Apr 23 16:55:27.666772 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.666759 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"945147a9eaab97479a226e6aec607658fab33ff2bed413e7f684dab920623bfe"} err="failed to get container status \"945147a9eaab97479a226e6aec607658fab33ff2bed413e7f684dab920623bfe\": rpc error: code = NotFound desc = could not find container \"945147a9eaab97479a226e6aec607658fab33ff2bed413e7f684dab920623bfe\": container with ID starting with 945147a9eaab97479a226e6aec607658fab33ff2bed413e7f684dab920623bfe not found: ID does not exist" Apr 23 16:55:27.668100 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:27.668081 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7467845cbnd2l"] Apr 23 16:55:29.514080 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:29.514042 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53510e23-cfaa-49bc-96e6-107a5be846b0" path="/var/lib/kubelet/pods/53510e23-cfaa-49bc-96e6-107a5be846b0/volumes" Apr 23 
16:55:41.436101 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.436065 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj"] Apr 23 16:55:41.436462 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.436438 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53510e23-cfaa-49bc-96e6-107a5be846b0" containerName="storage-initializer" Apr 23 16:55:41.436462 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.436449 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="53510e23-cfaa-49bc-96e6-107a5be846b0" containerName="storage-initializer" Apr 23 16:55:41.436462 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.436457 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53510e23-cfaa-49bc-96e6-107a5be846b0" containerName="tokenizer" Apr 23 16:55:41.436462 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.436462 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="53510e23-cfaa-49bc-96e6-107a5be846b0" containerName="tokenizer" Apr 23 16:55:41.436590 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.436469 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53510e23-cfaa-49bc-96e6-107a5be846b0" containerName="main" Apr 23 16:55:41.436590 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.436474 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="53510e23-cfaa-49bc-96e6-107a5be846b0" containerName="main" Apr 23 16:55:41.436590 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.436531 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="53510e23-cfaa-49bc-96e6-107a5be846b0" containerName="main" Apr 23 16:55:41.436590 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.436539 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="53510e23-cfaa-49bc-96e6-107a5be846b0" containerName="tokenizer" Apr 23 16:55:41.441538 ip-10-0-136-27 
kubenswrapper[2571]: I0423 16:55:41.441520 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:41.444151 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.444126 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-v4pzv\"" Apr 23 16:55:41.444238 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.444150 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 23 16:55:41.448530 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.448501 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj"] Apr 23 16:55:41.540997 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.540964 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:41.541199 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.541008 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gplk5\" (UniqueName: \"kubernetes.io/projected/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-kube-api-access-gplk5\") pod \"router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:41.541199 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.541056 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:41.541199 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.541074 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:41.541199 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.541115 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:41.541199 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.541143 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:41.642364 ip-10-0-136-27 
kubenswrapper[2571]: I0423 16:55:41.642332 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:41.642523 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.642374 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gplk5\" (UniqueName: \"kubernetes.io/projected/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-kube-api-access-gplk5\") pod \"router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:41.642523 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.642397 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:41.642601 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.642527 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:41.642601 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.642579 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:41.642790 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.642755 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:41.642917 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.642815 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:41.642917 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.642872 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:41.642917 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.642893 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:41.643091 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.643037 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:41.645010 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.644990 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:41.652008 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.651985 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gplk5\" (UniqueName: \"kubernetes.io/projected/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-kube-api-access-gplk5\") pod \"router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:41.752794 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.752748 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:41.876798 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:41.876760 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj"] Apr 23 16:55:41.878241 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:55:41.878204 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9773b45_4ffa_4c28_a187_6cb6e58e85d3.slice/crio-4e0a17b6bf25698919469c441986e466af416f0fe0b0343efb37a1d2a94aa644 WatchSource:0}: Error finding container 4e0a17b6bf25698919469c441986e466af416f0fe0b0343efb37a1d2a94aa644: Status 404 returned error can't find the container with id 4e0a17b6bf25698919469c441986e466af416f0fe0b0343efb37a1d2a94aa644 Apr 23 16:55:42.700016 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:42.699981 2571 generic.go:358] "Generic (PLEG): container finished" podID="a9773b45-4ffa-4c28-a187-6cb6e58e85d3" containerID="ea924d8f1595ff3f0d693bc1745c273a0dd3defd14aee6dff0a13376ae34ad1e" exitCode=0 Apr 23 16:55:42.700480 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:42.700071 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" event={"ID":"a9773b45-4ffa-4c28-a187-6cb6e58e85d3","Type":"ContainerDied","Data":"ea924d8f1595ff3f0d693bc1745c273a0dd3defd14aee6dff0a13376ae34ad1e"} Apr 23 16:55:42.700480 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:42.700119 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" event={"ID":"a9773b45-4ffa-4c28-a187-6cb6e58e85d3","Type":"ContainerStarted","Data":"4e0a17b6bf25698919469c441986e466af416f0fe0b0343efb37a1d2a94aa644"} Apr 23 16:55:43.705672 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:43.705639 
2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" event={"ID":"a9773b45-4ffa-4c28-a187-6cb6e58e85d3","Type":"ContainerStarted","Data":"ccfb42ff1e08fb7cc376f37d446e0fb0e8af1090d1314d5835027f3efe8fdd47"} Apr 23 16:55:43.706132 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:43.705678 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" event={"ID":"a9773b45-4ffa-4c28-a187-6cb6e58e85d3","Type":"ContainerStarted","Data":"e9a8572c477707c2d8a885874bbe4c3053df7f0b44b92bf520ae7ce67d65983d"} Apr 23 16:55:43.706132 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:43.705758 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:43.731409 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:43.731336 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" podStartSLOduration=2.731317126 podStartE2EDuration="2.731317126s" podCreationTimestamp="2026-04-23 16:55:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:55:43.727190108 +0000 UTC m=+1226.808632676" watchObservedRunningTime="2026-04-23 16:55:43.731317126 +0000 UTC m=+1226.812759697" Apr 23 16:55:51.753794 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:51.753755 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:51.753794 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:51.753802 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:51.756548 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:51.756524 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:55:52.737550 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:55:52.737519 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:56:09.329665 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:09.329580 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"] Apr 23 16:56:09.332394 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:09.330032 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b" podUID="9efd275f-81a6-43c1-977a-544fe88021a9" containerName="main" containerID="cri-o://3295fee232ca55021445205e69f9051595d48ce33d9c0c7bf39e9ff74a45a7f0" gracePeriod=30 Apr 23 16:56:09.332394 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:09.330091 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b" podUID="9efd275f-81a6-43c1-977a-544fe88021a9" containerName="tokenizer" containerID="cri-o://70857e70ac1985d96b64d79dcf343ce50cab72b0291eaf1228099f70a629eb56" gracePeriod=30 Apr 23 16:56:09.462736 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:56:09.462685 2571 logging.go:55] [core] [Channel #254 SubChannel #255]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.41:9003", ServerName: "10.134.0.41:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.41:9003: connect: connection refused" Apr 23 16:56:09.797598 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:09.797566 2571 generic.go:358] "Generic (PLEG): container finished" podID="9efd275f-81a6-43c1-977a-544fe88021a9" containerID="3295fee232ca55021445205e69f9051595d48ce33d9c0c7bf39e9ff74a45a7f0" exitCode=0 Apr 23 16:56:09.797842 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:09.797634 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b" event={"ID":"9efd275f-81a6-43c1-977a-544fe88021a9","Type":"ContainerDied","Data":"3295fee232ca55021445205e69f9051595d48ce33d9c0c7bf39e9ff74a45a7f0"} Apr 23 16:56:10.463185 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.463145 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b" podUID="9efd275f-81a6-43c1-977a-544fe88021a9" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.41:9003\" within 1s: context deadline exceeded" Apr 23 16:56:10.585093 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.585072 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b" Apr 23 16:56:10.716825 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.716796 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-kserve-provision-location\") pod \"9efd275f-81a6-43c1-977a-544fe88021a9\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " Apr 23 16:56:10.716983 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.716892 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9efd275f-81a6-43c1-977a-544fe88021a9-tls-certs\") pod \"9efd275f-81a6-43c1-977a-544fe88021a9\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " Apr 23 16:56:10.716983 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.716933 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-tokenizer-tmp\") pod \"9efd275f-81a6-43c1-977a-544fe88021a9\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " Apr 23 16:56:10.716983 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.716956 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-tokenizer-uds\") pod \"9efd275f-81a6-43c1-977a-544fe88021a9\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " Apr 23 16:56:10.716983 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.716980 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-tokenizer-cache\") pod \"9efd275f-81a6-43c1-977a-544fe88021a9\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " Apr 23 16:56:10.717202 
ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.717030 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpqwr\" (UniqueName: \"kubernetes.io/projected/9efd275f-81a6-43c1-977a-544fe88021a9-kube-api-access-kpqwr\") pod \"9efd275f-81a6-43c1-977a-544fe88021a9\" (UID: \"9efd275f-81a6-43c1-977a-544fe88021a9\") " Apr 23 16:56:10.717290 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.717262 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "9efd275f-81a6-43c1-977a-544fe88021a9" (UID: "9efd275f-81a6-43c1-977a-544fe88021a9"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:56:10.717354 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.717307 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "9efd275f-81a6-43c1-977a-544fe88021a9" (UID: "9efd275f-81a6-43c1-977a-544fe88021a9"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:56:10.717354 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.717316 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "9efd275f-81a6-43c1-977a-544fe88021a9" (UID: "9efd275f-81a6-43c1-977a-544fe88021a9"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:56:10.717652 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.717632 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9efd275f-81a6-43c1-977a-544fe88021a9" (UID: "9efd275f-81a6-43c1-977a-544fe88021a9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:56:10.719204 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.719183 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9efd275f-81a6-43c1-977a-544fe88021a9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9efd275f-81a6-43c1-977a-544fe88021a9" (UID: "9efd275f-81a6-43c1-977a-544fe88021a9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:56:10.719308 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.719268 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9efd275f-81a6-43c1-977a-544fe88021a9-kube-api-access-kpqwr" (OuterVolumeSpecName: "kube-api-access-kpqwr") pod "9efd275f-81a6-43c1-977a-544fe88021a9" (UID: "9efd275f-81a6-43c1-977a-544fe88021a9"). InnerVolumeSpecName "kube-api-access-kpqwr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:56:10.803363 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.803321 2571 generic.go:358] "Generic (PLEG): container finished" podID="9efd275f-81a6-43c1-977a-544fe88021a9" containerID="70857e70ac1985d96b64d79dcf343ce50cab72b0291eaf1228099f70a629eb56" exitCode=0 Apr 23 16:56:10.803524 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.803406 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b" Apr 23 16:56:10.803524 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.803406 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b" event={"ID":"9efd275f-81a6-43c1-977a-544fe88021a9","Type":"ContainerDied","Data":"70857e70ac1985d96b64d79dcf343ce50cab72b0291eaf1228099f70a629eb56"} Apr 23 16:56:10.803524 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.803448 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b" event={"ID":"9efd275f-81a6-43c1-977a-544fe88021a9","Type":"ContainerDied","Data":"e2e2108124503e229389b50862ba4eec20037ef542acada2e6bd119662e19860"} Apr 23 16:56:10.803524 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.803464 2571 scope.go:117] "RemoveContainer" containerID="70857e70ac1985d96b64d79dcf343ce50cab72b0291eaf1228099f70a629eb56" Apr 23 16:56:10.812900 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.812880 2571 scope.go:117] "RemoveContainer" containerID="3295fee232ca55021445205e69f9051595d48ce33d9c0c7bf39e9ff74a45a7f0" Apr 23 16:56:10.818199 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.818173 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kpqwr\" (UniqueName: \"kubernetes.io/projected/9efd275f-81a6-43c1-977a-544fe88021a9-kube-api-access-kpqwr\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:56:10.818199 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.818196 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-kserve-provision-location\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:56:10.818330 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.818209 2571 
reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9efd275f-81a6-43c1-977a-544fe88021a9-tls-certs\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:56:10.818330 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.818221 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-tokenizer-tmp\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:56:10.818330 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.818238 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-tokenizer-uds\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:56:10.818330 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.818246 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9efd275f-81a6-43c1-977a-544fe88021a9-tokenizer-cache\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:56:10.820438 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.820418 2571 scope.go:117] "RemoveContainer" containerID="13c949d76f36fc2bc4a562b9edf974bbca4c20e3db28febea8f43dcde22cbe91" Apr 23 16:56:10.827947 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.827918 2571 scope.go:117] "RemoveContainer" containerID="70857e70ac1985d96b64d79dcf343ce50cab72b0291eaf1228099f70a629eb56" Apr 23 16:56:10.828186 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:56:10.828169 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70857e70ac1985d96b64d79dcf343ce50cab72b0291eaf1228099f70a629eb56\": container with ID starting with 70857e70ac1985d96b64d79dcf343ce50cab72b0291eaf1228099f70a629eb56 not found: ID does not exist" 
containerID="70857e70ac1985d96b64d79dcf343ce50cab72b0291eaf1228099f70a629eb56" Apr 23 16:56:10.828227 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.828195 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70857e70ac1985d96b64d79dcf343ce50cab72b0291eaf1228099f70a629eb56"} err="failed to get container status \"70857e70ac1985d96b64d79dcf343ce50cab72b0291eaf1228099f70a629eb56\": rpc error: code = NotFound desc = could not find container \"70857e70ac1985d96b64d79dcf343ce50cab72b0291eaf1228099f70a629eb56\": container with ID starting with 70857e70ac1985d96b64d79dcf343ce50cab72b0291eaf1228099f70a629eb56 not found: ID does not exist" Apr 23 16:56:10.828227 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.828213 2571 scope.go:117] "RemoveContainer" containerID="3295fee232ca55021445205e69f9051595d48ce33d9c0c7bf39e9ff74a45a7f0" Apr 23 16:56:10.828441 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:56:10.828421 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3295fee232ca55021445205e69f9051595d48ce33d9c0c7bf39e9ff74a45a7f0\": container with ID starting with 3295fee232ca55021445205e69f9051595d48ce33d9c0c7bf39e9ff74a45a7f0 not found: ID does not exist" containerID="3295fee232ca55021445205e69f9051595d48ce33d9c0c7bf39e9ff74a45a7f0" Apr 23 16:56:10.828508 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.828450 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3295fee232ca55021445205e69f9051595d48ce33d9c0c7bf39e9ff74a45a7f0"} err="failed to get container status \"3295fee232ca55021445205e69f9051595d48ce33d9c0c7bf39e9ff74a45a7f0\": rpc error: code = NotFound desc = could not find container \"3295fee232ca55021445205e69f9051595d48ce33d9c0c7bf39e9ff74a45a7f0\": container with ID starting with 3295fee232ca55021445205e69f9051595d48ce33d9c0c7bf39e9ff74a45a7f0 not found: ID does not exist" Apr 23 
16:56:10.828508 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.828475 2571 scope.go:117] "RemoveContainer" containerID="13c949d76f36fc2bc4a562b9edf974bbca4c20e3db28febea8f43dcde22cbe91" Apr 23 16:56:10.828736 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:56:10.828717 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13c949d76f36fc2bc4a562b9edf974bbca4c20e3db28febea8f43dcde22cbe91\": container with ID starting with 13c949d76f36fc2bc4a562b9edf974bbca4c20e3db28febea8f43dcde22cbe91 not found: ID does not exist" containerID="13c949d76f36fc2bc4a562b9edf974bbca4c20e3db28febea8f43dcde22cbe91" Apr 23 16:56:10.828791 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.828745 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13c949d76f36fc2bc4a562b9edf974bbca4c20e3db28febea8f43dcde22cbe91"} err="failed to get container status \"13c949d76f36fc2bc4a562b9edf974bbca4c20e3db28febea8f43dcde22cbe91\": rpc error: code = NotFound desc = could not find container \"13c949d76f36fc2bc4a562b9edf974bbca4c20e3db28febea8f43dcde22cbe91\": container with ID starting with 13c949d76f36fc2bc4a562b9edf974bbca4c20e3db28febea8f43dcde22cbe91 not found: ID does not exist" Apr 23 16:56:10.835187 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.835154 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"] Apr 23 16:56:10.842499 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:10.842476 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-f558ff9bb-xns5b"] Apr 23 16:56:11.511870 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:11.511834 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9efd275f-81a6-43c1-977a-544fe88021a9" path="/var/lib/kubelet/pods/9efd275f-81a6-43c1-977a-544fe88021a9/volumes" Apr 23 
16:56:13.742685 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:56:13.742655 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:57:45.614533 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:45.614498 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj"] Apr 23 16:57:45.615053 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:45.614946 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" podUID="a9773b45-4ffa-4c28-a187-6cb6e58e85d3" containerName="main" containerID="cri-o://e9a8572c477707c2d8a885874bbe4c3053df7f0b44b92bf520ae7ce67d65983d" gracePeriod=30 Apr 23 16:57:45.615053 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:45.614998 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" podUID="a9773b45-4ffa-4c28-a187-6cb6e58e85d3" containerName="tokenizer" containerID="cri-o://ccfb42ff1e08fb7cc376f37d446e0fb0e8af1090d1314d5835027f3efe8fdd47" gracePeriod=30 Apr 23 16:57:46.129996 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:46.129964 2571 generic.go:358] "Generic (PLEG): container finished" podID="a9773b45-4ffa-4c28-a187-6cb6e58e85d3" containerID="e9a8572c477707c2d8a885874bbe4c3053df7f0b44b92bf520ae7ce67d65983d" exitCode=0 Apr 23 16:57:46.130171 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:46.130044 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" event={"ID":"a9773b45-4ffa-4c28-a187-6cb6e58e85d3","Type":"ContainerDied","Data":"e9a8572c477707c2d8a885874bbe4c3053df7f0b44b92bf520ae7ce67d65983d"} Apr 23 16:57:46.763647 ip-10-0-136-27 kubenswrapper[2571]: 
I0423 16:57:46.763629 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:57:46.900166 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:46.900090 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gplk5\" (UniqueName: \"kubernetes.io/projected/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-kube-api-access-gplk5\") pod \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " Apr 23 16:57:46.900166 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:46.900131 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tokenizer-uds\") pod \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " Apr 23 16:57:46.900346 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:46.900180 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-kserve-provision-location\") pod \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " Apr 23 16:57:46.900346 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:46.900228 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tokenizer-tmp\") pod \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " Apr 23 16:57:46.900346 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:46.900312 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tls-certs\") pod 
\"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " Apr 23 16:57:46.900475 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:46.900352 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tokenizer-cache\") pod \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\" (UID: \"a9773b45-4ffa-4c28-a187-6cb6e58e85d3\") " Apr 23 16:57:46.900536 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:46.900506 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "a9773b45-4ffa-4c28-a187-6cb6e58e85d3" (UID: "a9773b45-4ffa-4c28-a187-6cb6e58e85d3"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:57:46.900629 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:46.900614 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tokenizer-uds\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:57:46.900629 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:46.900609 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "a9773b45-4ffa-4c28-a187-6cb6e58e85d3" (UID: "a9773b45-4ffa-4c28-a187-6cb6e58e85d3"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:57:46.900773 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:46.900724 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "a9773b45-4ffa-4c28-a187-6cb6e58e85d3" (UID: "a9773b45-4ffa-4c28-a187-6cb6e58e85d3"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:57:46.901195 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:46.901168 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a9773b45-4ffa-4c28-a187-6cb6e58e85d3" (UID: "a9773b45-4ffa-4c28-a187-6cb6e58e85d3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:57:46.902432 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:46.902410 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-kube-api-access-gplk5" (OuterVolumeSpecName: "kube-api-access-gplk5") pod "a9773b45-4ffa-4c28-a187-6cb6e58e85d3" (UID: "a9773b45-4ffa-4c28-a187-6cb6e58e85d3"). InnerVolumeSpecName "kube-api-access-gplk5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:57:46.902594 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:46.902569 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a9773b45-4ffa-4c28-a187-6cb6e58e85d3" (UID: "a9773b45-4ffa-4c28-a187-6cb6e58e85d3"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:57:47.001826 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:47.001799 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-kserve-provision-location\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:57:47.001826 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:47.001823 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tokenizer-tmp\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:57:47.001961 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:47.001833 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tls-certs\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:57:47.001961 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:47.001841 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-tokenizer-cache\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:57:47.001961 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:47.001849 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gplk5\" (UniqueName: \"kubernetes.io/projected/a9773b45-4ffa-4c28-a187-6cb6e58e85d3-kube-api-access-gplk5\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 16:57:47.134840 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:47.134810 2571 generic.go:358] "Generic (PLEG): container finished" podID="a9773b45-4ffa-4c28-a187-6cb6e58e85d3" containerID="ccfb42ff1e08fb7cc376f37d446e0fb0e8af1090d1314d5835027f3efe8fdd47" exitCode=0 Apr 23 16:57:47.134979 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:47.134846 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" event={"ID":"a9773b45-4ffa-4c28-a187-6cb6e58e85d3","Type":"ContainerDied","Data":"ccfb42ff1e08fb7cc376f37d446e0fb0e8af1090d1314d5835027f3efe8fdd47"} Apr 23 16:57:47.134979 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:47.134888 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" event={"ID":"a9773b45-4ffa-4c28-a187-6cb6e58e85d3","Type":"ContainerDied","Data":"4e0a17b6bf25698919469c441986e466af416f0fe0b0343efb37a1d2a94aa644"} Apr 23 16:57:47.134979 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:47.134892 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj" Apr 23 16:57:47.134979 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:47.134907 2571 scope.go:117] "RemoveContainer" containerID="ccfb42ff1e08fb7cc376f37d446e0fb0e8af1090d1314d5835027f3efe8fdd47" Apr 23 16:57:47.143894 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:47.143877 2571 scope.go:117] "RemoveContainer" containerID="e9a8572c477707c2d8a885874bbe4c3053df7f0b44b92bf520ae7ce67d65983d" Apr 23 16:57:47.151368 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:47.151350 2571 scope.go:117] "RemoveContainer" containerID="ea924d8f1595ff3f0d693bc1745c273a0dd3defd14aee6dff0a13376ae34ad1e" Apr 23 16:57:47.158427 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:47.158389 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj"] Apr 23 16:57:47.162170 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:47.162148 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-6cb48db9dd-t5tqj"] Apr 23 16:57:47.162949 ip-10-0-136-27 
kubenswrapper[2571]: I0423 16:57:47.162936 2571 scope.go:117] "RemoveContainer" containerID="ccfb42ff1e08fb7cc376f37d446e0fb0e8af1090d1314d5835027f3efe8fdd47" Apr 23 16:57:47.163199 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:57:47.163182 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccfb42ff1e08fb7cc376f37d446e0fb0e8af1090d1314d5835027f3efe8fdd47\": container with ID starting with ccfb42ff1e08fb7cc376f37d446e0fb0e8af1090d1314d5835027f3efe8fdd47 not found: ID does not exist" containerID="ccfb42ff1e08fb7cc376f37d446e0fb0e8af1090d1314d5835027f3efe8fdd47" Apr 23 16:57:47.163256 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:47.163207 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccfb42ff1e08fb7cc376f37d446e0fb0e8af1090d1314d5835027f3efe8fdd47"} err="failed to get container status \"ccfb42ff1e08fb7cc376f37d446e0fb0e8af1090d1314d5835027f3efe8fdd47\": rpc error: code = NotFound desc = could not find container \"ccfb42ff1e08fb7cc376f37d446e0fb0e8af1090d1314d5835027f3efe8fdd47\": container with ID starting with ccfb42ff1e08fb7cc376f37d446e0fb0e8af1090d1314d5835027f3efe8fdd47 not found: ID does not exist" Apr 23 16:57:47.163256 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:47.163224 2571 scope.go:117] "RemoveContainer" containerID="e9a8572c477707c2d8a885874bbe4c3053df7f0b44b92bf520ae7ce67d65983d" Apr 23 16:57:47.163463 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:57:47.163449 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9a8572c477707c2d8a885874bbe4c3053df7f0b44b92bf520ae7ce67d65983d\": container with ID starting with e9a8572c477707c2d8a885874bbe4c3053df7f0b44b92bf520ae7ce67d65983d not found: ID does not exist" containerID="e9a8572c477707c2d8a885874bbe4c3053df7f0b44b92bf520ae7ce67d65983d" Apr 23 16:57:47.163505 ip-10-0-136-27 kubenswrapper[2571]: I0423 
16:57:47.163467 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a8572c477707c2d8a885874bbe4c3053df7f0b44b92bf520ae7ce67d65983d"} err="failed to get container status \"e9a8572c477707c2d8a885874bbe4c3053df7f0b44b92bf520ae7ce67d65983d\": rpc error: code = NotFound desc = could not find container \"e9a8572c477707c2d8a885874bbe4c3053df7f0b44b92bf520ae7ce67d65983d\": container with ID starting with e9a8572c477707c2d8a885874bbe4c3053df7f0b44b92bf520ae7ce67d65983d not found: ID does not exist" Apr 23 16:57:47.163505 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:47.163492 2571 scope.go:117] "RemoveContainer" containerID="ea924d8f1595ff3f0d693bc1745c273a0dd3defd14aee6dff0a13376ae34ad1e" Apr 23 16:57:47.163749 ip-10-0-136-27 kubenswrapper[2571]: E0423 16:57:47.163727 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea924d8f1595ff3f0d693bc1745c273a0dd3defd14aee6dff0a13376ae34ad1e\": container with ID starting with ea924d8f1595ff3f0d693bc1745c273a0dd3defd14aee6dff0a13376ae34ad1e not found: ID does not exist" containerID="ea924d8f1595ff3f0d693bc1745c273a0dd3defd14aee6dff0a13376ae34ad1e" Apr 23 16:57:47.163793 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:47.163755 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea924d8f1595ff3f0d693bc1745c273a0dd3defd14aee6dff0a13376ae34ad1e"} err="failed to get container status \"ea924d8f1595ff3f0d693bc1745c273a0dd3defd14aee6dff0a13376ae34ad1e\": rpc error: code = NotFound desc = could not find container \"ea924d8f1595ff3f0d693bc1745c273a0dd3defd14aee6dff0a13376ae34ad1e\": container with ID starting with ea924d8f1595ff3f0d693bc1745c273a0dd3defd14aee6dff0a13376ae34ad1e not found: ID does not exist" Apr 23 16:57:47.512913 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:47.512878 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a9773b45-4ffa-4c28-a187-6cb6e58e85d3" path="/var/lib/kubelet/pods/a9773b45-4ffa-4c28-a187-6cb6e58e85d3/volumes" Apr 23 16:57:50.547534 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.547496 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv"] Apr 23 16:57:50.548035 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.548016 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9efd275f-81a6-43c1-977a-544fe88021a9" containerName="tokenizer" Apr 23 16:57:50.548116 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.548039 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="9efd275f-81a6-43c1-977a-544fe88021a9" containerName="tokenizer" Apr 23 16:57:50.548116 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.548065 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9efd275f-81a6-43c1-977a-544fe88021a9" containerName="storage-initializer" Apr 23 16:57:50.548116 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.548075 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="9efd275f-81a6-43c1-977a-544fe88021a9" containerName="storage-initializer" Apr 23 16:57:50.548116 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.548086 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9efd275f-81a6-43c1-977a-544fe88021a9" containerName="main" Apr 23 16:57:50.548116 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.548094 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="9efd275f-81a6-43c1-977a-544fe88021a9" containerName="main" Apr 23 16:57:50.548116 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.548111 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9773b45-4ffa-4c28-a187-6cb6e58e85d3" containerName="storage-initializer" Apr 23 16:57:50.548410 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.548121 2571 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a9773b45-4ffa-4c28-a187-6cb6e58e85d3" containerName="storage-initializer" Apr 23 16:57:50.548410 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.548140 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9773b45-4ffa-4c28-a187-6cb6e58e85d3" containerName="main" Apr 23 16:57:50.548410 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.548150 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9773b45-4ffa-4c28-a187-6cb6e58e85d3" containerName="main" Apr 23 16:57:50.548410 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.548160 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9773b45-4ffa-4c28-a187-6cb6e58e85d3" containerName="tokenizer" Apr 23 16:57:50.548410 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.548170 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9773b45-4ffa-4c28-a187-6cb6e58e85d3" containerName="tokenizer" Apr 23 16:57:50.548410 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.548269 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="9efd275f-81a6-43c1-977a-544fe88021a9" containerName="tokenizer" Apr 23 16:57:50.548410 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.548284 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="9efd275f-81a6-43c1-977a-544fe88021a9" containerName="main" Apr 23 16:57:50.548410 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.548294 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9773b45-4ffa-4c28-a187-6cb6e58e85d3" containerName="main" Apr 23 16:57:50.548410 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.548307 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9773b45-4ffa-4c28-a187-6cb6e58e85d3" containerName="tokenizer" Apr 23 16:57:50.553181 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.553158 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:50.555746 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.555728 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-jgjmn\"" Apr 23 16:57:50.556867 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.556850 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-b4jmv\"" Apr 23 16:57:50.556941 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.556888 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 23 16:57:50.561190 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.561172 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv"] Apr 23 16:57:50.643000 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.642972 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:50.643174 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.643019 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:50.643174 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.643043 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cp8s\" (UniqueName: \"kubernetes.io/projected/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-kube-api-access-5cp8s\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:50.643174 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.643155 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:50.643285 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.643207 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:50.643285 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.643261 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tokenizer-tmp\") pod 
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:50.743826 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.743785 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:50.743979 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.743837 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:50.743979 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.743952 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cp8s\" (UniqueName: \"kubernetes.io/projected/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-kube-api-access-5cp8s\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:50.744079 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.744064 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tokenizer-cache\") pod 
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:50.744138 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.744123 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:50.744192 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.744177 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:50.744241 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.744199 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:50.744412 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.744392 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tokenizer-cache\") pod 
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:50.744483 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.744470 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:50.744540 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.744526 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:50.746507 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.746485 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:50.753296 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.753272 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cp8s\" (UniqueName: \"kubernetes.io/projected/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-kube-api-access-5cp8s\") pod 
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:50.863758 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.863651 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:50.990649 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:50.990625 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv"] Apr 23 16:57:50.993133 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:57:50.993107 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73d5c8a9_5326_4c88_b4a2_e1699bf55d9a.slice/crio-13aac1952d80842db93a67957a5c1c49341ddcf93a02a01e72cb4abe9f72b161 WatchSource:0}: Error finding container 13aac1952d80842db93a67957a5c1c49341ddcf93a02a01e72cb4abe9f72b161: Status 404 returned error can't find the container with id 13aac1952d80842db93a67957a5c1c49341ddcf93a02a01e72cb4abe9f72b161 Apr 23 16:57:51.152819 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:51.152730 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" event={"ID":"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a","Type":"ContainerStarted","Data":"03a17210695f21a7c4b0e1cdc6b24543d1098b6eec8259d448f7ff5c05678d80"} Apr 23 16:57:51.152819 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:51.152769 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" event={"ID":"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a","Type":"ContainerStarted","Data":"13aac1952d80842db93a67957a5c1c49341ddcf93a02a01e72cb4abe9f72b161"} Apr 23 
16:57:52.159087 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:52.158999 2571 generic.go:358] "Generic (PLEG): container finished" podID="73d5c8a9-5326-4c88-b4a2-e1699bf55d9a" containerID="03a17210695f21a7c4b0e1cdc6b24543d1098b6eec8259d448f7ff5c05678d80" exitCode=0 Apr 23 16:57:52.159441 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:52.159097 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" event={"ID":"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a","Type":"ContainerDied","Data":"03a17210695f21a7c4b0e1cdc6b24543d1098b6eec8259d448f7ff5c05678d80"} Apr 23 16:57:53.164423 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:53.164390 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" event={"ID":"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a","Type":"ContainerStarted","Data":"19b500b4e3ccaf9952190284281f885e3b1e4cb1db08e32bf050447bffeb0365"} Apr 23 16:57:53.164860 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:53.164428 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" event={"ID":"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a","Type":"ContainerStarted","Data":"1c2b0d084db89bbe43b1c26e00977b15c08dcee2d4922c6239b2b3b01cfa77fd"} Apr 23 16:57:53.164860 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:53.164565 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:57:53.186323 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:57:53.186280 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" podStartSLOduration=3.186261719 podStartE2EDuration="3.186261719s" podCreationTimestamp="2026-04-23 16:57:50 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:57:53.184827591 +0000 UTC m=+1356.266270172" watchObservedRunningTime="2026-04-23 16:57:53.186261719 +0000 UTC m=+1356.267704288" Apr 23 16:58:00.864737 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:58:00.864673 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:58:00.865225 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:58:00.864748 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:58:00.867621 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:58:00.867596 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:58:01.196222 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:58:01.196198 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:58:22.200181 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:58:22.200150 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 16:59:58.208195 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.208158 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 23 16:59:58.211934 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.211912 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 16:59:58.214401 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.214376 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-fr79q\"" Apr 23 16:59:58.215478 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.215460 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 23 16:59:58.219718 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.219339 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 23 16:59:58.277412 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.277378 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz"] Apr 23 16:59:58.281151 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.281129 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 16:59:58.283907 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.283886 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-r7sx8\"" Apr 23 16:59:58.290240 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.290205 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4720d42-33f3-4de6-afc0-553e3f79c727-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 16:59:58.290450 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.290428 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p6nj\" (UniqueName: \"kubernetes.io/projected/d4720d42-33f3-4de6-afc0-553e3f79c727-kube-api-access-6p6nj\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 16:59:58.290628 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.290610 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxhxm\" (UniqueName: \"kubernetes.io/projected/284e7c02-9985-4226-b5a4-02015d44ebc3-kube-api-access-dxhxm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 16:59:58.290808 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.290790 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 16:59:58.290973 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.290948 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 16:59:58.291095 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.291008 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/284e7c02-9985-4226-b5a4-02015d44ebc3-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 16:59:58.291185 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.291116 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 16:59:58.291264 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.291191 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 16:59:58.291264 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.291246 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 16:59:58.291376 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.291287 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 16:59:58.291376 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.291338 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 16:59:58.291484 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.291368 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 16:59:58.291484 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.291459 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz"] Apr 23 16:59:58.392707 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.392659 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4720d42-33f3-4de6-afc0-553e3f79c727-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 16:59:58.392911 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.392724 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6p6nj\" (UniqueName: \"kubernetes.io/projected/d4720d42-33f3-4de6-afc0-553e3f79c727-kube-api-access-6p6nj\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 16:59:58.392911 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.392755 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxhxm\" (UniqueName: \"kubernetes.io/projected/284e7c02-9985-4226-b5a4-02015d44ebc3-kube-api-access-dxhxm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 16:59:58.392911 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.392772 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 16:59:58.393168 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.392891 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 16:59:58.393168 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.392969 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/284e7c02-9985-4226-b5a4-02015d44ebc3-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 16:59:58.393168 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.393001 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 16:59:58.393168 
ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.393049 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 16:59:58.393168 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.393088 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 16:59:58.393168 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.393126 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 16:59:58.393168 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.393155 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 16:59:58.393534 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.393168 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 16:59:58.393534 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.393186 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 16:59:58.393534 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.393216 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 16:59:58.393534 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.393392 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 16:59:58.393534 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.393453 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 16:59:58.393534 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.393454 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 16:59:58.394050 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.393537 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 16:59:58.394050 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.393646 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 16:59:58.395403 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.395378 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4720d42-33f3-4de6-afc0-553e3f79c727-tls-certs\") pod 
\"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 16:59:58.395403 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.395396 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 16:59:58.395688 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.395666 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/284e7c02-9985-4226-b5a4-02015d44ebc3-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 16:59:58.401230 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.401204 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p6nj\" (UniqueName: \"kubernetes.io/projected/d4720d42-33f3-4de6-afc0-553e3f79c727-kube-api-access-6p6nj\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 16:59:58.401377 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.401356 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxhxm\" (UniqueName: \"kubernetes.io/projected/284e7c02-9985-4226-b5a4-02015d44ebc3-kube-api-access-dxhxm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 16:59:58.524039 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.523949 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 16:59:58.593544 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.593492 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 16:59:58.661577 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.661399 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 23 16:59:58.665131 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:59:58.665096 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod284e7c02_9985_4226_b5a4_02015d44ebc3.slice/crio-8e4db29a4244911c905390b241a5a49bee8d52af87bc9d2da949c77853258718 WatchSource:0}: Error finding container 8e4db29a4244911c905390b241a5a49bee8d52af87bc9d2da949c77853258718: Status 404 returned error can't find the container with id 8e4db29a4244911c905390b241a5a49bee8d52af87bc9d2da949c77853258718 Apr 23 16:59:58.667399 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.667375 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:59:58.737664 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:58.737640 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz"] Apr 23 16:59:58.739844 ip-10-0-136-27 kubenswrapper[2571]: W0423 16:59:58.739812 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4720d42_33f3_4de6_afc0_553e3f79c727.slice/crio-98d8b9ef45669b7d4b166198472ec90b57fcc57c1a8c75f80263067b14ba3cd9 WatchSource:0}: Error finding container 98d8b9ef45669b7d4b166198472ec90b57fcc57c1a8c75f80263067b14ba3cd9: Status 404 returned error can't find the container with id 98d8b9ef45669b7d4b166198472ec90b57fcc57c1a8c75f80263067b14ba3cd9 Apr 23 16:59:59.600138 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:59.600096 2571 generic.go:358] "Generic (PLEG): container finished" podID="d4720d42-33f3-4de6-afc0-553e3f79c727" containerID="e16debfaafc2afbcc78a84354d254d395221dc32f66428cf6d76c4c2086e64e6" exitCode=0 Apr 23 16:59:59.600585 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:59.600172 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" event={"ID":"d4720d42-33f3-4de6-afc0-553e3f79c727","Type":"ContainerDied","Data":"e16debfaafc2afbcc78a84354d254d395221dc32f66428cf6d76c4c2086e64e6"} Apr 23 16:59:59.600585 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:59.600208 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" event={"ID":"d4720d42-33f3-4de6-afc0-553e3f79c727","Type":"ContainerStarted","Data":"98d8b9ef45669b7d4b166198472ec90b57fcc57c1a8c75f80263067b14ba3cd9"} Apr 23 16:59:59.601943 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:59.601916 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"284e7c02-9985-4226-b5a4-02015d44ebc3","Type":"ContainerStarted","Data":"fa2f79c964215c5a53ac8cbe88f2a956cc84eede3cbdfe3af1d98c328c573a1e"} Apr 23 16:59:59.602029 ip-10-0-136-27 kubenswrapper[2571]: I0423 16:59:59.601952 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"284e7c02-9985-4226-b5a4-02015d44ebc3","Type":"ContainerStarted","Data":"8e4db29a4244911c905390b241a5a49bee8d52af87bc9d2da949c77853258718"} Apr 23 17:00:00.607502 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:00.607453 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" event={"ID":"d4720d42-33f3-4de6-afc0-553e3f79c727","Type":"ContainerStarted","Data":"424ef288ba7005024bdcfdce6ea2e6fa0f7f131f1111f30b7cfdeb7e5a3dedc3"} Apr 23 17:00:00.607502 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:00.607494 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" event={"ID":"d4720d42-33f3-4de6-afc0-553e3f79c727","Type":"ContainerStarted","Data":"33c8bbeb814721b113b9974af45628fe8c606fa05750c6f897941e1c1e7abdc4"} Apr 23 17:00:00.633576 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:00.633519 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" podStartSLOduration=2.633499087 podStartE2EDuration="2.633499087s" podCreationTimestamp="2026-04-23 16:59:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:00:00.632380904 +0000 UTC m=+1483.713823500" watchObservedRunningTime="2026-04-23 17:00:00.633499087 +0000 UTC m=+1483.714941657" Apr 23 17:00:01.611345 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:01.611306 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 17:00:03.619354 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:03.619321 2571 generic.go:358] "Generic (PLEG): container 
finished" podID="284e7c02-9985-4226-b5a4-02015d44ebc3" containerID="fa2f79c964215c5a53ac8cbe88f2a956cc84eede3cbdfe3af1d98c328c573a1e" exitCode=0 Apr 23 17:00:03.619751 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:03.619374 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"284e7c02-9985-4226-b5a4-02015d44ebc3","Type":"ContainerDied","Data":"fa2f79c964215c5a53ac8cbe88f2a956cc84eede3cbdfe3af1d98c328c573a1e"} Apr 23 17:00:08.594604 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:08.594567 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 17:00:08.595150 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:08.594617 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 17:00:08.596415 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:08.596118 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" podUID="d4720d42-33f3-4de6-afc0-553e3f79c727" containerName="tokenizer" probeResult="failure" output="Get \"http://10.134.0.45:8082/healthz\": dial tcp 10.134.0.45:8082: connect: connection refused" Apr 23 17:00:17.509367 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:17.509331 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8sjkd_036b63b0-d570-44cc-b606-bb46f38e6753/console-operator/1.log" Apr 23 17:00:17.515233 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:17.515208 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8sjkd_036b63b0-d570-44cc-b606-bb46f38e6753/console-operator/1.log" Apr 23 17:00:18.596360 ip-10-0-136-27 
kubenswrapper[2571]: I0423 17:00:18.596327 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 17:00:18.597810 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:18.597787 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 17:00:31.741677 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:31.741639 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"284e7c02-9985-4226-b5a4-02015d44ebc3","Type":"ContainerStarted","Data":"be0a18bee39e1760e0763635f2aca292e4dd06f4eb1afe7e46ec9f8ad9055191"} Apr 23 17:00:31.767616 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:31.767563 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=6.387745944 podStartE2EDuration="33.767548293s" podCreationTimestamp="2026-04-23 16:59:58 +0000 UTC" firstStartedPulling="2026-04-23 17:00:03.6207448 +0000 UTC m=+1486.702187361" lastFinishedPulling="2026-04-23 17:00:31.000547132 +0000 UTC m=+1514.081989710" observedRunningTime="2026-04-23 17:00:31.76612083 +0000 UTC m=+1514.847563399" watchObservedRunningTime="2026-04-23 17:00:31.767548293 +0000 UTC m=+1514.848990862" Apr 23 17:00:38.689481 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:38.689403 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 17:00:52.968650 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:52.968615 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv"] Apr 23 17:00:52.969240 
ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:52.968936 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" podUID="73d5c8a9-5326-4c88-b4a2-e1699bf55d9a" containerName="main" containerID="cri-o://1c2b0d084db89bbe43b1c26e00977b15c08dcee2d4922c6239b2b3b01cfa77fd" gracePeriod=30 Apr 23 17:00:52.969240 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:52.969070 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" podUID="73d5c8a9-5326-4c88-b4a2-e1699bf55d9a" containerName="tokenizer" containerID="cri-o://19b500b4e3ccaf9952190284281f885e3b1e4cb1db08e32bf050447bffeb0365" gracePeriod=30 Apr 23 17:00:53.823125 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:53.823085 2571 generic.go:358] "Generic (PLEG): container finished" podID="73d5c8a9-5326-4c88-b4a2-e1699bf55d9a" containerID="1c2b0d084db89bbe43b1c26e00977b15c08dcee2d4922c6239b2b3b01cfa77fd" exitCode=0 Apr 23 17:00:53.823299 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:53.823160 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" event={"ID":"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a","Type":"ContainerDied","Data":"1c2b0d084db89bbe43b1c26e00977b15c08dcee2d4922c6239b2b3b01cfa77fd"} Apr 23 17:00:54.340357 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.340330 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 17:00:54.410126 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.410036 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-kserve-provision-location\") pod \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " Apr 23 17:00:54.410126 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.410101 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tokenizer-tmp\") pod \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " Apr 23 17:00:54.410382 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.410130 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tokenizer-uds\") pod \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " Apr 23 17:00:54.410382 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.410224 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tls-certs\") pod \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " Apr 23 17:00:54.410382 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.410298 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tokenizer-cache\") pod \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " Apr 23 
17:00:54.410382 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.410330 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cp8s\" (UniqueName: \"kubernetes.io/projected/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-kube-api-access-5cp8s\") pod \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\" (UID: \"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a\") " Apr 23 17:00:54.410593 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.410446 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "73d5c8a9-5326-4c88-b4a2-e1699bf55d9a" (UID: "73d5c8a9-5326-4c88-b4a2-e1699bf55d9a"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:00:54.410593 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.410493 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "73d5c8a9-5326-4c88-b4a2-e1699bf55d9a" (UID: "73d5c8a9-5326-4c88-b4a2-e1699bf55d9a"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:00:54.410593 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.410535 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "73d5c8a9-5326-4c88-b4a2-e1699bf55d9a" (UID: "73d5c8a9-5326-4c88-b4a2-e1699bf55d9a"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:00:54.410777 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.410717 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tokenizer-cache\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:00:54.410777 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.410735 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tokenizer-tmp\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:00:54.410777 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.410744 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tokenizer-uds\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:00:54.410926 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.410903 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "73d5c8a9-5326-4c88-b4a2-e1699bf55d9a" (UID: "73d5c8a9-5326-4c88-b4a2-e1699bf55d9a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:00:54.412644 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.412611 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "73d5c8a9-5326-4c88-b4a2-e1699bf55d9a" (UID: "73d5c8a9-5326-4c88-b4a2-e1699bf55d9a"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:00:54.412786 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.412645 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-kube-api-access-5cp8s" (OuterVolumeSpecName: "kube-api-access-5cp8s") pod "73d5c8a9-5326-4c88-b4a2-e1699bf55d9a" (UID: "73d5c8a9-5326-4c88-b4a2-e1699bf55d9a"). InnerVolumeSpecName "kube-api-access-5cp8s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:00:54.511645 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.511607 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5cp8s\" (UniqueName: \"kubernetes.io/projected/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-kube-api-access-5cp8s\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:00:54.511645 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.511637 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-kserve-provision-location\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:00:54.511645 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.511649 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a-tls-certs\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:00:54.828645 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.828613 2571 generic.go:358] "Generic (PLEG): container finished" podID="73d5c8a9-5326-4c88-b4a2-e1699bf55d9a" containerID="19b500b4e3ccaf9952190284281f885e3b1e4cb1db08e32bf050447bffeb0365" exitCode=0 Apr 23 17:00:54.828848 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.828679 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" 
event={"ID":"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a","Type":"ContainerDied","Data":"19b500b4e3ccaf9952190284281f885e3b1e4cb1db08e32bf050447bffeb0365"} Apr 23 17:00:54.828848 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.828727 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" event={"ID":"73d5c8a9-5326-4c88-b4a2-e1699bf55d9a","Type":"ContainerDied","Data":"13aac1952d80842db93a67957a5c1c49341ddcf93a02a01e72cb4abe9f72b161"} Apr 23 17:00:54.828848 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.828743 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv" Apr 23 17:00:54.829007 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.828748 2571 scope.go:117] "RemoveContainer" containerID="19b500b4e3ccaf9952190284281f885e3b1e4cb1db08e32bf050447bffeb0365" Apr 23 17:00:54.837529 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.837508 2571 scope.go:117] "RemoveContainer" containerID="1c2b0d084db89bbe43b1c26e00977b15c08dcee2d4922c6239b2b3b01cfa77fd" Apr 23 17:00:54.846491 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.846474 2571 scope.go:117] "RemoveContainer" containerID="03a17210695f21a7c4b0e1cdc6b24543d1098b6eec8259d448f7ff5c05678d80" Apr 23 17:00:54.854594 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.854571 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv"] Apr 23 17:00:54.855191 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.855161 2571 scope.go:117] "RemoveContainer" containerID="19b500b4e3ccaf9952190284281f885e3b1e4cb1db08e32bf050447bffeb0365" Apr 23 17:00:54.855669 ip-10-0-136-27 kubenswrapper[2571]: E0423 17:00:54.855632 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"19b500b4e3ccaf9952190284281f885e3b1e4cb1db08e32bf050447bffeb0365\": container with ID starting with 19b500b4e3ccaf9952190284281f885e3b1e4cb1db08e32bf050447bffeb0365 not found: ID does not exist" containerID="19b500b4e3ccaf9952190284281f885e3b1e4cb1db08e32bf050447bffeb0365" Apr 23 17:00:54.855791 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.855676 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19b500b4e3ccaf9952190284281f885e3b1e4cb1db08e32bf050447bffeb0365"} err="failed to get container status \"19b500b4e3ccaf9952190284281f885e3b1e4cb1db08e32bf050447bffeb0365\": rpc error: code = NotFound desc = could not find container \"19b500b4e3ccaf9952190284281f885e3b1e4cb1db08e32bf050447bffeb0365\": container with ID starting with 19b500b4e3ccaf9952190284281f885e3b1e4cb1db08e32bf050447bffeb0365 not found: ID does not exist" Apr 23 17:00:54.855791 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.855748 2571 scope.go:117] "RemoveContainer" containerID="1c2b0d084db89bbe43b1c26e00977b15c08dcee2d4922c6239b2b3b01cfa77fd" Apr 23 17:00:54.856076 ip-10-0-136-27 kubenswrapper[2571]: E0423 17:00:54.856047 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c2b0d084db89bbe43b1c26e00977b15c08dcee2d4922c6239b2b3b01cfa77fd\": container with ID starting with 1c2b0d084db89bbe43b1c26e00977b15c08dcee2d4922c6239b2b3b01cfa77fd not found: ID does not exist" containerID="1c2b0d084db89bbe43b1c26e00977b15c08dcee2d4922c6239b2b3b01cfa77fd" Apr 23 17:00:54.856186 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.856083 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c2b0d084db89bbe43b1c26e00977b15c08dcee2d4922c6239b2b3b01cfa77fd"} err="failed to get container status \"1c2b0d084db89bbe43b1c26e00977b15c08dcee2d4922c6239b2b3b01cfa77fd\": rpc error: code = NotFound desc = could not find container 
\"1c2b0d084db89bbe43b1c26e00977b15c08dcee2d4922c6239b2b3b01cfa77fd\": container with ID starting with 1c2b0d084db89bbe43b1c26e00977b15c08dcee2d4922c6239b2b3b01cfa77fd not found: ID does not exist" Apr 23 17:00:54.856186 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.856105 2571 scope.go:117] "RemoveContainer" containerID="03a17210695f21a7c4b0e1cdc6b24543d1098b6eec8259d448f7ff5c05678d80" Apr 23 17:00:54.856422 ip-10-0-136-27 kubenswrapper[2571]: E0423 17:00:54.856399 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03a17210695f21a7c4b0e1cdc6b24543d1098b6eec8259d448f7ff5c05678d80\": container with ID starting with 03a17210695f21a7c4b0e1cdc6b24543d1098b6eec8259d448f7ff5c05678d80 not found: ID does not exist" containerID="03a17210695f21a7c4b0e1cdc6b24543d1098b6eec8259d448f7ff5c05678d80" Apr 23 17:00:54.856540 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.856426 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03a17210695f21a7c4b0e1cdc6b24543d1098b6eec8259d448f7ff5c05678d80"} err="failed to get container status \"03a17210695f21a7c4b0e1cdc6b24543d1098b6eec8259d448f7ff5c05678d80\": rpc error: code = NotFound desc = could not find container \"03a17210695f21a7c4b0e1cdc6b24543d1098b6eec8259d448f7ff5c05678d80\": container with ID starting with 03a17210695f21a7c4b0e1cdc6b24543d1098b6eec8259d448f7ff5c05678d80 not found: ID does not exist" Apr 23 17:00:54.857994 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:54.857974 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepqcpv"] Apr 23 17:00:55.512577 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:00:55.512539 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73d5c8a9-5326-4c88-b4a2-e1699bf55d9a" path="/var/lib/kubelet/pods/73d5c8a9-5326-4c88-b4a2-e1699bf55d9a/volumes" Apr 23 
17:01:00.429972 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.429939 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj"] Apr 23 17:01:00.430465 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.430313 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73d5c8a9-5326-4c88-b4a2-e1699bf55d9a" containerName="tokenizer" Apr 23 17:01:00.430465 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.430325 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d5c8a9-5326-4c88-b4a2-e1699bf55d9a" containerName="tokenizer" Apr 23 17:01:00.430465 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.430348 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73d5c8a9-5326-4c88-b4a2-e1699bf55d9a" containerName="main" Apr 23 17:01:00.430465 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.430353 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d5c8a9-5326-4c88-b4a2-e1699bf55d9a" containerName="main" Apr 23 17:01:00.430465 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.430362 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73d5c8a9-5326-4c88-b4a2-e1699bf55d9a" containerName="storage-initializer" Apr 23 17:01:00.430465 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.430368 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d5c8a9-5326-4c88-b4a2-e1699bf55d9a" containerName="storage-initializer" Apr 23 17:01:00.430465 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.430435 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="73d5c8a9-5326-4c88-b4a2-e1699bf55d9a" containerName="tokenizer" Apr 23 17:01:00.430465 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.430441 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="73d5c8a9-5326-4c88-b4a2-e1699bf55d9a" containerName="main" Apr 23 17:01:00.435285 ip-10-0-136-27 
kubenswrapper[2571]: I0423 17:01:00.435253 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:00.437905 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.437879 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 23 17:01:00.438045 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.437933 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-p5q8p\"" Apr 23 17:01:00.448280 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.448255 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj"] Apr 23 17:01:00.469854 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.469828 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-home\") pod \"custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:00.470011 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.469885 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt5cc\" (UniqueName: \"kubernetes.io/projected/e146ba56-294d-4f3e-98aa-de03e20906dd-kube-api-access-rt5cc\") pod \"custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:00.470011 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.469939 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:00.470011 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.469964 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e146ba56-294d-4f3e-98aa-de03e20906dd-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:00.470011 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.469989 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-model-cache\") pod \"custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:00.470011 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.470007 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-dshm\") pod \"custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:00.570524 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.570484 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-home\") pod \"custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:00.570753 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.570568 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rt5cc\" (UniqueName: \"kubernetes.io/projected/e146ba56-294d-4f3e-98aa-de03e20906dd-kube-api-access-rt5cc\") pod \"custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:00.570753 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.570611 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:00.570753 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.570640 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e146ba56-294d-4f3e-98aa-de03e20906dd-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:00.570753 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.570669 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-model-cache\") pod 
\"custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:00.570753 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.570720 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-dshm\") pod \"custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:00.571064 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.571020 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-home\") pod \"custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:00.571164 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.571138 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:00.571227 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.571182 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-model-cache\") pod \"custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 
17:01:00.573212 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.573185 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-dshm\") pod \"custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:00.573548 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.573527 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e146ba56-294d-4f3e-98aa-de03e20906dd-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:00.578941 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.578915 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt5cc\" (UniqueName: \"kubernetes.io/projected/e146ba56-294d-4f3e-98aa-de03e20906dd-kube-api-access-rt5cc\") pod \"custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:00.748384 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.748348 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:00.808744 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.808649 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4"] Apr 23 17:01:00.816177 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.816147 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:00.820216 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.820001 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-p6v8r\"" Apr 23 17:01:00.831802 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.831768 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4"] Apr 23 17:01:00.873517 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.873487 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:00.873517 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.873529 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:00.873843 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.873617 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4\" (UID: 
\"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:00.873843 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.873772 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:00.873978 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.873851 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:00.873978 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.873937 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xr9v\" (UniqueName: \"kubernetes.io/projected/83208b4d-bbd1-402e-98b5-cbfe523bcc53-kube-api-access-9xr9v\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:00.936398 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.936370 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj"] Apr 23 17:01:00.937373 ip-10-0-136-27 kubenswrapper[2571]: W0423 17:01:00.937349 2571 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode146ba56_294d_4f3e_98aa_de03e20906dd.slice/crio-cafadcf3a308036ee501addf26ebe43baf7317168f0898b5972236800675d27b WatchSource:0}: Error finding container cafadcf3a308036ee501addf26ebe43baf7317168f0898b5972236800675d27b: Status 404 returned error can't find the container with id cafadcf3a308036ee501addf26ebe43baf7317168f0898b5972236800675d27b Apr 23 17:01:00.974665 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.974637 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xr9v\" (UniqueName: \"kubernetes.io/projected/83208b4d-bbd1-402e-98b5-cbfe523bcc53-kube-api-access-9xr9v\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:00.974875 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.974718 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:00.974875 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.974743 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:00.974875 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.974763 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:00.974875 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.974846 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:00.975092 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.974907 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:00.975238 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.975213 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:00.975310 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.975283 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:00.975310 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.975300 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:00.975552 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.975523 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:00.977646 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.977623 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:00.985571 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:00.985548 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xr9v\" (UniqueName: 
\"kubernetes.io/projected/83208b4d-bbd1-402e-98b5-cbfe523bcc53-kube-api-access-9xr9v\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:01.130341 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:01.130243 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:01.277734 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:01.277679 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4"] Apr 23 17:01:01.279474 ip-10-0-136-27 kubenswrapper[2571]: W0423 17:01:01.279445 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83208b4d_bbd1_402e_98b5_cbfe523bcc53.slice/crio-51764889bb4a513b86372c4c5c19f33e58edc8dca21cc945c8f59bec493b5c14 WatchSource:0}: Error finding container 51764889bb4a513b86372c4c5c19f33e58edc8dca21cc945c8f59bec493b5c14: Status 404 returned error can't find the container with id 51764889bb4a513b86372c4c5c19f33e58edc8dca21cc945c8f59bec493b5c14 Apr 23 17:01:01.860858 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:01.860817 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" event={"ID":"83208b4d-bbd1-402e-98b5-cbfe523bcc53","Type":"ContainerStarted","Data":"f24cb90d3a6b81bfed6fa3056d881814a591cedab541aba1c4295d0a702a8044"} Apr 23 17:01:01.860858 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:01.860860 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" 
event={"ID":"83208b4d-bbd1-402e-98b5-cbfe523bcc53","Type":"ContainerStarted","Data":"51764889bb4a513b86372c4c5c19f33e58edc8dca21cc945c8f59bec493b5c14"} Apr 23 17:01:01.861936 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:01.861910 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" event={"ID":"e146ba56-294d-4f3e-98aa-de03e20906dd","Type":"ContainerStarted","Data":"cafadcf3a308036ee501addf26ebe43baf7317168f0898b5972236800675d27b"} Apr 23 17:01:02.866629 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:02.866593 2571 generic.go:358] "Generic (PLEG): container finished" podID="83208b4d-bbd1-402e-98b5-cbfe523bcc53" containerID="f24cb90d3a6b81bfed6fa3056d881814a591cedab541aba1c4295d0a702a8044" exitCode=0 Apr 23 17:01:02.867161 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:02.866679 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" event={"ID":"83208b4d-bbd1-402e-98b5-cbfe523bcc53","Type":"ContainerDied","Data":"f24cb90d3a6b81bfed6fa3056d881814a591cedab541aba1c4295d0a702a8044"} Apr 23 17:01:03.874257 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:03.874216 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" event={"ID":"83208b4d-bbd1-402e-98b5-cbfe523bcc53","Type":"ContainerStarted","Data":"ecd5cc93ace5c304de48cbfe8d0a6e469e07fbc6096453575381b24b33e289af"} Apr 23 17:01:03.874257 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:03.874262 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" event={"ID":"83208b4d-bbd1-402e-98b5-cbfe523bcc53","Type":"ContainerStarted","Data":"541f12e95666d25098271f02a52ca34ea8b6270bf74563ba39dae4c611df2a4f"} Apr 23 17:01:03.874670 ip-10-0-136-27 kubenswrapper[2571]: I0423 
17:01:03.874391 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:03.897808 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:03.897762 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" podStartSLOduration=3.897747967 podStartE2EDuration="3.897747967s" podCreationTimestamp="2026-04-23 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:01:03.897448939 +0000 UTC m=+1546.978891543" watchObservedRunningTime="2026-04-23 17:01:03.897747967 +0000 UTC m=+1546.979190536" Apr 23 17:01:09.905242 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:09.905199 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" event={"ID":"e146ba56-294d-4f3e-98aa-de03e20906dd","Type":"ContainerStarted","Data":"9b4f1cbcfc5180058093f97019b0ab1774ccadcb97d97f56ec36d887a1ee4f3a"} Apr 23 17:01:09.905788 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:09.905336 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:10.911615 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:10.911577 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" event={"ID":"e146ba56-294d-4f3e-98aa-de03e20906dd","Type":"ContainerStarted","Data":"2e9125aafaac1ed12fe134e8926ccf5045a21a472b79f4e6fba77175e343d9b1"} Apr 23 17:01:11.130452 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:11.130413 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:11.130648 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:11.130467 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:11.133929 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:11.133902 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:11.916627 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:11.916588 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:14.926995 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:14.926956 2571 generic.go:358] "Generic (PLEG): container finished" podID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerID="2e9125aafaac1ed12fe134e8926ccf5045a21a472b79f4e6fba77175e343d9b1" exitCode=0 Apr 23 17:01:14.927372 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:14.927015 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" event={"ID":"e146ba56-294d-4f3e-98aa-de03e20906dd","Type":"ContainerDied","Data":"2e9125aafaac1ed12fe134e8926ccf5045a21a472b79f4e6fba77175e343d9b1"} Apr 23 17:01:15.932961 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:15.932918 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" event={"ID":"e146ba56-294d-4f3e-98aa-de03e20906dd","Type":"ContainerStarted","Data":"f71f07cf91bbd326d0898c36f36e868d8e7cbcdce887abd38cf7c846c454d4b0"} Apr 23 17:01:15.957383 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:15.957323 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" podStartSLOduration=7.491234521 podStartE2EDuration="15.957302862s" podCreationTimestamp="2026-04-23 17:01:00 +0000 UTC" firstStartedPulling="2026-04-23 17:01:00.93989234 +0000 UTC m=+1544.021334904" lastFinishedPulling="2026-04-23 17:01:09.405960699 +0000 UTC m=+1552.487403245" observedRunningTime="2026-04-23 17:01:15.954441206 +0000 UTC m=+1559.035883787" watchObservedRunningTime="2026-04-23 17:01:15.957302862 +0000 UTC m=+1559.038745432" Apr 23 17:01:20.749220 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:20.749171 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:20.749220 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:20.749227 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:20.750786 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:20.750744 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" podUID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 23 17:01:20.762396 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:20.762369 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:01:30.748889 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:30.748840 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" podUID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 
10.134.0.46:8001: connect: connection refused" Apr 23 17:01:40.749305 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:40.749262 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" podUID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 23 17:01:42.920820 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:42.920791 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:01:50.749157 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:50.749107 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" podUID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 23 17:01:56.360223 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:56.360186 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz"] Apr 23 17:01:56.360746 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:56.360608 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" podUID="d4720d42-33f3-4de6-afc0-553e3f79c727" containerName="main" containerID="cri-o://33c8bbeb814721b113b9974af45628fe8c606fa05750c6f897941e1c1e7abdc4" gracePeriod=30 Apr 23 17:01:56.360831 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:56.360755 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" 
podUID="d4720d42-33f3-4de6-afc0-553e3f79c727" containerName="tokenizer" containerID="cri-o://424ef288ba7005024bdcfdce6ea2e6fa0f7f131f1111f30b7cfdeb7e5a3dedc3" gracePeriod=30 Apr 23 17:01:57.099326 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.099282 2571 generic.go:358] "Generic (PLEG): container finished" podID="d4720d42-33f3-4de6-afc0-553e3f79c727" containerID="33c8bbeb814721b113b9974af45628fe8c606fa05750c6f897941e1c1e7abdc4" exitCode=0 Apr 23 17:01:57.099326 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.099320 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" event={"ID":"d4720d42-33f3-4de6-afc0-553e3f79c727","Type":"ContainerDied","Data":"33c8bbeb814721b113b9974af45628fe8c606fa05750c6f897941e1c1e7abdc4"} Apr 23 17:01:57.731247 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.731223 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 17:01:57.781913 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.781821 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-kserve-provision-location\") pod \"d4720d42-33f3-4de6-afc0-553e3f79c727\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " Apr 23 17:01:57.781913 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.781882 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p6nj\" (UniqueName: \"kubernetes.io/projected/d4720d42-33f3-4de6-afc0-553e3f79c727-kube-api-access-6p6nj\") pod \"d4720d42-33f3-4de6-afc0-553e3f79c727\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " Apr 23 17:01:57.782163 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.781940 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-tokenizer-tmp\") pod \"d4720d42-33f3-4de6-afc0-553e3f79c727\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " Apr 23 17:01:57.782163 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.781983 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-tokenizer-cache\") pod \"d4720d42-33f3-4de6-afc0-553e3f79c727\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " Apr 23 17:01:57.782163 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.782039 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4720d42-33f3-4de6-afc0-553e3f79c727-tls-certs\") pod \"d4720d42-33f3-4de6-afc0-553e3f79c727\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " Apr 23 17:01:57.782163 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.782100 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-tokenizer-uds\") pod \"d4720d42-33f3-4de6-afc0-553e3f79c727\" (UID: \"d4720d42-33f3-4de6-afc0-553e3f79c727\") " Apr 23 17:01:57.782501 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.782403 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "d4720d42-33f3-4de6-afc0-553e3f79c727" (UID: "d4720d42-33f3-4de6-afc0-553e3f79c727"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:01:57.782501 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.782489 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "d4720d42-33f3-4de6-afc0-553e3f79c727" (UID: "d4720d42-33f3-4de6-afc0-553e3f79c727"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:01:57.782649 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.782594 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "d4720d42-33f3-4de6-afc0-553e3f79c727" (UID: "d4720d42-33f3-4de6-afc0-553e3f79c727"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:01:57.782909 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.782886 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d4720d42-33f3-4de6-afc0-553e3f79c727" (UID: "d4720d42-33f3-4de6-afc0-553e3f79c727"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:01:57.784690 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.784662 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4720d42-33f3-4de6-afc0-553e3f79c727-kube-api-access-6p6nj" (OuterVolumeSpecName: "kube-api-access-6p6nj") pod "d4720d42-33f3-4de6-afc0-553e3f79c727" (UID: "d4720d42-33f3-4de6-afc0-553e3f79c727"). InnerVolumeSpecName "kube-api-access-6p6nj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:01:57.785000 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.784978 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4720d42-33f3-4de6-afc0-553e3f79c727-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d4720d42-33f3-4de6-afc0-553e3f79c727" (UID: "d4720d42-33f3-4de6-afc0-553e3f79c727"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:01:57.883226 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.883188 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-tokenizer-tmp\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:01:57.883226 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.883224 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-tokenizer-cache\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:01:57.883226 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.883234 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4720d42-33f3-4de6-afc0-553e3f79c727-tls-certs\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:01:57.883507 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.883244 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-tokenizer-uds\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:01:57.883507 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.883255 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4720d42-33f3-4de6-afc0-553e3f79c727-kserve-provision-location\") on 
node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:01:57.883507 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:57.883265 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6p6nj\" (UniqueName: \"kubernetes.io/projected/d4720d42-33f3-4de6-afc0-553e3f79c727-kube-api-access-6p6nj\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:01:58.105689 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:58.105598 2571 generic.go:358] "Generic (PLEG): container finished" podID="d4720d42-33f3-4de6-afc0-553e3f79c727" containerID="424ef288ba7005024bdcfdce6ea2e6fa0f7f131f1111f30b7cfdeb7e5a3dedc3" exitCode=0 Apr 23 17:01:58.105869 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:58.105729 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" Apr 23 17:01:58.105869 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:58.105724 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" event={"ID":"d4720d42-33f3-4de6-afc0-553e3f79c727","Type":"ContainerDied","Data":"424ef288ba7005024bdcfdce6ea2e6fa0f7f131f1111f30b7cfdeb7e5a3dedc3"} Apr 23 17:01:58.105869 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:58.105846 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz" event={"ID":"d4720d42-33f3-4de6-afc0-553e3f79c727","Type":"ContainerDied","Data":"98d8b9ef45669b7d4b166198472ec90b57fcc57c1a8c75f80263067b14ba3cd9"} Apr 23 17:01:58.105869 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:58.105866 2571 scope.go:117] "RemoveContainer" containerID="424ef288ba7005024bdcfdce6ea2e6fa0f7f131f1111f30b7cfdeb7e5a3dedc3" Apr 23 17:01:58.116264 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:58.116240 2571 scope.go:117] "RemoveContainer" 
containerID="33c8bbeb814721b113b9974af45628fe8c606fa05750c6f897941e1c1e7abdc4" Apr 23 17:01:58.124788 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:58.124766 2571 scope.go:117] "RemoveContainer" containerID="e16debfaafc2afbcc78a84354d254d395221dc32f66428cf6d76c4c2086e64e6" Apr 23 17:01:58.131804 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:58.131777 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz"] Apr 23 17:01:58.135653 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:58.135629 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schelfprz"] Apr 23 17:01:58.136619 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:58.136600 2571 scope.go:117] "RemoveContainer" containerID="424ef288ba7005024bdcfdce6ea2e6fa0f7f131f1111f30b7cfdeb7e5a3dedc3" Apr 23 17:01:58.137078 ip-10-0-136-27 kubenswrapper[2571]: E0423 17:01:58.137047 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"424ef288ba7005024bdcfdce6ea2e6fa0f7f131f1111f30b7cfdeb7e5a3dedc3\": container with ID starting with 424ef288ba7005024bdcfdce6ea2e6fa0f7f131f1111f30b7cfdeb7e5a3dedc3 not found: ID does not exist" containerID="424ef288ba7005024bdcfdce6ea2e6fa0f7f131f1111f30b7cfdeb7e5a3dedc3" Apr 23 17:01:58.137173 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:58.137092 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"424ef288ba7005024bdcfdce6ea2e6fa0f7f131f1111f30b7cfdeb7e5a3dedc3"} err="failed to get container status \"424ef288ba7005024bdcfdce6ea2e6fa0f7f131f1111f30b7cfdeb7e5a3dedc3\": rpc error: code = NotFound desc = could not find container \"424ef288ba7005024bdcfdce6ea2e6fa0f7f131f1111f30b7cfdeb7e5a3dedc3\": container with ID starting with 424ef288ba7005024bdcfdce6ea2e6fa0f7f131f1111f30b7cfdeb7e5a3dedc3 not found: 
ID does not exist" Apr 23 17:01:58.137173 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:58.137117 2571 scope.go:117] "RemoveContainer" containerID="33c8bbeb814721b113b9974af45628fe8c606fa05750c6f897941e1c1e7abdc4" Apr 23 17:01:58.137429 ip-10-0-136-27 kubenswrapper[2571]: E0423 17:01:58.137412 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33c8bbeb814721b113b9974af45628fe8c606fa05750c6f897941e1c1e7abdc4\": container with ID starting with 33c8bbeb814721b113b9974af45628fe8c606fa05750c6f897941e1c1e7abdc4 not found: ID does not exist" containerID="33c8bbeb814721b113b9974af45628fe8c606fa05750c6f897941e1c1e7abdc4" Apr 23 17:01:58.137505 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:58.137439 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33c8bbeb814721b113b9974af45628fe8c606fa05750c6f897941e1c1e7abdc4"} err="failed to get container status \"33c8bbeb814721b113b9974af45628fe8c606fa05750c6f897941e1c1e7abdc4\": rpc error: code = NotFound desc = could not find container \"33c8bbeb814721b113b9974af45628fe8c606fa05750c6f897941e1c1e7abdc4\": container with ID starting with 33c8bbeb814721b113b9974af45628fe8c606fa05750c6f897941e1c1e7abdc4 not found: ID does not exist" Apr 23 17:01:58.137505 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:58.137459 2571 scope.go:117] "RemoveContainer" containerID="e16debfaafc2afbcc78a84354d254d395221dc32f66428cf6d76c4c2086e64e6" Apr 23 17:01:58.137780 ip-10-0-136-27 kubenswrapper[2571]: E0423 17:01:58.137756 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e16debfaafc2afbcc78a84354d254d395221dc32f66428cf6d76c4c2086e64e6\": container with ID starting with e16debfaafc2afbcc78a84354d254d395221dc32f66428cf6d76c4c2086e64e6 not found: ID does not exist" containerID="e16debfaafc2afbcc78a84354d254d395221dc32f66428cf6d76c4c2086e64e6" Apr 23 
17:01:58.137848 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:58.137791 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e16debfaafc2afbcc78a84354d254d395221dc32f66428cf6d76c4c2086e64e6"} err="failed to get container status \"e16debfaafc2afbcc78a84354d254d395221dc32f66428cf6d76c4c2086e64e6\": rpc error: code = NotFound desc = could not find container \"e16debfaafc2afbcc78a84354d254d395221dc32f66428cf6d76c4c2086e64e6\": container with ID starting with e16debfaafc2afbcc78a84354d254d395221dc32f66428cf6d76c4c2086e64e6 not found: ID does not exist" Apr 23 17:01:58.196087 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:58.196047 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 23 17:01:58.196384 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:58.196358 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="284e7c02-9985-4226-b5a4-02015d44ebc3" containerName="main" containerID="cri-o://be0a18bee39e1760e0763635f2aca292e4dd06f4eb1afe7e46ec9f8ad9055191" gracePeriod=30 Apr 23 17:01:59.512604 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:01:59.512568 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4720d42-33f3-4de6-afc0-553e3f79c727" path="/var/lib/kubelet/pods/d4720d42-33f3-4de6-afc0-553e3f79c727/volumes" Apr 23 17:02:00.749030 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:00.748975 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" podUID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 23 17:02:10.749026 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:10.748914 2571 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" podUID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 23 17:02:20.749715 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:20.749640 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" podUID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 23 17:02:28.859923 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:28.859890 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1_284e7c02-9985-4226-b5a4-02015d44ebc3/main/0.log" Apr 23 17:02:28.860358 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:28.860337 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 17:02:28.905200 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:28.905157 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-home\") pod \"284e7c02-9985-4226-b5a4-02015d44ebc3\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " Apr 23 17:02:28.905387 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:28.905282 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxhxm\" (UniqueName: \"kubernetes.io/projected/284e7c02-9985-4226-b5a4-02015d44ebc3-kube-api-access-dxhxm\") pod \"284e7c02-9985-4226-b5a4-02015d44ebc3\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " Apr 23 17:02:28.905454 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:28.905405 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/284e7c02-9985-4226-b5a4-02015d44ebc3-tls-certs\") pod \"284e7c02-9985-4226-b5a4-02015d44ebc3\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " Apr 23 17:02:28.905514 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:28.905493 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-model-cache\") pod \"284e7c02-9985-4226-b5a4-02015d44ebc3\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " Apr 23 17:02:28.905596 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:28.905583 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-dshm\") pod \"284e7c02-9985-4226-b5a4-02015d44ebc3\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " Apr 23 17:02:28.905653 ip-10-0-136-27 kubenswrapper[2571]: I0423 
17:02:28.905626 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-kserve-provision-location\") pod \"284e7c02-9985-4226-b5a4-02015d44ebc3\" (UID: \"284e7c02-9985-4226-b5a4-02015d44ebc3\") " Apr 23 17:02:28.905818 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:28.905787 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-home" (OuterVolumeSpecName: "home") pod "284e7c02-9985-4226-b5a4-02015d44ebc3" (UID: "284e7c02-9985-4226-b5a4-02015d44ebc3"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:02:28.905887 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:28.905820 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-model-cache" (OuterVolumeSpecName: "model-cache") pod "284e7c02-9985-4226-b5a4-02015d44ebc3" (UID: "284e7c02-9985-4226-b5a4-02015d44ebc3"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:02:28.906039 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:28.906017 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-model-cache\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:02:28.906039 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:28.906038 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-home\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:02:28.907832 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:28.907801 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/284e7c02-9985-4226-b5a4-02015d44ebc3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "284e7c02-9985-4226-b5a4-02015d44ebc3" (UID: "284e7c02-9985-4226-b5a4-02015d44ebc3"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:02:28.907832 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:28.907815 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/284e7c02-9985-4226-b5a4-02015d44ebc3-kube-api-access-dxhxm" (OuterVolumeSpecName: "kube-api-access-dxhxm") pod "284e7c02-9985-4226-b5a4-02015d44ebc3" (UID: "284e7c02-9985-4226-b5a4-02015d44ebc3"). InnerVolumeSpecName "kube-api-access-dxhxm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:02:28.907955 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:28.907892 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-dshm" (OuterVolumeSpecName: "dshm") pod "284e7c02-9985-4226-b5a4-02015d44ebc3" (UID: "284e7c02-9985-4226-b5a4-02015d44ebc3"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:02:28.966323 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:28.966228 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "284e7c02-9985-4226-b5a4-02015d44ebc3" (UID: "284e7c02-9985-4226-b5a4-02015d44ebc3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:02:29.007364 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:29.007335 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-dshm\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:02:29.007364 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:29.007365 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/284e7c02-9985-4226-b5a4-02015d44ebc3-kserve-provision-location\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:02:29.007534 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:29.007376 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dxhxm\" (UniqueName: \"kubernetes.io/projected/284e7c02-9985-4226-b5a4-02015d44ebc3-kube-api-access-dxhxm\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:02:29.007534 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:29.007389 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/284e7c02-9985-4226-b5a4-02015d44ebc3-tls-certs\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:02:29.231916 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:29.231843 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1_284e7c02-9985-4226-b5a4-02015d44ebc3/main/0.log" Apr 23 17:02:29.232207 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:29.232186 2571 generic.go:358] "Generic (PLEG): container finished" podID="284e7c02-9985-4226-b5a4-02015d44ebc3" containerID="be0a18bee39e1760e0763635f2aca292e4dd06f4eb1afe7e46ec9f8ad9055191" exitCode=137 Apr 23 17:02:29.232258 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:29.232250 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"284e7c02-9985-4226-b5a4-02015d44ebc3","Type":"ContainerDied","Data":"be0a18bee39e1760e0763635f2aca292e4dd06f4eb1afe7e46ec9f8ad9055191"} Apr 23 17:02:29.232295 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:29.232276 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"284e7c02-9985-4226-b5a4-02015d44ebc3","Type":"ContainerDied","Data":"8e4db29a4244911c905390b241a5a49bee8d52af87bc9d2da949c77853258718"} Apr 23 17:02:29.232295 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:29.232283 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 17:02:29.232295 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:29.232292 2571 scope.go:117] "RemoveContainer" containerID="be0a18bee39e1760e0763635f2aca292e4dd06f4eb1afe7e46ec9f8ad9055191" Apr 23 17:02:29.252935 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:29.252913 2571 scope.go:117] "RemoveContainer" containerID="fa2f79c964215c5a53ac8cbe88f2a956cc84eede3cbdfe3af1d98c328c573a1e" Apr 23 17:02:29.258724 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:29.258683 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 23 17:02:29.264724 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:29.264683 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 23 17:02:29.314296 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:29.314276 2571 scope.go:117] "RemoveContainer" containerID="be0a18bee39e1760e0763635f2aca292e4dd06f4eb1afe7e46ec9f8ad9055191" Apr 23 17:02:29.314618 ip-10-0-136-27 kubenswrapper[2571]: E0423 17:02:29.314596 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be0a18bee39e1760e0763635f2aca292e4dd06f4eb1afe7e46ec9f8ad9055191\": container with ID starting with be0a18bee39e1760e0763635f2aca292e4dd06f4eb1afe7e46ec9f8ad9055191 not found: ID does not exist" containerID="be0a18bee39e1760e0763635f2aca292e4dd06f4eb1afe7e46ec9f8ad9055191" Apr 23 17:02:29.314682 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:29.314625 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be0a18bee39e1760e0763635f2aca292e4dd06f4eb1afe7e46ec9f8ad9055191"} err="failed to get container status \"be0a18bee39e1760e0763635f2aca292e4dd06f4eb1afe7e46ec9f8ad9055191\": rpc error: code = NotFound desc 
= could not find container \"be0a18bee39e1760e0763635f2aca292e4dd06f4eb1afe7e46ec9f8ad9055191\": container with ID starting with be0a18bee39e1760e0763635f2aca292e4dd06f4eb1afe7e46ec9f8ad9055191 not found: ID does not exist" Apr 23 17:02:29.314682 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:29.314643 2571 scope.go:117] "RemoveContainer" containerID="fa2f79c964215c5a53ac8cbe88f2a956cc84eede3cbdfe3af1d98c328c573a1e" Apr 23 17:02:29.314934 ip-10-0-136-27 kubenswrapper[2571]: E0423 17:02:29.314917 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa2f79c964215c5a53ac8cbe88f2a956cc84eede3cbdfe3af1d98c328c573a1e\": container with ID starting with fa2f79c964215c5a53ac8cbe88f2a956cc84eede3cbdfe3af1d98c328c573a1e not found: ID does not exist" containerID="fa2f79c964215c5a53ac8cbe88f2a956cc84eede3cbdfe3af1d98c328c573a1e" Apr 23 17:02:29.314988 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:29.314937 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa2f79c964215c5a53ac8cbe88f2a956cc84eede3cbdfe3af1d98c328c573a1e"} err="failed to get container status \"fa2f79c964215c5a53ac8cbe88f2a956cc84eede3cbdfe3af1d98c328c573a1e\": rpc error: code = NotFound desc = could not find container \"fa2f79c964215c5a53ac8cbe88f2a956cc84eede3cbdfe3af1d98c328c573a1e\": container with ID starting with fa2f79c964215c5a53ac8cbe88f2a956cc84eede3cbdfe3af1d98c328c573a1e not found: ID does not exist" Apr 23 17:02:29.512277 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:29.512194 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="284e7c02-9985-4226-b5a4-02015d44ebc3" path="/var/lib/kubelet/pods/284e7c02-9985-4226-b5a4-02015d44ebc3/volumes" Apr 23 17:02:30.749209 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:30.749158 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" 
podUID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 23 17:02:40.749182 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:40.749129 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" podUID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 23 17:02:50.759789 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:50.759760 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:02:50.779529 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:02:50.779503 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" Apr 23 17:03:02.109163 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:02.109123 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4"] Apr 23 17:03:02.110151 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:02.109532 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" podUID="83208b4d-bbd1-402e-98b5-cbfe523bcc53" containerName="main" containerID="cri-o://541f12e95666d25098271f02a52ca34ea8b6270bf74563ba39dae4c611df2a4f" gracePeriod=30 Apr 23 17:03:02.110151 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:02.109597 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" podUID="83208b4d-bbd1-402e-98b5-cbfe523bcc53" 
containerName="tokenizer" containerID="cri-o://ecd5cc93ace5c304de48cbfe8d0a6e469e07fbc6096453575381b24b33e289af" gracePeriod=30 Apr 23 17:03:02.116505 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:02.116477 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj"] Apr 23 17:03:02.116889 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:02.116863 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" podUID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerName="main" containerID="cri-o://f71f07cf91bbd326d0898c36f36e868d8e7cbcdce887abd38cf7c846c454d4b0" gracePeriod=30 Apr 23 17:03:02.361474 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:02.361386 2571 generic.go:358] "Generic (PLEG): container finished" podID="83208b4d-bbd1-402e-98b5-cbfe523bcc53" containerID="541f12e95666d25098271f02a52ca34ea8b6270bf74563ba39dae4c611df2a4f" exitCode=0 Apr 23 17:03:02.361633 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:02.361461 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" event={"ID":"83208b4d-bbd1-402e-98b5-cbfe523bcc53","Type":"ContainerDied","Data":"541f12e95666d25098271f02a52ca34ea8b6270bf74563ba39dae4c611df2a4f"} Apr 23 17:03:02.919314 ip-10-0-136-27 kubenswrapper[2571]: W0423 17:03:02.919285 2571 logging.go:55] [core] [Channel #535 SubChannel #536]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.47:9003", ServerName: "10.134.0.47:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.47:9003: connect: connection refused" Apr 23 17:03:03.356681 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.356662 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:03:03.365913 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.365886 2571 generic.go:358] "Generic (PLEG): container finished" podID="83208b4d-bbd1-402e-98b5-cbfe523bcc53" containerID="ecd5cc93ace5c304de48cbfe8d0a6e469e07fbc6096453575381b24b33e289af" exitCode=0 Apr 23 17:03:03.366045 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.365943 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" Apr 23 17:03:03.366045 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.366018 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" event={"ID":"83208b4d-bbd1-402e-98b5-cbfe523bcc53","Type":"ContainerDied","Data":"ecd5cc93ace5c304de48cbfe8d0a6e469e07fbc6096453575381b24b33e289af"} Apr 23 17:03:03.366124 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.366065 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" event={"ID":"83208b4d-bbd1-402e-98b5-cbfe523bcc53","Type":"ContainerDied","Data":"51764889bb4a513b86372c4c5c19f33e58edc8dca21cc945c8f59bec493b5c14"} Apr 23 17:03:03.366124 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.366086 2571 scope.go:117] "RemoveContainer" containerID="ecd5cc93ace5c304de48cbfe8d0a6e469e07fbc6096453575381b24b33e289af" Apr 23 17:03:03.374408 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.374378 2571 scope.go:117] "RemoveContainer" containerID="541f12e95666d25098271f02a52ca34ea8b6270bf74563ba39dae4c611df2a4f" Apr 23 17:03:03.383309 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.383292 2571 scope.go:117] "RemoveContainer" containerID="f24cb90d3a6b81bfed6fa3056d881814a591cedab541aba1c4295d0a702a8044" Apr 23 
17:03:03.390844 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.390827 2571 scope.go:117] "RemoveContainer" containerID="ecd5cc93ace5c304de48cbfe8d0a6e469e07fbc6096453575381b24b33e289af" Apr 23 17:03:03.391137 ip-10-0-136-27 kubenswrapper[2571]: E0423 17:03:03.391119 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecd5cc93ace5c304de48cbfe8d0a6e469e07fbc6096453575381b24b33e289af\": container with ID starting with ecd5cc93ace5c304de48cbfe8d0a6e469e07fbc6096453575381b24b33e289af not found: ID does not exist" containerID="ecd5cc93ace5c304de48cbfe8d0a6e469e07fbc6096453575381b24b33e289af" Apr 23 17:03:03.391185 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.391145 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecd5cc93ace5c304de48cbfe8d0a6e469e07fbc6096453575381b24b33e289af"} err="failed to get container status \"ecd5cc93ace5c304de48cbfe8d0a6e469e07fbc6096453575381b24b33e289af\": rpc error: code = NotFound desc = could not find container \"ecd5cc93ace5c304de48cbfe8d0a6e469e07fbc6096453575381b24b33e289af\": container with ID starting with ecd5cc93ace5c304de48cbfe8d0a6e469e07fbc6096453575381b24b33e289af not found: ID does not exist" Apr 23 17:03:03.391185 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.391163 2571 scope.go:117] "RemoveContainer" containerID="541f12e95666d25098271f02a52ca34ea8b6270bf74563ba39dae4c611df2a4f" Apr 23 17:03:03.391383 ip-10-0-136-27 kubenswrapper[2571]: E0423 17:03:03.391370 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"541f12e95666d25098271f02a52ca34ea8b6270bf74563ba39dae4c611df2a4f\": container with ID starting with 541f12e95666d25098271f02a52ca34ea8b6270bf74563ba39dae4c611df2a4f not found: ID does not exist" containerID="541f12e95666d25098271f02a52ca34ea8b6270bf74563ba39dae4c611df2a4f" Apr 23 17:03:03.391432 
ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.391388 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"541f12e95666d25098271f02a52ca34ea8b6270bf74563ba39dae4c611df2a4f"} err="failed to get container status \"541f12e95666d25098271f02a52ca34ea8b6270bf74563ba39dae4c611df2a4f\": rpc error: code = NotFound desc = could not find container \"541f12e95666d25098271f02a52ca34ea8b6270bf74563ba39dae4c611df2a4f\": container with ID starting with 541f12e95666d25098271f02a52ca34ea8b6270bf74563ba39dae4c611df2a4f not found: ID does not exist" Apr 23 17:03:03.391432 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.391402 2571 scope.go:117] "RemoveContainer" containerID="f24cb90d3a6b81bfed6fa3056d881814a591cedab541aba1c4295d0a702a8044" Apr 23 17:03:03.391635 ip-10-0-136-27 kubenswrapper[2571]: E0423 17:03:03.391618 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f24cb90d3a6b81bfed6fa3056d881814a591cedab541aba1c4295d0a702a8044\": container with ID starting with f24cb90d3a6b81bfed6fa3056d881814a591cedab541aba1c4295d0a702a8044 not found: ID does not exist" containerID="f24cb90d3a6b81bfed6fa3056d881814a591cedab541aba1c4295d0a702a8044" Apr 23 17:03:03.391677 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.391640 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f24cb90d3a6b81bfed6fa3056d881814a591cedab541aba1c4295d0a702a8044"} err="failed to get container status \"f24cb90d3a6b81bfed6fa3056d881814a591cedab541aba1c4295d0a702a8044\": rpc error: code = NotFound desc = could not find container \"f24cb90d3a6b81bfed6fa3056d881814a591cedab541aba1c4295d0a702a8044\": container with ID starting with f24cb90d3a6b81bfed6fa3056d881814a591cedab541aba1c4295d0a702a8044 not found: ID does not exist" Apr 23 17:03:03.447012 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.446987 2571 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tokenizer-tmp\") pod \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " Apr 23 17:03:03.447125 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.447052 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xr9v\" (UniqueName: \"kubernetes.io/projected/83208b4d-bbd1-402e-98b5-cbfe523bcc53-kube-api-access-9xr9v\") pod \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " Apr 23 17:03:03.447125 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.447107 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tls-certs\") pod \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " Apr 23 17:03:03.447243 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.447134 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tokenizer-uds\") pod \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " Apr 23 17:03:03.447243 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.447158 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-kserve-provision-location\") pod \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " Apr 23 17:03:03.447243 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.447199 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tokenizer-cache\") pod \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\" (UID: \"83208b4d-bbd1-402e-98b5-cbfe523bcc53\") " Apr 23 17:03:03.447389 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.447284 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "83208b4d-bbd1-402e-98b5-cbfe523bcc53" (UID: "83208b4d-bbd1-402e-98b5-cbfe523bcc53"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:03:03.447512 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.447441 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "83208b4d-bbd1-402e-98b5-cbfe523bcc53" (UID: "83208b4d-bbd1-402e-98b5-cbfe523bcc53"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:03:03.447512 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.447480 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tokenizer-tmp\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:03:03.447611 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.447480 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "83208b4d-bbd1-402e-98b5-cbfe523bcc53" (UID: "83208b4d-bbd1-402e-98b5-cbfe523bcc53"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:03:03.447902 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.447882 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "83208b4d-bbd1-402e-98b5-cbfe523bcc53" (UID: "83208b4d-bbd1-402e-98b5-cbfe523bcc53"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:03:03.449288 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.449270 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83208b4d-bbd1-402e-98b5-cbfe523bcc53-kube-api-access-9xr9v" (OuterVolumeSpecName: "kube-api-access-9xr9v") pod "83208b4d-bbd1-402e-98b5-cbfe523bcc53" (UID: "83208b4d-bbd1-402e-98b5-cbfe523bcc53"). InnerVolumeSpecName "kube-api-access-9xr9v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:03:03.449372 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.449299 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "83208b4d-bbd1-402e-98b5-cbfe523bcc53" (UID: "83208b4d-bbd1-402e-98b5-cbfe523bcc53"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:03:03.548171 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.548141 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tokenizer-cache\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:03:03.548171 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.548167 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9xr9v\" (UniqueName: \"kubernetes.io/projected/83208b4d-bbd1-402e-98b5-cbfe523bcc53-kube-api-access-9xr9v\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:03:03.548328 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.548179 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tls-certs\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:03:03.548328 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.548190 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-tokenizer-uds\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:03:03.548328 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.548198 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83208b4d-bbd1-402e-98b5-cbfe523bcc53-kserve-provision-location\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:03:03.684921 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.684889 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4"] Apr 23 17:03:03.688345 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.688321 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4"] Apr 23 17:03:03.919526 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:03.919436 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6664dhgvw4" podUID="83208b4d-bbd1-402e-98b5-cbfe523bcc53" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.47:9003\" within 1s: context deadline exceeded" Apr 23 17:03:04.970718 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.970668 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp"] Apr 23 17:03:04.971132 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971068 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4720d42-33f3-4de6-afc0-553e3f79c727" containerName="tokenizer" Apr 23 17:03:04.971132 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971086 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4720d42-33f3-4de6-afc0-553e3f79c727" containerName="tokenizer" Apr 23 17:03:04.971132 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971105 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83208b4d-bbd1-402e-98b5-cbfe523bcc53" containerName="tokenizer" Apr 23 17:03:04.971132 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971113 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="83208b4d-bbd1-402e-98b5-cbfe523bcc53" containerName="tokenizer" Apr 23 17:03:04.971132 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971119 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="284e7c02-9985-4226-b5a4-02015d44ebc3" containerName="storage-initializer" Apr 23 17:03:04.971132 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971125 2571 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="284e7c02-9985-4226-b5a4-02015d44ebc3" containerName="storage-initializer" Apr 23 17:03:04.971132 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971131 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="284e7c02-9985-4226-b5a4-02015d44ebc3" containerName="main" Apr 23 17:03:04.971132 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971136 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="284e7c02-9985-4226-b5a4-02015d44ebc3" containerName="main" Apr 23 17:03:04.971448 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971155 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83208b4d-bbd1-402e-98b5-cbfe523bcc53" containerName="main" Apr 23 17:03:04.971448 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971160 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="83208b4d-bbd1-402e-98b5-cbfe523bcc53" containerName="main" Apr 23 17:03:04.971448 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971166 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4720d42-33f3-4de6-afc0-553e3f79c727" containerName="storage-initializer" Apr 23 17:03:04.971448 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971170 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4720d42-33f3-4de6-afc0-553e3f79c727" containerName="storage-initializer" Apr 23 17:03:04.971448 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971177 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4720d42-33f3-4de6-afc0-553e3f79c727" containerName="main" Apr 23 17:03:04.971448 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971182 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4720d42-33f3-4de6-afc0-553e3f79c727" containerName="main" Apr 23 17:03:04.971448 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971193 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="83208b4d-bbd1-402e-98b5-cbfe523bcc53" containerName="storage-initializer" Apr 23 17:03:04.971448 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971198 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="83208b4d-bbd1-402e-98b5-cbfe523bcc53" containerName="storage-initializer" Apr 23 17:03:04.971448 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971249 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="83208b4d-bbd1-402e-98b5-cbfe523bcc53" containerName="main" Apr 23 17:03:04.971448 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971257 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d4720d42-33f3-4de6-afc0-553e3f79c727" containerName="tokenizer" Apr 23 17:03:04.971448 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971263 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d4720d42-33f3-4de6-afc0-553e3f79c727" containerName="main" Apr 23 17:03:04.971448 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971270 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="83208b4d-bbd1-402e-98b5-cbfe523bcc53" containerName="tokenizer" Apr 23 17:03:04.971448 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.971277 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="284e7c02-9985-4226-b5a4-02015d44ebc3" containerName="main" Apr 23 17:03:04.975832 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.975807 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" Apr 23 17:03:04.978585 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.978559 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 23 17:03:04.989089 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:04.989064 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp"] Apr 23 17:03:05.061888 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.061855 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/217a393c-a2a8-4637-ab52-c6f0023f77e1-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" Apr 23 17:03:05.062025 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.061991 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" Apr 23 17:03:05.062070 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.062029 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" 
Apr 23 17:03:05.062106 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.062073 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kc9r\" (UniqueName: \"kubernetes.io/projected/217a393c-a2a8-4637-ab52-c6f0023f77e1-kube-api-access-9kc9r\") pod \"router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" Apr 23 17:03:05.062190 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.062161 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-home\") pod \"router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" Apr 23 17:03:05.062234 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.062206 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" Apr 23 17:03:05.163794 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.163748 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kc9r\" (UniqueName: \"kubernetes.io/projected/217a393c-a2a8-4637-ab52-c6f0023f77e1-kube-api-access-9kc9r\") pod \"router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" Apr 23 17:03:05.163985 
ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.163829 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-home\") pod \"router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" Apr 23 17:03:05.163985 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.163860 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" Apr 23 17:03:05.163985 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.163895 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/217a393c-a2a8-4637-ab52-c6f0023f77e1-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" Apr 23 17:03:05.164152 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.164011 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" Apr 23 17:03:05.164152 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.164040 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" Apr 23 17:03:05.164282 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.164258 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-home\") pod \"router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" Apr 23 17:03:05.164368 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.164327 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" Apr 23 17:03:05.164368 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.164360 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" Apr 23 17:03:05.167051 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.167024 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-dshm\") pod 
\"router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp"
Apr 23 17:03:05.167324 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.167299 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/217a393c-a2a8-4637-ab52-c6f0023f77e1-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp"
Apr 23 17:03:05.171436 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.171414 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kc9r\" (UniqueName: \"kubernetes.io/projected/217a393c-a2a8-4637-ab52-c6f0023f77e1-kube-api-access-9kc9r\") pod \"router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp"
Apr 23 17:03:05.290404 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.290321 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp"
Apr 23 17:03:05.422102 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.422074 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp"]
Apr 23 17:03:05.423965 ip-10-0-136-27 kubenswrapper[2571]: W0423 17:03:05.423927 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod217a393c_a2a8_4637_ab52_c6f0023f77e1.slice/crio-6652e0d8d68d0df14b9dd3ecc651d6a46851479f7f37401cd6b69433820c8051 WatchSource:0}: Error finding container 6652e0d8d68d0df14b9dd3ecc651d6a46851479f7f37401cd6b69433820c8051: Status 404 returned error can't find the container with id 6652e0d8d68d0df14b9dd3ecc651d6a46851479f7f37401cd6b69433820c8051
Apr 23 17:03:05.511954 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:05.511924 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83208b4d-bbd1-402e-98b5-cbfe523bcc53" path="/var/lib/kubelet/pods/83208b4d-bbd1-402e-98b5-cbfe523bcc53/volumes"
Apr 23 17:03:06.380208 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:06.380165 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" event={"ID":"217a393c-a2a8-4637-ab52-c6f0023f77e1","Type":"ContainerStarted","Data":"665acc176ec5402913b1f5cdcf7d2c25a8fe3312c1bf5563b6b7c01eaa78e011"}
Apr 23 17:03:06.380208 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:06.380205 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" event={"ID":"217a393c-a2a8-4637-ab52-c6f0023f77e1","Type":"ContainerStarted","Data":"6652e0d8d68d0df14b9dd3ecc651d6a46851479f7f37401cd6b69433820c8051"}
Apr 23 17:03:10.398229 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:10.398196 2571 generic.go:358] "Generic (PLEG): container finished" podID="217a393c-a2a8-4637-ab52-c6f0023f77e1" containerID="665acc176ec5402913b1f5cdcf7d2c25a8fe3312c1bf5563b6b7c01eaa78e011" exitCode=0
Apr 23 17:03:10.398646 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:10.398272 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" event={"ID":"217a393c-a2a8-4637-ab52-c6f0023f77e1","Type":"ContainerDied","Data":"665acc176ec5402913b1f5cdcf7d2c25a8fe3312c1bf5563b6b7c01eaa78e011"}
Apr 23 17:03:11.404366 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:11.404332 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" event={"ID":"217a393c-a2a8-4637-ab52-c6f0023f77e1","Type":"ContainerStarted","Data":"59adf7afcf34c4d460624f8e5d4c94a8d9ba4158d9c91a4ac5d09944d8d1ea66"}
Apr 23 17:03:11.425771 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:11.425687 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" podStartSLOduration=7.425666887 podStartE2EDuration="7.425666887s" podCreationTimestamp="2026-04-23 17:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:03:11.423715061 +0000 UTC m=+1674.505157626" watchObservedRunningTime="2026-04-23 17:03:11.425666887 +0000 UTC m=+1674.507109457"
Apr 23 17:03:15.291040 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:15.291005 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp"
Apr 23 17:03:15.291040 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:15.291049 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp"
Apr 23 17:03:15.292562 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:15.292524 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" podUID="217a393c-a2a8-4637-ab52-c6f0023f77e1" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 23 17:03:25.291113 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:25.291072 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" podUID="217a393c-a2a8-4637-ab52-c6f0023f77e1" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 23 17:03:32.117037 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.116972 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" podUID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerName="llm-d-routing-sidecar" containerID="cri-o://9b4f1cbcfc5180058093f97019b0ab1774ccadcb97d97f56ec36d887a1ee4f3a" gracePeriod=2
Apr 23 17:03:32.384723 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.384682 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj_e146ba56-294d-4f3e-98aa-de03e20906dd/main/0.log"
Apr 23 17:03:32.385451 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.385432 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj"
Apr 23 17:03:32.434050 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.434010 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-dshm\") pod \"e146ba56-294d-4f3e-98aa-de03e20906dd\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") "
Apr 23 17:03:32.434264 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.434057 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-model-cache\") pod \"e146ba56-294d-4f3e-98aa-de03e20906dd\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") "
Apr 23 17:03:32.434264 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.434089 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt5cc\" (UniqueName: \"kubernetes.io/projected/e146ba56-294d-4f3e-98aa-de03e20906dd-kube-api-access-rt5cc\") pod \"e146ba56-294d-4f3e-98aa-de03e20906dd\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") "
Apr 23 17:03:32.434264 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.434126 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-kserve-provision-location\") pod \"e146ba56-294d-4f3e-98aa-de03e20906dd\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") "
Apr 23 17:03:32.434264 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.434162 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e146ba56-294d-4f3e-98aa-de03e20906dd-tls-certs\") pod \"e146ba56-294d-4f3e-98aa-de03e20906dd\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") "
Apr 23 17:03:32.434264 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.434209 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-home\") pod \"e146ba56-294d-4f3e-98aa-de03e20906dd\" (UID: \"e146ba56-294d-4f3e-98aa-de03e20906dd\") "
Apr 23 17:03:32.434527 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.434363 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-model-cache" (OuterVolumeSpecName: "model-cache") pod "e146ba56-294d-4f3e-98aa-de03e20906dd" (UID: "e146ba56-294d-4f3e-98aa-de03e20906dd"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:03:32.434629 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.434532 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-model-cache\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\""
Apr 23 17:03:32.434733 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.434679 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-home" (OuterVolumeSpecName: "home") pod "e146ba56-294d-4f3e-98aa-de03e20906dd" (UID: "e146ba56-294d-4f3e-98aa-de03e20906dd"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:03:32.436561 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.436535 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-dshm" (OuterVolumeSpecName: "dshm") pod "e146ba56-294d-4f3e-98aa-de03e20906dd" (UID: "e146ba56-294d-4f3e-98aa-de03e20906dd"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:03:32.436561 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.436540 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e146ba56-294d-4f3e-98aa-de03e20906dd-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e146ba56-294d-4f3e-98aa-de03e20906dd" (UID: "e146ba56-294d-4f3e-98aa-de03e20906dd"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:03:32.436849 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.436664 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e146ba56-294d-4f3e-98aa-de03e20906dd-kube-api-access-rt5cc" (OuterVolumeSpecName: "kube-api-access-rt5cc") pod "e146ba56-294d-4f3e-98aa-de03e20906dd" (UID: "e146ba56-294d-4f3e-98aa-de03e20906dd"). InnerVolumeSpecName "kube-api-access-rt5cc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 17:03:32.487858 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.487826 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj_e146ba56-294d-4f3e-98aa-de03e20906dd/main/0.log"
Apr 23 17:03:32.488488 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.488465 2571 generic.go:358] "Generic (PLEG): container finished" podID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerID="f71f07cf91bbd326d0898c36f36e868d8e7cbcdce887abd38cf7c846c454d4b0" exitCode=137
Apr 23 17:03:32.488488 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.488487 2571 generic.go:358] "Generic (PLEG): container finished" podID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerID="9b4f1cbcfc5180058093f97019b0ab1774ccadcb97d97f56ec36d887a1ee4f3a" exitCode=0
Apr 23 17:03:32.488625 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.488536 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" event={"ID":"e146ba56-294d-4f3e-98aa-de03e20906dd","Type":"ContainerDied","Data":"f71f07cf91bbd326d0898c36f36e868d8e7cbcdce887abd38cf7c846c454d4b0"}
Apr 23 17:03:32.488625 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.488568 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" event={"ID":"e146ba56-294d-4f3e-98aa-de03e20906dd","Type":"ContainerDied","Data":"9b4f1cbcfc5180058093f97019b0ab1774ccadcb97d97f56ec36d887a1ee4f3a"}
Apr 23 17:03:32.488625 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.488582 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj" event={"ID":"e146ba56-294d-4f3e-98aa-de03e20906dd","Type":"ContainerDied","Data":"cafadcf3a308036ee501addf26ebe43baf7317168f0898b5972236800675d27b"}
Apr 23 17:03:32.488625 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.488600 2571 scope.go:117] "RemoveContainer" containerID="f71f07cf91bbd326d0898c36f36e868d8e7cbcdce887abd38cf7c846c454d4b0"
Apr 23 17:03:32.488625 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.488608 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj"
Apr 23 17:03:32.489180 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.489149 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e146ba56-294d-4f3e-98aa-de03e20906dd" (UID: "e146ba56-294d-4f3e-98aa-de03e20906dd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:03:32.514955 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.514935 2571 scope.go:117] "RemoveContainer" containerID="2e9125aafaac1ed12fe134e8926ccf5045a21a472b79f4e6fba77175e343d9b1"
Apr 23 17:03:32.535418 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.535373 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e146ba56-294d-4f3e-98aa-de03e20906dd-tls-certs\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\""
Apr 23 17:03:32.535566 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.535523 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-home\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\""
Apr 23 17:03:32.535566 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.535545 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-dshm\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\""
Apr 23 17:03:32.535566 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.535555 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rt5cc\" (UniqueName: \"kubernetes.io/projected/e146ba56-294d-4f3e-98aa-de03e20906dd-kube-api-access-rt5cc\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\""
Apr 23 17:03:32.535566 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.535565 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e146ba56-294d-4f3e-98aa-de03e20906dd-kserve-provision-location\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\""
Apr 23 17:03:32.574441 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.574423 2571 scope.go:117] "RemoveContainer" containerID="9b4f1cbcfc5180058093f97019b0ab1774ccadcb97d97f56ec36d887a1ee4f3a"
Apr 23 17:03:32.584684 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.584663 2571 scope.go:117] "RemoveContainer" containerID="f71f07cf91bbd326d0898c36f36e868d8e7cbcdce887abd38cf7c846c454d4b0"
Apr 23 17:03:32.585016 ip-10-0-136-27 kubenswrapper[2571]: E0423 17:03:32.584992 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f71f07cf91bbd326d0898c36f36e868d8e7cbcdce887abd38cf7c846c454d4b0\": container with ID starting with f71f07cf91bbd326d0898c36f36e868d8e7cbcdce887abd38cf7c846c454d4b0 not found: ID does not exist" containerID="f71f07cf91bbd326d0898c36f36e868d8e7cbcdce887abd38cf7c846c454d4b0"
Apr 23 17:03:32.585073 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.585025 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f71f07cf91bbd326d0898c36f36e868d8e7cbcdce887abd38cf7c846c454d4b0"} err="failed to get container status \"f71f07cf91bbd326d0898c36f36e868d8e7cbcdce887abd38cf7c846c454d4b0\": rpc error: code = NotFound desc = could not find container \"f71f07cf91bbd326d0898c36f36e868d8e7cbcdce887abd38cf7c846c454d4b0\": container with ID starting with f71f07cf91bbd326d0898c36f36e868d8e7cbcdce887abd38cf7c846c454d4b0 not found: ID does not exist"
Apr 23 17:03:32.585073 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.585052 2571 scope.go:117] "RemoveContainer" containerID="2e9125aafaac1ed12fe134e8926ccf5045a21a472b79f4e6fba77175e343d9b1"
Apr 23 17:03:32.585286 ip-10-0-136-27 kubenswrapper[2571]: E0423 17:03:32.585269 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e9125aafaac1ed12fe134e8926ccf5045a21a472b79f4e6fba77175e343d9b1\": container with ID starting with 2e9125aafaac1ed12fe134e8926ccf5045a21a472b79f4e6fba77175e343d9b1 not found: ID does not exist" containerID="2e9125aafaac1ed12fe134e8926ccf5045a21a472b79f4e6fba77175e343d9b1"
Apr 23 17:03:32.585343 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.585291 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e9125aafaac1ed12fe134e8926ccf5045a21a472b79f4e6fba77175e343d9b1"} err="failed to get container status \"2e9125aafaac1ed12fe134e8926ccf5045a21a472b79f4e6fba77175e343d9b1\": rpc error: code = NotFound desc = could not find container \"2e9125aafaac1ed12fe134e8926ccf5045a21a472b79f4e6fba77175e343d9b1\": container with ID starting with 2e9125aafaac1ed12fe134e8926ccf5045a21a472b79f4e6fba77175e343d9b1 not found: ID does not exist"
Apr 23 17:03:32.585343 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.585307 2571 scope.go:117] "RemoveContainer" containerID="9b4f1cbcfc5180058093f97019b0ab1774ccadcb97d97f56ec36d887a1ee4f3a"
Apr 23 17:03:32.585515 ip-10-0-136-27 kubenswrapper[2571]: E0423 17:03:32.585500 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b4f1cbcfc5180058093f97019b0ab1774ccadcb97d97f56ec36d887a1ee4f3a\": container with ID starting with 9b4f1cbcfc5180058093f97019b0ab1774ccadcb97d97f56ec36d887a1ee4f3a not found: ID does not exist" containerID="9b4f1cbcfc5180058093f97019b0ab1774ccadcb97d97f56ec36d887a1ee4f3a"
Apr 23 17:03:32.585554 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.585518 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b4f1cbcfc5180058093f97019b0ab1774ccadcb97d97f56ec36d887a1ee4f3a"} err="failed to get container status \"9b4f1cbcfc5180058093f97019b0ab1774ccadcb97d97f56ec36d887a1ee4f3a\": rpc error: code = NotFound desc = could not find container \"9b4f1cbcfc5180058093f97019b0ab1774ccadcb97d97f56ec36d887a1ee4f3a\": container with ID starting with 9b4f1cbcfc5180058093f97019b0ab1774ccadcb97d97f56ec36d887a1ee4f3a not found: ID does not exist"
Apr 23 17:03:32.585554 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.585530 2571 scope.go:117] "RemoveContainer" containerID="f71f07cf91bbd326d0898c36f36e868d8e7cbcdce887abd38cf7c846c454d4b0"
Apr 23 17:03:32.585784 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.585764 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f71f07cf91bbd326d0898c36f36e868d8e7cbcdce887abd38cf7c846c454d4b0"} err="failed to get container status \"f71f07cf91bbd326d0898c36f36e868d8e7cbcdce887abd38cf7c846c454d4b0\": rpc error: code = NotFound desc = could not find container \"f71f07cf91bbd326d0898c36f36e868d8e7cbcdce887abd38cf7c846c454d4b0\": container with ID starting with f71f07cf91bbd326d0898c36f36e868d8e7cbcdce887abd38cf7c846c454d4b0 not found: ID does not exist"
Apr 23 17:03:32.585784 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.585784 2571 scope.go:117] "RemoveContainer" containerID="2e9125aafaac1ed12fe134e8926ccf5045a21a472b79f4e6fba77175e343d9b1"
Apr 23 17:03:32.586010 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.585996 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e9125aafaac1ed12fe134e8926ccf5045a21a472b79f4e6fba77175e343d9b1"} err="failed to get container status \"2e9125aafaac1ed12fe134e8926ccf5045a21a472b79f4e6fba77175e343d9b1\": rpc error: code = NotFound desc = could not find container \"2e9125aafaac1ed12fe134e8926ccf5045a21a472b79f4e6fba77175e343d9b1\": container with ID starting with 2e9125aafaac1ed12fe134e8926ccf5045a21a472b79f4e6fba77175e343d9b1 not found: ID does not exist"
Apr 23 17:03:32.586053 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.586010 2571 scope.go:117] "RemoveContainer" containerID="9b4f1cbcfc5180058093f97019b0ab1774ccadcb97d97f56ec36d887a1ee4f3a"
Apr 23 17:03:32.586279 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.586263 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b4f1cbcfc5180058093f97019b0ab1774ccadcb97d97f56ec36d887a1ee4f3a"} err="failed to get container status \"9b4f1cbcfc5180058093f97019b0ab1774ccadcb97d97f56ec36d887a1ee4f3a\": rpc error: code = NotFound desc = could not find container \"9b4f1cbcfc5180058093f97019b0ab1774ccadcb97d97f56ec36d887a1ee4f3a\": container with ID starting with 9b4f1cbcfc5180058093f97019b0ab1774ccadcb97d97f56ec36d887a1ee4f3a not found: ID does not exist"
Apr 23 17:03:32.818114 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.818086 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj"]
Apr 23 17:03:32.823402 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:32.823376 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-7884cf8f7d-7rmqj"]
Apr 23 17:03:33.511519 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:33.511484 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e146ba56-294d-4f3e-98aa-de03e20906dd" path="/var/lib/kubelet/pods/e146ba56-294d-4f3e-98aa-de03e20906dd/volumes"
Apr 23 17:03:35.291231 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:35.291190 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" podUID="217a393c-a2a8-4637-ab52-c6f0023f77e1" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 23 17:03:45.291493 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:45.291441 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" podUID="217a393c-a2a8-4637-ab52-c6f0023f77e1" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 23 17:03:55.291357 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:03:55.291271 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" podUID="217a393c-a2a8-4637-ab52-c6f0023f77e1" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 23 17:04:05.291076 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:05.291033 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" podUID="217a393c-a2a8-4637-ab52-c6f0023f77e1" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 23 17:04:15.290963 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:15.290917 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" podUID="217a393c-a2a8-4637-ab52-c6f0023f77e1" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 23 17:04:25.291546 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:25.291500 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" podUID="217a393c-a2a8-4637-ab52-c6f0023f77e1" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 23 17:04:35.291589 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:35.291544 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" podUID="217a393c-a2a8-4637-ab52-c6f0023f77e1" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 23 17:04:45.300472 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:45.300442 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp"
Apr 23 17:04:45.308038 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:45.308009 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp"
Apr 23 17:04:56.167375 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:56.167343 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp"]
Apr 23 17:04:56.167879 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:56.167793 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" podUID="217a393c-a2a8-4637-ab52-c6f0023f77e1" containerName="main" containerID="cri-o://59adf7afcf34c4d460624f8e5d4c94a8d9ba4158d9c91a4ac5d09944d8d1ea66" gracePeriod=30
Apr 23 17:04:57.248275 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.248242 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wjw6m/must-gather-k64cm"]
Apr 23 17:04:57.248781 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.248635 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerName="llm-d-routing-sidecar"
Apr 23 17:04:57.248781 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.248646 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerName="llm-d-routing-sidecar"
Apr 23 17:04:57.248781 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.248659 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerName="storage-initializer"
Apr 23 17:04:57.248781 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.248664 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerName="storage-initializer"
Apr 23 17:04:57.248781 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.248670 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerName="main"
Apr 23 17:04:57.248781 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.248675 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerName="main"
Apr 23 17:04:57.249070 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.248796 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerName="llm-d-routing-sidecar"
Apr 23 17:04:57.249070 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.248806 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e146ba56-294d-4f3e-98aa-de03e20906dd" containerName="main"
Apr 23 17:04:57.252127 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.252112 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wjw6m/must-gather-k64cm"
Apr 23 17:04:57.255079 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.255055 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wjw6m\"/\"kube-root-ca.crt\""
Apr 23 17:04:57.255215 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.255097 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wjw6m\"/\"openshift-service-ca.crt\""
Apr 23 17:04:57.255215 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.255157 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wjw6m\"/\"default-dockercfg-jqj7s\""
Apr 23 17:04:57.261353 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.261324 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wjw6m/must-gather-k64cm"]
Apr 23 17:04:57.356713 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.356660 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvl6l\" (UniqueName: \"kubernetes.io/projected/3a58a085-7205-46f6-b099-d4e3087a0c18-kube-api-access-hvl6l\") pod \"must-gather-k64cm\" (UID: \"3a58a085-7205-46f6-b099-d4e3087a0c18\") " pod="openshift-must-gather-wjw6m/must-gather-k64cm"
Apr 23 17:04:57.356858 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.356734 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3a58a085-7205-46f6-b099-d4e3087a0c18-must-gather-output\") pod \"must-gather-k64cm\" (UID: \"3a58a085-7205-46f6-b099-d4e3087a0c18\") " pod="openshift-must-gather-wjw6m/must-gather-k64cm"
Apr 23 17:04:57.457429 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.457400 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvl6l\" (UniqueName: \"kubernetes.io/projected/3a58a085-7205-46f6-b099-d4e3087a0c18-kube-api-access-hvl6l\") pod \"must-gather-k64cm\" (UID: \"3a58a085-7205-46f6-b099-d4e3087a0c18\") " pod="openshift-must-gather-wjw6m/must-gather-k64cm"
Apr 23 17:04:57.457577 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.457455 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3a58a085-7205-46f6-b099-d4e3087a0c18-must-gather-output\") pod \"must-gather-k64cm\" (UID: \"3a58a085-7205-46f6-b099-d4e3087a0c18\") " pod="openshift-must-gather-wjw6m/must-gather-k64cm"
Apr 23 17:04:57.457826 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.457807 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3a58a085-7205-46f6-b099-d4e3087a0c18-must-gather-output\") pod \"must-gather-k64cm\" (UID: \"3a58a085-7205-46f6-b099-d4e3087a0c18\") " pod="openshift-must-gather-wjw6m/must-gather-k64cm"
Apr 23 17:04:57.468871 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.468841 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvl6l\" (UniqueName: \"kubernetes.io/projected/3a58a085-7205-46f6-b099-d4e3087a0c18-kube-api-access-hvl6l\") pod \"must-gather-k64cm\" (UID: \"3a58a085-7205-46f6-b099-d4e3087a0c18\") " pod="openshift-must-gather-wjw6m/must-gather-k64cm"
Apr 23 17:04:57.562124 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.562050 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wjw6m/must-gather-k64cm"
Apr 23 17:04:57.706183 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.706157 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wjw6m/must-gather-k64cm"]
Apr 23 17:04:57.711556 ip-10-0-136-27 kubenswrapper[2571]: W0423 17:04:57.711514 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a58a085_7205_46f6_b099_d4e3087a0c18.slice/crio-c671170058c0961482cef49af3a6149d2b9a2bb302718e930a885c1bf4c8b754 WatchSource:0}: Error finding container c671170058c0961482cef49af3a6149d2b9a2bb302718e930a885c1bf4c8b754: Status 404 returned error can't find the container with id c671170058c0961482cef49af3a6149d2b9a2bb302718e930a885c1bf4c8b754
Apr 23 17:04:57.803430 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:04:57.803399 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wjw6m/must-gather-k64cm" event={"ID":"3a58a085-7205-46f6-b099-d4e3087a0c18","Type":"ContainerStarted","Data":"c671170058c0961482cef49af3a6149d2b9a2bb302718e930a885c1bf4c8b754"}
Apr 23 17:05:01.996723 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:01.996673 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:05:02.824209 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:02.824170 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wjw6m/must-gather-k64cm" event={"ID":"3a58a085-7205-46f6-b099-d4e3087a0c18","Type":"ContainerStarted","Data":"9e015ec8c44bbdc757508d6684e31245b7b767f617100d0aaf4adeccc6f66d30"}
Apr 23 17:05:02.824209 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:02.824213 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wjw6m/must-gather-k64cm" event={"ID":"3a58a085-7205-46f6-b099-d4e3087a0c18","Type":"ContainerStarted","Data":"5565c8d5445476034a3613e79afdc8343112cb5c4986d7ae78c2e5dccb22a97d"}
Apr 23 17:05:02.840681 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:02.840622 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wjw6m/must-gather-k64cm" podStartSLOduration=1.6776053549999999 podStartE2EDuration="5.840606575s" podCreationTimestamp="2026-04-23 17:04:57 +0000 UTC" firstStartedPulling="2026-04-23 17:04:57.713145014 +0000 UTC m=+1780.794587564" lastFinishedPulling="2026-04-23 17:05:01.876146223 +0000 UTC m=+1784.957588784" observedRunningTime="2026-04-23 17:05:02.839027294 +0000 UTC m=+1785.920469873" watchObservedRunningTime="2026-04-23 17:05:02.840606575 +0000 UTC m=+1785.922049145"
Apr 23 17:05:11.233539 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:11.233450 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wzxm7_44622e9d-844d-4a1a-b4cb-523054e59ce5/istio-proxy/0.log"
Apr 23 17:05:11.372266 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:11.372231 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/main/0.log"
Apr 23 17:05:11.383621 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:11.383596 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/storage-initializer/0.log"
Apr 23 17:05:12.321349 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:12.321317 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wzxm7_44622e9d-844d-4a1a-b4cb-523054e59ce5/istio-proxy/0.log"
Apr 23 17:05:12.441590 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:12.441565 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/main/0.log"
Apr 23 17:05:12.471712 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:12.471677 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/storage-initializer/0.log"
Apr 23 17:05:13.445487 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:13.445445 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wzxm7_44622e9d-844d-4a1a-b4cb-523054e59ce5/istio-proxy/0.log"
Apr 23 17:05:13.523753 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:13.523725 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/main/0.log"
Apr 23 17:05:13.532313 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:13.532291 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/storage-initializer/0.log"
Apr 23 17:05:14.444873 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:14.444838 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wzxm7_44622e9d-844d-4a1a-b4cb-523054e59ce5/istio-proxy/0.log"
Apr 23 17:05:14.527715 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:14.527677 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/main/0.log"
Apr 23 17:05:14.535539 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:14.535512 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/storage-initializer/0.log"
Apr 23 17:05:15.499440 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:15.499412 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wzxm7_44622e9d-844d-4a1a-b4cb-523054e59ce5/istio-proxy/0.log"
Apr 23 17:05:15.577388 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:15.577339 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/main/0.log"
Apr 23 17:05:15.584249 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:15.584223 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/storage-initializer/0.log"
Apr 23 17:05:16.489388 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:16.489341 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wzxm7_44622e9d-844d-4a1a-b4cb-523054e59ce5/istio-proxy/0.log"
Apr 23 17:05:16.565853 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:16.565820 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/main/0.log"
Apr 23 17:05:16.572827 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:16.572799 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/storage-initializer/0.log"
Apr 23 17:05:17.483197 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:17.483168 2571 log.go:25] "Finished parsing log file"
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wzxm7_44622e9d-844d-4a1a-b4cb-523054e59ce5/istio-proxy/0.log" Apr 23 17:05:17.539494 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:17.539466 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8sjkd_036b63b0-d570-44cc-b606-bb46f38e6753/console-operator/1.log" Apr 23 17:05:17.545073 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:17.545049 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8sjkd_036b63b0-d570-44cc-b606-bb46f38e6753/console-operator/1.log" Apr 23 17:05:17.565262 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:17.565231 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/main/0.log" Apr 23 17:05:17.572083 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:17.572056 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/storage-initializer/0.log" Apr 23 17:05:18.504620 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:18.504587 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wzxm7_44622e9d-844d-4a1a-b4cb-523054e59ce5/istio-proxy/0.log" Apr 23 17:05:18.582588 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:18.582553 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/main/0.log" Apr 23 17:05:18.590462 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:18.590429 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/storage-initializer/0.log" Apr 23 17:05:19.562140 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:19.562108 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wzxm7_44622e9d-844d-4a1a-b4cb-523054e59ce5/istio-proxy/0.log" Apr 23 17:05:19.641171 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:19.641141 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/main/0.log" Apr 23 17:05:19.650893 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:19.650869 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/storage-initializer/0.log" Apr 23 17:05:20.603197 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:20.603163 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wzxm7_44622e9d-844d-4a1a-b4cb-523054e59ce5/istio-proxy/0.log" Apr 23 17:05:20.692846 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:20.692809 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/main/0.log" Apr 23 17:05:20.699363 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:20.699330 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/storage-initializer/0.log" Apr 23 17:05:21.606875 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:21.606847 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wzxm7_44622e9d-844d-4a1a-b4cb-523054e59ce5/istio-proxy/0.log" Apr 23 17:05:21.690362 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:21.690330 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/main/0.log" Apr 23 17:05:21.697233 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:21.697204 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/storage-initializer/0.log" Apr 23 17:05:22.678570 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:22.678541 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wzxm7_44622e9d-844d-4a1a-b4cb-523054e59ce5/istio-proxy/0.log" Apr 23 17:05:22.758511 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:22.758469 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/main/0.log" Apr 23 17:05:22.765119 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:22.765090 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/storage-initializer/0.log" Apr 23 17:05:23.657862 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:23.657827 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wzxm7_44622e9d-844d-4a1a-b4cb-523054e59ce5/istio-proxy/0.log" Apr 23 17:05:23.733917 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:23.733887 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/main/0.log" Apr 23 17:05:23.741227 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:23.741186 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/storage-initializer/0.log" Apr 23 17:05:24.666495 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:24.666448 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wzxm7_44622e9d-844d-4a1a-b4cb-523054e59ce5/istio-proxy/0.log" Apr 23 17:05:24.747459 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:24.747424 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/main/0.log" Apr 23 17:05:24.754644 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:24.754622 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp_217a393c-a2a8-4637-ab52-c6f0023f77e1/storage-initializer/0.log" Apr 23 17:05:25.765578 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:25.765540 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d_59b49f8a-980e-4e40-a688-5df2c05a7ba3/istio-proxy/0.log" Apr 23 17:05:25.780093 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:25.780071 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5b5c5f8b9d-dcw7k_9a0573ec-1640-4a5f-a88d-42fbaeb67495/router/0.log" Apr 23 17:05:26.439553 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.439528 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" Apr 23 17:05:26.546001 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.545970 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-kserve-provision-location\") pod \"217a393c-a2a8-4637-ab52-c6f0023f77e1\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " Apr 23 17:05:26.546001 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.546010 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-model-cache\") pod \"217a393c-a2a8-4637-ab52-c6f0023f77e1\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " Apr 23 17:05:26.546266 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.546052 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-dshm\") pod \"217a393c-a2a8-4637-ab52-c6f0023f77e1\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " Apr 23 17:05:26.546266 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.546070 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kc9r\" (UniqueName: \"kubernetes.io/projected/217a393c-a2a8-4637-ab52-c6f0023f77e1-kube-api-access-9kc9r\") pod \"217a393c-a2a8-4637-ab52-c6f0023f77e1\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " Apr 23 17:05:26.546266 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.546182 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-home\") pod \"217a393c-a2a8-4637-ab52-c6f0023f77e1\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " Apr 23 17:05:26.546266 ip-10-0-136-27 
kubenswrapper[2571]: I0423 17:05:26.546215 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/217a393c-a2a8-4637-ab52-c6f0023f77e1-tls-certs\") pod \"217a393c-a2a8-4637-ab52-c6f0023f77e1\" (UID: \"217a393c-a2a8-4637-ab52-c6f0023f77e1\") " Apr 23 17:05:26.546484 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.546300 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-model-cache" (OuterVolumeSpecName: "model-cache") pod "217a393c-a2a8-4637-ab52-c6f0023f77e1" (UID: "217a393c-a2a8-4637-ab52-c6f0023f77e1"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:05:26.546560 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.546539 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-model-cache\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:05:26.546731 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.546653 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-home" (OuterVolumeSpecName: "home") pod "217a393c-a2a8-4637-ab52-c6f0023f77e1" (UID: "217a393c-a2a8-4637-ab52-c6f0023f77e1"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:05:26.548498 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.548470 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-dshm" (OuterVolumeSpecName: "dshm") pod "217a393c-a2a8-4637-ab52-c6f0023f77e1" (UID: "217a393c-a2a8-4637-ab52-c6f0023f77e1"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:05:26.548498 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.548480 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/217a393c-a2a8-4637-ab52-c6f0023f77e1-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "217a393c-a2a8-4637-ab52-c6f0023f77e1" (UID: "217a393c-a2a8-4637-ab52-c6f0023f77e1"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:05:26.548770 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.548741 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/217a393c-a2a8-4637-ab52-c6f0023f77e1-kube-api-access-9kc9r" (OuterVolumeSpecName: "kube-api-access-9kc9r") pod "217a393c-a2a8-4637-ab52-c6f0023f77e1" (UID: "217a393c-a2a8-4637-ab52-c6f0023f77e1"). InnerVolumeSpecName "kube-api-access-9kc9r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:05:26.606167 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.606107 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "217a393c-a2a8-4637-ab52-c6f0023f77e1" (UID: "217a393c-a2a8-4637-ab52-c6f0023f77e1"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:05:26.647819 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.647787 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-dshm\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:05:26.647819 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.647817 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9kc9r\" (UniqueName: \"kubernetes.io/projected/217a393c-a2a8-4637-ab52-c6f0023f77e1-kube-api-access-9kc9r\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:05:26.648052 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.647828 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-home\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:05:26.648052 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.647836 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/217a393c-a2a8-4637-ab52-c6f0023f77e1-tls-certs\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:05:26.648052 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.647845 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/217a393c-a2a8-4637-ab52-c6f0023f77e1-kserve-provision-location\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\"" Apr 23 17:05:26.655281 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.655254 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d_59b49f8a-980e-4e40-a688-5df2c05a7ba3/istio-proxy/0.log" Apr 23 17:05:26.671641 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.671605 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-5b5c5f8b9d-dcw7k_9a0573ec-1640-4a5f-a88d-42fbaeb67495/router/0.log" Apr 23 17:05:26.914171 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.914133 2571 generic.go:358] "Generic (PLEG): container finished" podID="217a393c-a2a8-4637-ab52-c6f0023f77e1" containerID="59adf7afcf34c4d460624f8e5d4c94a8d9ba4158d9c91a4ac5d09944d8d1ea66" exitCode=137 Apr 23 17:05:26.914736 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.914243 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" Apr 23 17:05:26.914945 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.914265 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" event={"ID":"217a393c-a2a8-4637-ab52-c6f0023f77e1","Type":"ContainerDied","Data":"59adf7afcf34c4d460624f8e5d4c94a8d9ba4158d9c91a4ac5d09944d8d1ea66"} Apr 23 17:05:26.915082 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.915062 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp" event={"ID":"217a393c-a2a8-4637-ab52-c6f0023f77e1","Type":"ContainerDied","Data":"6652e0d8d68d0df14b9dd3ecc651d6a46851479f7f37401cd6b69433820c8051"} Apr 23 17:05:26.915193 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.915179 2571 scope.go:117] "RemoveContainer" containerID="59adf7afcf34c4d460624f8e5d4c94a8d9ba4158d9c91a4ac5d09944d8d1ea66" Apr 23 17:05:26.939552 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.939519 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp"] Apr 23 17:05:26.942616 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.942598 2571 scope.go:117] "RemoveContainer" containerID="665acc176ec5402913b1f5cdcf7d2c25a8fe3312c1bf5563b6b7c01eaa78e011" Apr 23 
17:05:26.943022 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:26.942995 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7bf8c6f89f-vsrcp"] Apr 23 17:05:27.023011 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:27.022980 2571 scope.go:117] "RemoveContainer" containerID="59adf7afcf34c4d460624f8e5d4c94a8d9ba4158d9c91a4ac5d09944d8d1ea66" Apr 23 17:05:27.023443 ip-10-0-136-27 kubenswrapper[2571]: E0423 17:05:27.023410 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59adf7afcf34c4d460624f8e5d4c94a8d9ba4158d9c91a4ac5d09944d8d1ea66\": container with ID starting with 59adf7afcf34c4d460624f8e5d4c94a8d9ba4158d9c91a4ac5d09944d8d1ea66 not found: ID does not exist" containerID="59adf7afcf34c4d460624f8e5d4c94a8d9ba4158d9c91a4ac5d09944d8d1ea66" Apr 23 17:05:27.023564 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:27.023458 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59adf7afcf34c4d460624f8e5d4c94a8d9ba4158d9c91a4ac5d09944d8d1ea66"} err="failed to get container status \"59adf7afcf34c4d460624f8e5d4c94a8d9ba4158d9c91a4ac5d09944d8d1ea66\": rpc error: code = NotFound desc = could not find container \"59adf7afcf34c4d460624f8e5d4c94a8d9ba4158d9c91a4ac5d09944d8d1ea66\": container with ID starting with 59adf7afcf34c4d460624f8e5d4c94a8d9ba4158d9c91a4ac5d09944d8d1ea66 not found: ID does not exist" Apr 23 17:05:27.023564 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:27.023520 2571 scope.go:117] "RemoveContainer" containerID="665acc176ec5402913b1f5cdcf7d2c25a8fe3312c1bf5563b6b7c01eaa78e011" Apr 23 17:05:27.023971 ip-10-0-136-27 kubenswrapper[2571]: E0423 17:05:27.023937 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"665acc176ec5402913b1f5cdcf7d2c25a8fe3312c1bf5563b6b7c01eaa78e011\": container with ID 
starting with 665acc176ec5402913b1f5cdcf7d2c25a8fe3312c1bf5563b6b7c01eaa78e011 not found: ID does not exist" containerID="665acc176ec5402913b1f5cdcf7d2c25a8fe3312c1bf5563b6b7c01eaa78e011" Apr 23 17:05:27.024066 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:27.023981 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"665acc176ec5402913b1f5cdcf7d2c25a8fe3312c1bf5563b6b7c01eaa78e011"} err="failed to get container status \"665acc176ec5402913b1f5cdcf7d2c25a8fe3312c1bf5563b6b7c01eaa78e011\": rpc error: code = NotFound desc = could not find container \"665acc176ec5402913b1f5cdcf7d2c25a8fe3312c1bf5563b6b7c01eaa78e011\": container with ID starting with 665acc176ec5402913b1f5cdcf7d2c25a8fe3312c1bf5563b6b7c01eaa78e011 not found: ID does not exist" Apr 23 17:05:27.448242 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:27.448200 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-2lhrl_31965c06-acb6-474c-8ce9-ca34c0c2f2f3/authorino/0.log" Apr 23 17:05:27.511652 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:27.511621 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="217a393c-a2a8-4637-ab52-c6f0023f77e1" path="/var/lib/kubelet/pods/217a393c-a2a8-4637-ab52-c6f0023f77e1/volumes" Apr 23 17:05:27.550394 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:27.550363 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-4x4v6_43ae2150-33ea-4e3b-9b53-b4f92e8c2879/manager/0.log" Apr 23 17:05:28.924162 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:28.924122 2571 generic.go:358] "Generic (PLEG): container finished" podID="3a58a085-7205-46f6-b099-d4e3087a0c18" containerID="5565c8d5445476034a3613e79afdc8343112cb5c4986d7ae78c2e5dccb22a97d" exitCode=0 Apr 23 17:05:28.924570 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:28.924198 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-wjw6m/must-gather-k64cm" event={"ID":"3a58a085-7205-46f6-b099-d4e3087a0c18","Type":"ContainerDied","Data":"5565c8d5445476034a3613e79afdc8343112cb5c4986d7ae78c2e5dccb22a97d"} Apr 23 17:05:28.924612 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:28.924575 2571 scope.go:117] "RemoveContainer" containerID="5565c8d5445476034a3613e79afdc8343112cb5c4986d7ae78c2e5dccb22a97d" Apr 23 17:05:29.192832 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:29.192805 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wjw6m_must-gather-k64cm_3a58a085-7205-46f6-b099-d4e3087a0c18/gather/0.log" Apr 23 17:05:32.766059 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:32.766027 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gnw5g_47cef5f9-f168-4bee-ad96-bcf87f6d22e1/global-pull-secret-syncer/0.log" Apr 23 17:05:32.867055 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:32.867023 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-jdgl4_00650519-0327-483d-83cf-59b7e20fd1f5/konnectivity-agent/0.log" Apr 23 17:05:32.953181 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:32.953154 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-27.ec2.internal_6402b5e4dc46963653aa05278c9bac43/haproxy/0.log" Apr 23 17:05:34.688327 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:34.688294 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wjw6m/must-gather-k64cm"] Apr 23 17:05:34.688818 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:34.688519 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-wjw6m/must-gather-k64cm" podUID="3a58a085-7205-46f6-b099-d4e3087a0c18" containerName="copy" containerID="cri-o://9e015ec8c44bbdc757508d6684e31245b7b767f617100d0aaf4adeccc6f66d30" gracePeriod=2 Apr 23 17:05:34.691156 
ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:34.691102 2571 status_manager.go:895] "Failed to get status for pod" podUID="3a58a085-7205-46f6-b099-d4e3087a0c18" pod="openshift-must-gather-wjw6m/must-gather-k64cm" err="pods \"must-gather-k64cm\" is forbidden: User \"system:node:ip-10-0-136-27.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-wjw6m\": no relationship found between node 'ip-10-0-136-27.ec2.internal' and this object" Apr 23 17:05:34.692748 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:34.692727 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wjw6m/must-gather-k64cm"] Apr 23 17:05:34.916551 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:34.916526 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wjw6m_must-gather-k64cm_3a58a085-7205-46f6-b099-d4e3087a0c18/copy/0.log" Apr 23 17:05:34.916857 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:34.916844 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wjw6m/must-gather-k64cm" Apr 23 17:05:34.920011 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:34.919983 2571 status_manager.go:895] "Failed to get status for pod" podUID="3a58a085-7205-46f6-b099-d4e3087a0c18" pod="openshift-must-gather-wjw6m/must-gather-k64cm" err="pods \"must-gather-k64cm\" is forbidden: User \"system:node:ip-10-0-136-27.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-wjw6m\": no relationship found between node 'ip-10-0-136-27.ec2.internal' and this object" Apr 23 17:05:34.949228 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:34.949210 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wjw6m_must-gather-k64cm_3a58a085-7205-46f6-b099-d4e3087a0c18/copy/0.log" Apr 23 17:05:34.949533 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:34.949511 2571 generic.go:358] "Generic (PLEG): container finished" podID="3a58a085-7205-46f6-b099-d4e3087a0c18" containerID="9e015ec8c44bbdc757508d6684e31245b7b767f617100d0aaf4adeccc6f66d30" exitCode=143 Apr 23 17:05:34.949619 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:34.949585 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wjw6m/must-gather-k64cm"
Apr 23 17:05:34.949619 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:34.949613 2571 scope.go:117] "RemoveContainer" containerID="9e015ec8c44bbdc757508d6684e31245b7b767f617100d0aaf4adeccc6f66d30"
Apr 23 17:05:34.952594 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:34.952570 2571 status_manager.go:895] "Failed to get status for pod" podUID="3a58a085-7205-46f6-b099-d4e3087a0c18" pod="openshift-must-gather-wjw6m/must-gather-k64cm" err="pods \"must-gather-k64cm\" is forbidden: User \"system:node:ip-10-0-136-27.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-wjw6m\": no relationship found between node 'ip-10-0-136-27.ec2.internal' and this object"
Apr 23 17:05:34.957397 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:34.957379 2571 scope.go:117] "RemoveContainer" containerID="5565c8d5445476034a3613e79afdc8343112cb5c4986d7ae78c2e5dccb22a97d"
Apr 23 17:05:34.970952 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:34.970932 2571 scope.go:117] "RemoveContainer" containerID="9e015ec8c44bbdc757508d6684e31245b7b767f617100d0aaf4adeccc6f66d30"
Apr 23 17:05:34.971208 ip-10-0-136-27 kubenswrapper[2571]: E0423 17:05:34.971190 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e015ec8c44bbdc757508d6684e31245b7b767f617100d0aaf4adeccc6f66d30\": container with ID starting with 9e015ec8c44bbdc757508d6684e31245b7b767f617100d0aaf4adeccc6f66d30 not found: ID does not exist" containerID="9e015ec8c44bbdc757508d6684e31245b7b767f617100d0aaf4adeccc6f66d30"
Apr 23 17:05:34.971257 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:34.971219 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e015ec8c44bbdc757508d6684e31245b7b767f617100d0aaf4adeccc6f66d30"} err="failed to get container status \"9e015ec8c44bbdc757508d6684e31245b7b767f617100d0aaf4adeccc6f66d30\": rpc error: code = NotFound desc = could not find container \"9e015ec8c44bbdc757508d6684e31245b7b767f617100d0aaf4adeccc6f66d30\": container with ID starting with 9e015ec8c44bbdc757508d6684e31245b7b767f617100d0aaf4adeccc6f66d30 not found: ID does not exist"
Apr 23 17:05:34.971257 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:34.971240 2571 scope.go:117] "RemoveContainer" containerID="5565c8d5445476034a3613e79afdc8343112cb5c4986d7ae78c2e5dccb22a97d"
Apr 23 17:05:34.971471 ip-10-0-136-27 kubenswrapper[2571]: E0423 17:05:34.971454 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5565c8d5445476034a3613e79afdc8343112cb5c4986d7ae78c2e5dccb22a97d\": container with ID starting with 5565c8d5445476034a3613e79afdc8343112cb5c4986d7ae78c2e5dccb22a97d not found: ID does not exist" containerID="5565c8d5445476034a3613e79afdc8343112cb5c4986d7ae78c2e5dccb22a97d"
Apr 23 17:05:34.971509 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:34.971476 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5565c8d5445476034a3613e79afdc8343112cb5c4986d7ae78c2e5dccb22a97d"} err="failed to get container status \"5565c8d5445476034a3613e79afdc8343112cb5c4986d7ae78c2e5dccb22a97d\": rpc error: code = NotFound desc = could not find container \"5565c8d5445476034a3613e79afdc8343112cb5c4986d7ae78c2e5dccb22a97d\": container with ID starting with 5565c8d5445476034a3613e79afdc8343112cb5c4986d7ae78c2e5dccb22a97d not found: ID does not exist"
Apr 23 17:05:35.030003 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:35.029969 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvl6l\" (UniqueName: \"kubernetes.io/projected/3a58a085-7205-46f6-b099-d4e3087a0c18-kube-api-access-hvl6l\") pod \"3a58a085-7205-46f6-b099-d4e3087a0c18\" (UID: \"3a58a085-7205-46f6-b099-d4e3087a0c18\") "
Apr 23 17:05:35.030159 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:35.030041 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3a58a085-7205-46f6-b099-d4e3087a0c18-must-gather-output\") pod \"3a58a085-7205-46f6-b099-d4e3087a0c18\" (UID: \"3a58a085-7205-46f6-b099-d4e3087a0c18\") "
Apr 23 17:05:35.032372 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:35.032332 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a58a085-7205-46f6-b099-d4e3087a0c18-kube-api-access-hvl6l" (OuterVolumeSpecName: "kube-api-access-hvl6l") pod "3a58a085-7205-46f6-b099-d4e3087a0c18" (UID: "3a58a085-7205-46f6-b099-d4e3087a0c18"). InnerVolumeSpecName "kube-api-access-hvl6l". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 17:05:35.035773 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:35.035748 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a58a085-7205-46f6-b099-d4e3087a0c18-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "3a58a085-7205-46f6-b099-d4e3087a0c18" (UID: "3a58a085-7205-46f6-b099-d4e3087a0c18"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:05:35.131354 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:35.131325 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hvl6l\" (UniqueName: \"kubernetes.io/projected/3a58a085-7205-46f6-b099-d4e3087a0c18-kube-api-access-hvl6l\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\""
Apr 23 17:05:35.131354 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:35.131351 2571 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3a58a085-7205-46f6-b099-d4e3087a0c18-must-gather-output\") on node \"ip-10-0-136-27.ec2.internal\" DevicePath \"\""
Apr 23 17:05:35.259454 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:35.259423 2571 status_manager.go:895] "Failed to get status for pod" podUID="3a58a085-7205-46f6-b099-d4e3087a0c18" pod="openshift-must-gather-wjw6m/must-gather-k64cm" err="pods \"must-gather-k64cm\" is forbidden: User \"system:node:ip-10-0-136-27.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-wjw6m\": no relationship found between node 'ip-10-0-136-27.ec2.internal' and this object"
Apr 23 17:05:35.513133 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:35.513034 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a58a085-7205-46f6-b099-d4e3087a0c18" path="/var/lib/kubelet/pods/3a58a085-7205-46f6-b099-d4e3087a0c18/volumes"
Apr 23 17:05:36.059452 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:36.059419 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-2lhrl_31965c06-acb6-474c-8ce9-ca34c0c2f2f3/authorino/0.log"
Apr 23 17:05:36.237651 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:36.237618 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-4x4v6_43ae2150-33ea-4e3b-9b53-b4f92e8c2879/manager/0.log"
Apr 23 17:05:37.696736 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:37.696684 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1b312748-82f4-4085-bc72-4d5fc5694a9b/alertmanager/0.log"
Apr 23 17:05:37.719208 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:37.719179 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1b312748-82f4-4085-bc72-4d5fc5694a9b/config-reloader/0.log"
Apr 23 17:05:37.740978 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:37.740943 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1b312748-82f4-4085-bc72-4d5fc5694a9b/kube-rbac-proxy-web/0.log"
Apr 23 17:05:37.761681 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:37.761654 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1b312748-82f4-4085-bc72-4d5fc5694a9b/kube-rbac-proxy/0.log"
Apr 23 17:05:37.782855 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:37.782822 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1b312748-82f4-4085-bc72-4d5fc5694a9b/kube-rbac-proxy-metric/0.log"
Apr 23 17:05:37.804112 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:37.804090 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1b312748-82f4-4085-bc72-4d5fc5694a9b/prom-label-proxy/0.log"
Apr 23 17:05:37.826606 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:37.826575 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1b312748-82f4-4085-bc72-4d5fc5694a9b/init-config-reloader/0.log"
Apr 23 17:05:37.869228 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:37.869188 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-gmq6g_4984906f-2911-43eb-982b-401a7c9fbc32/cluster-monitoring-operator/0.log"
Apr 23 17:05:37.893858 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:37.893830 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xzd44_2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7/kube-state-metrics/0.log"
Apr 23 17:05:37.915371 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:37.915343 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xzd44_2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7/kube-rbac-proxy-main/0.log"
Apr 23 17:05:37.935187 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:37.935159 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xzd44_2eb8a4dd-b6bb-4cbb-849e-6fb6c06857b7/kube-rbac-proxy-self/0.log"
Apr 23 17:05:37.963034 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:37.962959 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5b568c66f8-rk9gr_a7584369-a97b-4fb0-9628-0b8f04bb1761/metrics-server/0.log"
Apr 23 17:05:38.019517 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:38.019486 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9kztc_7a0e4d71-73b2-469c-a183-6f2dd1c34d66/node-exporter/0.log"
Apr 23 17:05:38.038263 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:38.038235 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9kztc_7a0e4d71-73b2-469c-a183-6f2dd1c34d66/kube-rbac-proxy/0.log"
Apr 23 17:05:38.057241 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:38.057210 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9kztc_7a0e4d71-73b2-469c-a183-6f2dd1c34d66/init-textfile/0.log"
Apr 23 17:05:38.409756 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:38.409651 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-br4xh_65408c14-ea04-45c3-8378-2138509045ff/prometheus-operator/0.log"
Apr 23 17:05:38.428112 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:38.428081 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-br4xh_65408c14-ea04-45c3-8378-2138509045ff/kube-rbac-proxy/0.log"
Apr 23 17:05:38.480832 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:38.480804 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6b6f8cdb5f-ng72m_f57aef30-fe57-482d-ad55-41be2b64efec/telemeter-client/0.log"
Apr 23 17:05:38.530133 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:38.530102 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6b6f8cdb5f-ng72m_f57aef30-fe57-482d-ad55-41be2b64efec/reload/0.log"
Apr 23 17:05:38.566344 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:38.566316 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6b6f8cdb5f-ng72m_f57aef30-fe57-482d-ad55-41be2b64efec/kube-rbac-proxy/0.log"
Apr 23 17:05:40.276660 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:40.276624 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-7fsct_bb5d4845-68ad-41c5-bc80-0d399b962c20/networking-console-plugin/0.log"
Apr 23 17:05:40.839605 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:40.839569 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8sjkd_036b63b0-d570-44cc-b606-bb46f38e6753/console-operator/1.log"
Apr 23 17:05:40.848153 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:40.848129 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8sjkd_036b63b0-d570-44cc-b606-bb46f38e6753/console-operator/2.log"
Apr 23 17:05:41.707674 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.707635 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"]
Apr 23 17:05:41.708190 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.708175 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="217a393c-a2a8-4637-ab52-c6f0023f77e1" containerName="main"
Apr 23 17:05:41.708262 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.708196 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="217a393c-a2a8-4637-ab52-c6f0023f77e1" containerName="main"
Apr 23 17:05:41.708262 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.708208 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a58a085-7205-46f6-b099-d4e3087a0c18" containerName="gather"
Apr 23 17:05:41.708262 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.708216 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a58a085-7205-46f6-b099-d4e3087a0c18" containerName="gather"
Apr 23 17:05:41.708262 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.708238 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="217a393c-a2a8-4637-ab52-c6f0023f77e1" containerName="storage-initializer"
Apr 23 17:05:41.708262 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.708246 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="217a393c-a2a8-4637-ab52-c6f0023f77e1" containerName="storage-initializer"
Apr 23 17:05:41.708262 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.708256 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a58a085-7205-46f6-b099-d4e3087a0c18" containerName="copy"
Apr 23 17:05:41.708555 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.708264 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a58a085-7205-46f6-b099-d4e3087a0c18" containerName="copy"
Apr 23 17:05:41.708555 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.708363 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a58a085-7205-46f6-b099-d4e3087a0c18" containerName="copy"
Apr 23 17:05:41.708555 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.708378 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a58a085-7205-46f6-b099-d4e3087a0c18" containerName="gather"
Apr 23 17:05:41.708555 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.708394 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="217a393c-a2a8-4637-ab52-c6f0023f77e1" containerName="main"
Apr 23 17:05:41.715011 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.714989 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"
Apr 23 17:05:41.718099 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.718052 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2fptc\"/\"openshift-service-ca.crt\""
Apr 23 17:05:41.718261 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.718131 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2fptc\"/\"kube-root-ca.crt\""
Apr 23 17:05:41.719306 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.719288 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2fptc\"/\"default-dockercfg-wnz7f\""
Apr 23 17:05:41.720015 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.719993 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"]
Apr 23 17:05:41.793441 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.793399 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fb121160-cd4f-41dc-b82d-da3bbfbfee62-podres\") pod \"perf-node-gather-daemonset-zmzr8\" (UID: \"fb121160-cd4f-41dc-b82d-da3bbfbfee62\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"
Apr 23 17:05:41.793627 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.793452 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fb121160-cd4f-41dc-b82d-da3bbfbfee62-lib-modules\") pod \"perf-node-gather-daemonset-zmzr8\" (UID: \"fb121160-cd4f-41dc-b82d-da3bbfbfee62\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"
Apr 23 17:05:41.793627 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.793507 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb121160-cd4f-41dc-b82d-da3bbfbfee62-sys\") pod \"perf-node-gather-daemonset-zmzr8\" (UID: \"fb121160-cd4f-41dc-b82d-da3bbfbfee62\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"
Apr 23 17:05:41.793763 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.793654 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fb121160-cd4f-41dc-b82d-da3bbfbfee62-proc\") pod \"perf-node-gather-daemonset-zmzr8\" (UID: \"fb121160-cd4f-41dc-b82d-da3bbfbfee62\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"
Apr 23 17:05:41.793807 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.793785 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f69m9\" (UniqueName: \"kubernetes.io/projected/fb121160-cd4f-41dc-b82d-da3bbfbfee62-kube-api-access-f69m9\") pod \"perf-node-gather-daemonset-zmzr8\" (UID: \"fb121160-cd4f-41dc-b82d-da3bbfbfee62\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"
Apr 23 17:05:41.868505 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.868451 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-v4lvm_d2276480-4d87-4114-b668-458483b817d5/volume-data-source-validator/0.log"
Apr 23 17:05:41.894931 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.894904 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fb121160-cd4f-41dc-b82d-da3bbfbfee62-proc\") pod \"perf-node-gather-daemonset-zmzr8\" (UID: \"fb121160-cd4f-41dc-b82d-da3bbfbfee62\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"
Apr 23 17:05:41.895095 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.894961 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f69m9\" (UniqueName: \"kubernetes.io/projected/fb121160-cd4f-41dc-b82d-da3bbfbfee62-kube-api-access-f69m9\") pod \"perf-node-gather-daemonset-zmzr8\" (UID: \"fb121160-cd4f-41dc-b82d-da3bbfbfee62\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"
Apr 23 17:05:41.895095 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.895000 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fb121160-cd4f-41dc-b82d-da3bbfbfee62-podres\") pod \"perf-node-gather-daemonset-zmzr8\" (UID: \"fb121160-cd4f-41dc-b82d-da3bbfbfee62\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"
Apr 23 17:05:41.895095 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.895021 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fb121160-cd4f-41dc-b82d-da3bbfbfee62-lib-modules\") pod \"perf-node-gather-daemonset-zmzr8\" (UID: \"fb121160-cd4f-41dc-b82d-da3bbfbfee62\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"
Apr 23 17:05:41.895095 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.895022 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fb121160-cd4f-41dc-b82d-da3bbfbfee62-proc\") pod \"perf-node-gather-daemonset-zmzr8\" (UID: \"fb121160-cd4f-41dc-b82d-da3bbfbfee62\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"
Apr 23 17:05:41.895095 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.895037 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb121160-cd4f-41dc-b82d-da3bbfbfee62-sys\") pod \"perf-node-gather-daemonset-zmzr8\" (UID: \"fb121160-cd4f-41dc-b82d-da3bbfbfee62\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"
Apr 23 17:05:41.895302 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.895125 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb121160-cd4f-41dc-b82d-da3bbfbfee62-sys\") pod \"perf-node-gather-daemonset-zmzr8\" (UID: \"fb121160-cd4f-41dc-b82d-da3bbfbfee62\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"
Apr 23 17:05:41.895302 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.895138 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fb121160-cd4f-41dc-b82d-da3bbfbfee62-lib-modules\") pod \"perf-node-gather-daemonset-zmzr8\" (UID: \"fb121160-cd4f-41dc-b82d-da3bbfbfee62\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"
Apr 23 17:05:41.895302 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.895133 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fb121160-cd4f-41dc-b82d-da3bbfbfee62-podres\") pod \"perf-node-gather-daemonset-zmzr8\" (UID: \"fb121160-cd4f-41dc-b82d-da3bbfbfee62\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"
Apr 23 17:05:41.907128 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:41.907098 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f69m9\" (UniqueName: \"kubernetes.io/projected/fb121160-cd4f-41dc-b82d-da3bbfbfee62-kube-api-access-f69m9\") pod \"perf-node-gather-daemonset-zmzr8\" (UID: \"fb121160-cd4f-41dc-b82d-da3bbfbfee62\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"
Apr 23 17:05:42.025652 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:42.025558 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"
Apr 23 17:05:42.150076 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:42.150051 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"]
Apr 23 17:05:42.152645 ip-10-0-136-27 kubenswrapper[2571]: W0423 17:05:42.152603 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfb121160_cd4f_41dc_b82d_da3bbfbfee62.slice/crio-bcf81d017a2edefec75bf20340384f1e685bf4ebf5abb8deb2ae0fea48857315 WatchSource:0}: Error finding container bcf81d017a2edefec75bf20340384f1e685bf4ebf5abb8deb2ae0fea48857315: Status 404 returned error can't find the container with id bcf81d017a2edefec75bf20340384f1e685bf4ebf5abb8deb2ae0fea48857315
Apr 23 17:05:42.626347 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:42.626307 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mfgns_1026d702-16e3-45e1-821e-0f0a702f27d3/dns/0.log"
Apr 23 17:05:42.644871 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:42.644834 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mfgns_1026d702-16e3-45e1-821e-0f0a702f27d3/kube-rbac-proxy/0.log"
Apr 23 17:05:42.665345 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:42.665314 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6stz5_33e0f7f2-93d8-459f-9c61-240a8cdad803/dns-node-resolver/0.log"
Apr 23 17:05:42.980271 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:42.980236 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8" event={"ID":"fb121160-cd4f-41dc-b82d-da3bbfbfee62","Type":"ContainerStarted","Data":"eba97313865accab67010cb2183a412672b001734cd339c6baaa759e6b1c66a4"}
Apr 23 17:05:42.980659 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:42.980278 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8" event={"ID":"fb121160-cd4f-41dc-b82d-da3bbfbfee62","Type":"ContainerStarted","Data":"bcf81d017a2edefec75bf20340384f1e685bf4ebf5abb8deb2ae0fea48857315"}
Apr 23 17:05:42.980659 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:42.980308 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"
Apr 23 17:05:42.998811 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:42.998763 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8" podStartSLOduration=1.998747966 podStartE2EDuration="1.998747966s" podCreationTimestamp="2026-04-23 17:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:05:42.997286579 +0000 UTC m=+1826.078729148" watchObservedRunningTime="2026-04-23 17:05:42.998747966 +0000 UTC m=+1826.080190534"
Apr 23 17:05:43.153798 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:43.153767 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-d2k5z_7174271c-a85b-4c6d-872b-f2b384da443b/node-ca/0.log"
Apr 23 17:05:44.017272 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:44.017245 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-6b94bb86d8-zhr4d_59b49f8a-980e-4e40-a688-5df2c05a7ba3/istio-proxy/0.log"
Apr 23 17:05:44.039286 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:44.039255 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5b5c5f8b9d-dcw7k_9a0573ec-1640-4a5f-a88d-42fbaeb67495/router/0.log"
Apr 23 17:05:44.469204 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:44.469153 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-bbhgp_ec4a3bc4-5872-47af-b0d2-34143e0f2dea/serve-healthcheck-canary/0.log"
Apr 23 17:05:44.904507 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:44.904414 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-d97rf_b909c180-8bf1-44a4-8a87-c6d4756c787a/insights-operator/0.log"
Apr 23 17:05:44.906745 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:44.906716 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-d97rf_b909c180-8bf1-44a4-8a87-c6d4756c787a/insights-operator/1.log"
Apr 23 17:05:45.030390 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:45.030355 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-z2gnb_cf289e76-9f00-4d50-b667-8c9cee95e651/kube-rbac-proxy/0.log"
Apr 23 17:05:45.048732 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:45.048618 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-z2gnb_cf289e76-9f00-4d50-b667-8c9cee95e651/exporter/0.log"
Apr 23 17:05:45.068283 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:45.068254 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-z2gnb_cf289e76-9f00-4d50-b667-8c9cee95e651/extractor/0.log"
Apr 23 17:05:47.609027 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:47.608996 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-b454c4fb-vqlsf_3b9f39e9-424e-413a-be6b-cbccba14148c/manager/0.log"
Apr 23 17:05:48.994028 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:48.993997 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-zmzr8"
Apr 23 17:05:52.865515 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:52.865485 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zndmd_b89977fa-d779-4ac7-93ec-ff738b252b10/migrator/0.log"
Apr 23 17:05:52.882959 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:52.882934 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zndmd_b89977fa-d779-4ac7-93ec-ff738b252b10/graceful-termination/0.log"
Apr 23 17:05:54.306834 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:54.306806 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7tsv_31d4c6c2-69cd-4240-b617-6cc884b17481/kube-multus-additional-cni-plugins/0.log"
Apr 23 17:05:54.328193 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:54.328159 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7tsv_31d4c6c2-69cd-4240-b617-6cc884b17481/egress-router-binary-copy/0.log"
Apr 23 17:05:54.346748 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:54.346716 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7tsv_31d4c6c2-69cd-4240-b617-6cc884b17481/cni-plugins/0.log"
Apr 23 17:05:54.364329 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:54.364304 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7tsv_31d4c6c2-69cd-4240-b617-6cc884b17481/bond-cni-plugin/0.log"
Apr 23 17:05:54.381564 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:54.381537 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7tsv_31d4c6c2-69cd-4240-b617-6cc884b17481/routeoverride-cni/0.log"
Apr 23 17:05:54.399278 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:54.399246 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7tsv_31d4c6c2-69cd-4240-b617-6cc884b17481/whereabouts-cni-bincopy/0.log"
Apr 23 17:05:54.418764 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:54.418738 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7tsv_31d4c6c2-69cd-4240-b617-6cc884b17481/whereabouts-cni/0.log"
Apr 23 17:05:54.594747 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:54.594650 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qgbqb_9fbf01c4-8974-4a69-881e-b57e55f7b1f1/kube-multus/0.log"
Apr 23 17:05:54.724828 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:54.724799 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hsxbc_9f25a094-e342-4690-8028-f1a3ddd77829/network-metrics-daemon/0.log"
Apr 23 17:05:54.740883 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:54.740848 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hsxbc_9f25a094-e342-4690-8028-f1a3ddd77829/kube-rbac-proxy/0.log"
Apr 23 17:05:55.524065 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:55.524030 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jbv88_9c4113f8-445a-41a3-afe0-4d920d77c9c9/ovn-controller/0.log"
Apr 23 17:05:55.555131 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:55.555097 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jbv88_9c4113f8-445a-41a3-afe0-4d920d77c9c9/ovn-acl-logging/0.log"
Apr 23 17:05:55.576584 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:55.576555 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jbv88_9c4113f8-445a-41a3-afe0-4d920d77c9c9/kube-rbac-proxy-node/0.log"
Apr 23 17:05:55.597394 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:55.597366 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jbv88_9c4113f8-445a-41a3-afe0-4d920d77c9c9/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 17:05:55.612944 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:55.612920 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jbv88_9c4113f8-445a-41a3-afe0-4d920d77c9c9/northd/0.log"
Apr 23 17:05:55.630811 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:55.630777 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jbv88_9c4113f8-445a-41a3-afe0-4d920d77c9c9/nbdb/0.log"
Apr 23 17:05:55.648856 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:55.648831 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jbv88_9c4113f8-445a-41a3-afe0-4d920d77c9c9/sbdb/0.log"
Apr 23 17:05:55.830436 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:55.830360 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jbv88_9c4113f8-445a-41a3-afe0-4d920d77c9c9/ovnkube-controller/0.log"
Apr 23 17:05:57.518296 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:57.518267 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-6mngq_5c592392-1c6a-4812-86ed-e1ed2f002ce0/check-endpoints/0.log"
Apr 23 17:05:57.578765 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:57.578732 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-vrntx_7c952840-e4f8-4b49-9f90-0d7aa2618091/network-check-target-container/0.log"
Apr 23 17:05:58.585424 ip-10-0-136-27 kubenswrapper[2571]: I0423 17:05:58.585394 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-tft7d_9c427934-9049-45a2-bbd8-6cb89f9149e2/iptables-alerter/0.log"