Apr 17 07:51:36.087986 ip-10-0-128-217 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 07:51:36.568382 ip-10-0-128-217 kubenswrapper[2566]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:51:36.568382 ip-10-0-128-217 kubenswrapper[2566]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 07:51:36.568382 ip-10-0-128-217 kubenswrapper[2566]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:51:36.568382 ip-10-0-128-217 kubenswrapper[2566]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 07:51:36.568382 ip-10-0-128-217 kubenswrapper[2566]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
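The deprecation warnings above all point at the same remedy: move these command-line flags into the file passed via --config. A minimal sketch of the corresponding KubeletConfiguration stanzas follows; the field names come from the upstream kubelet config API, but the values shown here are illustrative placeholders, not taken from this node:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (example reservation, not this node's values)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: 100Mi
```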
Apr 17 07:51:36.570488 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.570397 2566 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 07:51:36.573468 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573451 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:36.573468 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573468 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:36.573532 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573472 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:36.573532 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573476 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:36.573532 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573479 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:36.573532 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573482 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:36.573532 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573485 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:36.573532 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573488 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:36.573532 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573492 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:36.573532 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573495 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:36.573532 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573498 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:36.573532 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573501 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:36.573532 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573504 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:36.573532 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573507 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:36.573532 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573509 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:36.573532 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573512 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:36.573532 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573515 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:36.573532 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573517 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:36.573532 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573520 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:36.573532 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573523 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:36.573532 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573525 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:36.574036 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573528 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:36.574036 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573530 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:36.574036 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573533 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:36.574036 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573535 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:36.574036 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573538 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:36.574036 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573541 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:36.574036 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573544 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:36.574036 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573547 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:36.574036 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573549 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:36.574036 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573552 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:36.574036 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573554 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:36.574036 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573557 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:36.574036 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573560 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:36.574036 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573562 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:36.574036 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573565 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:36.574036 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573568 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:36.574036 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573570 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:36.574036 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573573 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:36.574036 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573575 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:36.574036 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573578 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:36.574533 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573580 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:36.574533 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573582 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:36.574533 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573585 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:36.574533 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573589 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:36.574533 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573591 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:36.574533 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573594 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:36.574533 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573596 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:36.574533 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573599 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:36.574533 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573603 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:36.574533 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573606 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:36.574533 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573609 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:36.574533 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573612 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:36.574533 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573615 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:36.574533 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573620 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:36.574533 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573623 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:36.574533 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573626 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:36.574533 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573629 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:36.574533 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573632 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:36.574533 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573635 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:36.575019 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573637 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:36.575019 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573640 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:36.575019 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573643 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:36.575019 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573645 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:36.575019 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573648 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:36.575019 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573651 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:36.575019 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573654 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:36.575019 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573656 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:36.575019 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573659 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:36.575019 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573661 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:36.575019 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573664 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:36.575019 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573668 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:36.575019 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573672 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:36.575019 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573675 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:36.575019 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573678 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:36.575019 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573680 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:36.575019 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573683 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:36.575019 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573685 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:36.575019 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573687 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:36.575019 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573690 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:36.575500 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573705 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:36.575500 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573708 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:36.575500 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573711 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:36.575500 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573714 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:36.575500 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573717 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:36.575500 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.573720 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577666 2566 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577677 2566 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577683 2566 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577688 2566 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577704 2566 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577708 2566 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577712 2566 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577717 2566 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577720 2566 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577723 2566 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577727 2566 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577730 2566 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577733 2566 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577736 2566 flags.go:64] FLAG: --cgroup-root=""
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577739 2566 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577742 2566 flags.go:64] FLAG: --client-ca-file=""
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577745 2566 flags.go:64] FLAG: --cloud-config=""
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577748 2566 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577751 2566 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577756 2566 flags.go:64] FLAG: --cluster-domain=""
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577759 2566 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577762 2566 flags.go:64] FLAG: --config-dir=""
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577765 2566 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577769 2566 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 07:51:36.577847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577773 2566 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577777 2566 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577780 2566 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577783 2566 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577786 2566 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577788 2566 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577792 2566 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577795 2566 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577798 2566 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577803 2566 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577807 2566 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577810 2566 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577814 2566 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577817 2566 flags.go:64] FLAG: --enable-server="true"
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577820 2566 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577825 2566 flags.go:64] FLAG: --event-burst="100"
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577828 2566 flags.go:64] FLAG: --event-qps="50"
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577831 2566 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577834 2566 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577837 2566 flags.go:64] FLAG: --eviction-hard=""
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577841 2566 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577844 2566 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577847 2566 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577850 2566 flags.go:64] FLAG: --eviction-soft=""
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577854 2566 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 07:51:36.578453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577857 2566 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577860 2566 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577863 2566 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577866 2566 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577869 2566 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577872 2566 flags.go:64] FLAG: --feature-gates=""
Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577876 2566 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 07:51:36.579072 ip-10-0-128-217
kubenswrapper[2566]: I0417 07:51:36.577881 2566 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577884 2566 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577887 2566 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577891 2566 flags.go:64] FLAG: --healthz-port="10248" Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577894 2566 flags.go:64] FLAG: --help="false" Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577897 2566 flags.go:64] FLAG: --hostname-override="ip-10-0-128-217.ec2.internal" Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577900 2566 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577903 2566 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577906 2566 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577910 2566 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577913 2566 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577916 2566 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577919 2566 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577922 2566 flags.go:64] FLAG: 
--kernel-memcg-notification="false" Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577925 2566 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577928 2566 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577931 2566 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 07:51:36.579072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577934 2566 flags.go:64] FLAG: --kube-reserved="" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577937 2566 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577940 2566 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577943 2566 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577946 2566 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577949 2566 flags.go:64] FLAG: --lock-file="" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577951 2566 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577954 2566 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577957 2566 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577963 2566 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577966 2566 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 07:51:36.579653 ip-10-0-128-217 
kubenswrapper[2566]: I0417 07:51:36.577968 2566 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577971 2566 flags.go:64] FLAG: --logging-format="text" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577974 2566 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577978 2566 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577980 2566 flags.go:64] FLAG: --manifest-url="" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577984 2566 flags.go:64] FLAG: --manifest-url-header="" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577988 2566 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577991 2566 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577995 2566 flags.go:64] FLAG: --max-pods="110" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.577999 2566 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578002 2566 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578005 2566 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578008 2566 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578011 2566 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 07:51:36.579653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578014 2566 flags.go:64] 
FLAG: --node-ip="0.0.0.0" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578017 2566 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578030 2566 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578033 2566 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578036 2566 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578039 2566 flags.go:64] FLAG: --pod-cidr="" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578042 2566 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578048 2566 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578051 2566 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578055 2566 flags.go:64] FLAG: --pods-per-core="0" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578058 2566 flags.go:64] FLAG: --port="10250" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578061 2566 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578063 2566 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f06b5172cca62722" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578067 2566 flags.go:64] FLAG: --qos-reserved="" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578070 
2566 flags.go:64] FLAG: --read-only-port="10255" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578073 2566 flags.go:64] FLAG: --register-node="true" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578076 2566 flags.go:64] FLAG: --register-schedulable="true" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578078 2566 flags.go:64] FLAG: --register-with-taints="" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578082 2566 flags.go:64] FLAG: --registry-burst="10" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578085 2566 flags.go:64] FLAG: --registry-qps="5" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578088 2566 flags.go:64] FLAG: --reserved-cpus="" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578091 2566 flags.go:64] FLAG: --reserved-memory="" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578095 2566 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578098 2566 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578101 2566 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 07:51:36.580277 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578104 2566 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578107 2566 flags.go:64] FLAG: --runonce="false" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578110 2566 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578113 2566 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 
07:51:36.578116 2566 flags.go:64] FLAG: --seccomp-default="false" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578119 2566 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578122 2566 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578125 2566 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578128 2566 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578131 2566 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578134 2566 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578137 2566 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578140 2566 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578143 2566 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578146 2566 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578149 2566 flags.go:64] FLAG: --system-cgroups="" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578152 2566 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578157 2566 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578160 2566 flags.go:64] FLAG: 
--tls-cert-file="" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578163 2566 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578167 2566 flags.go:64] FLAG: --tls-min-version="" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578170 2566 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578173 2566 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578176 2566 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578179 2566 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 07:51:36.580888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578182 2566 flags.go:64] FLAG: --v="2" Apr 17 07:51:36.581490 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578188 2566 flags.go:64] FLAG: --version="false" Apr 17 07:51:36.581490 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578192 2566 flags.go:64] FLAG: --vmodule="" Apr 17 07:51:36.581490 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578197 2566 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 07:51:36.581490 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.578200 2566 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 07:51:36.581490 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578297 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 07:51:36.581490 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578301 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 07:51:36.581490 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578305 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 07:51:36.581490 
ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578307 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 07:51:36.581490 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578310 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 07:51:36.581490 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578313 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 07:51:36.581490 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578315 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 07:51:36.581490 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578318 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 07:51:36.581490 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578321 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 07:51:36.581490 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578323 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 07:51:36.581490 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578326 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 07:51:36.581490 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578328 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 07:51:36.581490 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578331 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 07:51:36.581490 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578333 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 07:51:36.581490 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578337 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 07:51:36.581490 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578339 2566 feature_gate.go:328] 
unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 07:51:36.581490 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578342 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 07:51:36.582085 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578345 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 07:51:36.582085 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578347 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 07:51:36.582085 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578350 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 07:51:36.582085 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578353 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 07:51:36.582085 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578357 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 07:51:36.582085 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578360 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 07:51:36.582085 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578363 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 07:51:36.582085 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578365 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 07:51:36.582085 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578368 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 07:51:36.582085 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578371 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 07:51:36.582085 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578374 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 07:51:36.582085 
ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578378 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 07:51:36.582085 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578381 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 07:51:36.582085 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578384 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 07:51:36.582085 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578386 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 07:51:36.582085 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578389 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 07:51:36.582085 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578392 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 07:51:36.582085 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578394 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 07:51:36.582085 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578397 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 07:51:36.582085 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578400 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 07:51:36.582584 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578404 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 07:51:36.582584 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578408 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 07:51:36.582584 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578411 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 07:51:36.582584 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578414 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 07:51:36.582584 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578417 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 07:51:36.582584 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578420 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 07:51:36.582584 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578423 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 07:51:36.582584 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578425 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 07:51:36.582584 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578428 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 07:51:36.582584 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578431 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 07:51:36.582584 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578433 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 07:51:36.582584 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578436 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 07:51:36.582584 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578438 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 07:51:36.582584 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578441 2566 feature_gate.go:328] unrecognized feature gate: 
VSphereMultiNetworks Apr 17 07:51:36.582584 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578443 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 07:51:36.582584 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578446 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 07:51:36.582584 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578451 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 07:51:36.582584 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578453 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 07:51:36.582584 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578456 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 07:51:36.583067 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578459 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 07:51:36.583067 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578461 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 07:51:36.583067 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578464 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 07:51:36.583067 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578467 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 07:51:36.583067 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578471 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 07:51:36.583067 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578474 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 07:51:36.583067 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578476 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 
07:51:36.583067 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578479 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 07:51:36.583067 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578482 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 07:51:36.583067 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578484 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 17 07:51:36.583067 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578487 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 07:51:36.583067 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578489 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 07:51:36.583067 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578493 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 07:51:36.583067 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578496 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 07:51:36.583067 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578499 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 07:51:36.583067 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578501 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 07:51:36.583067 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578503 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 07:51:36.583067 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578506 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 07:51:36.583067 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578508 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 07:51:36.583067 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578511 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider 
Apr 17 07:51:36.583561 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578514 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 07:51:36.583561 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578516 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 07:51:36.583561 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578519 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 07:51:36.583561 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578521 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 07:51:36.583561 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578524 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 07:51:36.583561 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578526 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 07:51:36.583561 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578530 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 07:51:36.583561 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578534 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:36.583561 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578537 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:36.583561 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.578541 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:36.583561 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.579748 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:51:36.587643 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.587516 2566 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 07:51:36.587643 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.587640 2566 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 07:51:36.587796 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587691 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:36.587796 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587715 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:36.587796 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587720 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:36.587796 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587724 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:36.587796 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587728 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:36.587796 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587731 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:36.587796 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587734 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:36.587796 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587737 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:36.587796 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587740 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:36.587796 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587742 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:36.587796 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587745 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:36.587796 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587748 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:36.587796 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587750 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:36.587796 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587753 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:36.587796 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587755 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:36.587796 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587758 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:36.587796 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587761 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:36.587796 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587764 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:36.587796 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587766 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:36.587796 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587769 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:36.588301 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587771 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:36.588301 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587774 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:36.588301 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587777 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:36.588301 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587779 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:36.588301 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587783 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:36.588301 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587785 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:36.588301 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587788 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:36.588301 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587790 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:36.588301 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587793 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:36.588301 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587796 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:36.588301 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587798 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:36.588301 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587801 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:36.588301 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587804 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:36.588301 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587807 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:36.588301 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587810 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:36.588301 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587813 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:36.588301 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587815 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:36.588301 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587818 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:36.588301 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587820 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:36.588301 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587823 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:36.588896 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587825 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:36.588896 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587828 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:36.588896 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587831 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:36.588896 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587833 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:36.588896 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587837 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:36.588896 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587840 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:36.588896 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587843 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:36.588896 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587846 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:36.588896 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587848 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:36.588896 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587851 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:36.588896 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587853 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:36.588896 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587856 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:36.588896 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587859 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:36.588896 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587863 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:36.588896 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587866 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:36.588896 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587869 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:36.588896 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587871 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:36.588896 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587874 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:36.588896 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587877 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:36.589401 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587880 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:36.589401 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587882 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:36.589401 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587885 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:36.589401 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587888 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:36.589401 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587890 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:36.589401 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587893 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:36.589401 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587895 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:36.589401 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587898 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:36.589401 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587901 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:36.589401 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587903 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:36.589401 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587906 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:36.589401 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587908 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:36.589401 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587911 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:36.589401 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587913 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:36.589401 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587916 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:36.589401 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587918 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:36.589401 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587921 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:36.589401 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587924 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:36.589401 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587927 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:36.589401 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587930 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:36.589903 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587934 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:36.589903 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587939 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:36.589903 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587942 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:36.589903 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587945 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:36.589903 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587947 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:36.589903 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587950 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:36.589903 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.587954 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:36.589903 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.587960 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:51:36.589903 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588065 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:36.589903 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588071 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:36.589903 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588074 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:36.589903 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588077 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:36.589903 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588080 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:36.589903 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588083 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:36.589903 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588086 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:36.590284 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588089 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:36.590284 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588092 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:36.590284 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588095 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:36.590284 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588097 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:36.590284 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588100 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:36.590284 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588102 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:36.590284 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588105 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:36.590284 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588107 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:36.590284 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588110 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:36.590284 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588113 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:36.590284 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588117 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:36.590284 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588120 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:36.590284 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588123 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:36.590284 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588126 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:36.590284 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588128 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:36.590284 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588132 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:36.590284 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588143 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:36.590284 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588147 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:36.590284 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588150 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:36.590284 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588153 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:36.590852 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588155 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:36.590852 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588158 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:36.590852 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588161 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:36.590852 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588163 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:36.590852 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588167 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:36.590852 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588169 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:36.590852 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588172 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:36.590852 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588175 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:36.590852 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588178 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:36.590852 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588180 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:36.590852 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588183 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:36.590852 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588186 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:36.590852 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588190 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:36.590852 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588193 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:36.590852 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588197 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:36.590852 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588200 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:36.590852 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588202 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:36.590852 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588205 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:36.590852 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588208 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:36.591330 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588210 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:36.591330 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588213 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:36.591330 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588215 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:36.591330 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588218 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:36.591330 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588220 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:36.591330 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588223 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:36.591330 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588225 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:36.591330 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588228 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:36.591330 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588230 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:36.591330 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588234 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:36.591330 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588236 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:36.591330 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588239 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:36.591330 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588241 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:36.591330 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588244 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:36.591330 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588246 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:36.591330 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588249 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:36.591330 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588252 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:36.591330 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588254 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:36.591330 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588257 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:36.591330 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588260 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:36.591824 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588262 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:36.591824 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588265 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:36.591824 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588267 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:36.591824 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588270 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:36.591824 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588273 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:36.591824 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588275 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:36.591824 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588278 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:36.591824 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588280 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:36.591824 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588283 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:36.591824 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588285 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:36.591824 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588288 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:36.591824 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588290 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:36.591824 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588293 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:36.591824 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588295 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:36.591824 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588298 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:36.591824 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588301 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:36.591824 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588303 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:36.591824 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588306 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:36.591824 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588309 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:36.592294 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:36.588312 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:36.592294 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.588317 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:51:36.592294 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.588432 2566 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 07:51:36.594169 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.594155 2566 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 07:51:36.595116 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.595105 2566 server.go:1019] "Starting client certificate rotation"
Apr 17 07:51:36.595215 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.595197 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 07:51:36.595245 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.595233 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 07:51:36.624389 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.624367 2566 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 07:51:36.629004 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.628981 2566 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 07:51:36.649356 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.649333 2566 log.go:25] "Validated CRI v1 runtime API"
Apr 17 07:51:36.657098 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.657078 2566 log.go:25] "Validated CRI v1 image API"
Apr 17 07:51:36.659367 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.659348 2566 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 07:51:36.663393 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.663371 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 07:51:36.664750 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.664727 2566 fs.go:135] Filesystem UUIDs: map[405ffd5f-2e1a-4486-8d4b-1be984410602:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 be09341d-26bc-4c98-9474-706affe5da2f:/dev/nvme0n1p3]
Apr 17 07:51:36.664831 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.664747 2566 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 07:51:36.670912 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.670809 2566 manager.go:217] Machine: {Timestamp:2026-04-17 07:51:36.66854091 +0000 UTC m=+0.456319504 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099922 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec269fda35c7dc6817043afc24dc61a8 SystemUUID:ec269fda-35c7-dc68-1704-3afc24dc61a8 BootID:5596a7c2-3900-49df-972d-5e3993e7e5e0 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:75:fe:0f:5b:0b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:75:fe:0f:5b:0b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:be:a0:6b:48:a3:80 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 07:51:36.670912 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.670906 2566 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 07:51:36.671063 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.671034 2566 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 07:51:36.672149 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.672116 2566 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 07:51:36.672284 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.672151 2566 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-217.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 07:51:36.672333 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.672293 2566 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 07:51:36.672333 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.672301 2566 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 07:51:36.672333 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.672314 2566 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 07:51:36.672333 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.672330 2566 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 07:51:36.673764 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.673754 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 17 07:51:36.673866 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.673857 2566 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 07:51:36.676295 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.676285 2566 kubelet.go:491] "Attempting to sync node with API server" Apr 17 07:51:36.676336 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.676299 2566 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 07:51:36.677092 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.677082 2566 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 07:51:36.677125 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.677098 2566 kubelet.go:397] "Adding apiserver pod source" Apr 17 07:51:36.677125 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.677107 2566 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 17 07:51:36.678293 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.678277 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 07:51:36.678371 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.678304 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 07:51:36.681432 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.681414 2566 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 07:51:36.682803 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.682790 2566 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 07:51:36.684464 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.684448 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 07:51:36.684464 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.684465 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 07:51:36.684558 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.684472 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 07:51:36.684558 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.684477 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 07:51:36.684558 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.684484 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 07:51:36.684558 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.684490 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 07:51:36.684558 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.684496 2566 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 17 07:51:36.684558 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.684502 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 07:51:36.684558 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.684510 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 07:51:36.684558 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.684515 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 07:51:36.684558 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.684524 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 07:51:36.684558 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.684533 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 07:51:36.688447 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.688431 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 07:51:36.688524 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.688450 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 07:51:36.688629 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:36.688609 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-217.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 07:51:36.688629 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:36.688609 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 07:51:36.692128 ip-10-0-128-217 kubenswrapper[2566]: 
I0417 07:51:36.692116 2566 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 07:51:36.692197 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.692163 2566 server.go:1295] "Started kubelet" Apr 17 07:51:36.692301 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.692253 2566 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 07:51:36.692766 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.692727 2566 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 07:51:36.692811 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.692788 2566 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 07:51:36.693276 ip-10-0-128-217 systemd[1]: Started Kubernetes Kubelet. Apr 17 07:51:36.693922 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.693904 2566 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 07:51:36.695607 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.695592 2566 server.go:317] "Adding debug handlers to kubelet server" Apr 17 07:51:36.698203 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.698186 2566 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-217.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 07:51:36.699317 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.699298 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 07:51:36.699409 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:36.698191 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-128-217.ec2.internal.18a715926ef85a4f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-217.ec2.internal,UID:ip-10-0-128-217.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-217.ec2.internal,},FirstTimestamp:2026-04-17 07:51:36.692128335 +0000 UTC m=+0.479906929,LastTimestamp:2026-04-17 07:51:36.692128335 +0000 UTC m=+0.479906929,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-217.ec2.internal,}" Apr 17 07:51:36.699869 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.699849 2566 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 07:51:36.701773 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.701625 2566 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 07:51:36.701894 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.701884 2566 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 07:51:36.702094 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:36.701791 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found" Apr 17 07:51:36.702188 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.702148 2566 reconstruct.go:97] "Volume reconstruction finished" Apr 17 07:51:36.702188 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.702157 2566 reconciler.go:26] "Reconciler: start to sync state" Apr 17 07:51:36.702402 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.702388 2566 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 07:51:36.702738 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.702711 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" 
logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vqsck" Apr 17 07:51:36.703111 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:36.703075 2566 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 07:51:36.703401 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.703380 2566 factory.go:55] Registering systemd factory Apr 17 07:51:36.703488 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.703418 2566 factory.go:223] Registration of the systemd container factory successfully Apr 17 07:51:36.703712 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.703684 2566 factory.go:153] Registering CRI-O factory Apr 17 07:51:36.703767 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.703716 2566 factory.go:223] Registration of the crio container factory successfully Apr 17 07:51:36.703822 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.703803 2566 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 07:51:36.703867 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.703838 2566 factory.go:103] Registering Raw factory Apr 17 07:51:36.703867 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.703856 2566 manager.go:1196] Started watching for new ooms in manager Apr 17 07:51:36.705118 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.705095 2566 manager.go:319] Starting recovery of all containers Apr 17 07:51:36.709175 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.709152 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vqsck" Apr 17 07:51:36.709271 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:36.709245 2566 controller.go:145] "Failed to ensure lease exists, will 
retry" err="leases.coordination.k8s.io \"ip-10-0-128-217.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 07:51:36.709431 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:36.709407 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 07:51:36.715407 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.715390 2566 manager.go:324] Recovery completed Apr 17 07:51:36.719549 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.719536 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:36.723768 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.723751 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:36.723840 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.723782 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:36.723840 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.723793 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:36.724275 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.724260 2566 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 07:51:36.724275 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.724273 2566 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 07:51:36.724367 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.724288 2566 state_mem.go:36] 
"Initialized new in-memory state store" Apr 17 07:51:36.727060 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.727048 2566 policy_none.go:49] "None policy: Start" Apr 17 07:51:36.727115 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.727064 2566 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 07:51:36.727115 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.727074 2566 state_mem.go:35] "Initializing new in-memory state store" Apr 17 07:51:36.762264 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.762242 2566 manager.go:341] "Starting Device Plugin manager" Apr 17 07:51:36.773616 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:36.762301 2566 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 07:51:36.773616 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.762315 2566 server.go:85] "Starting device plugin registration server" Apr 17 07:51:36.773616 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.762577 2566 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 07:51:36.773616 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.762594 2566 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 07:51:36.773616 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.762677 2566 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 07:51:36.773616 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.762776 2566 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 07:51:36.773616 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.762788 2566 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 07:51:36.773616 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:36.763531 2566 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 07:51:36.773616 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:36.763568 2566 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-217.ec2.internal\" not found" Apr 17 07:51:36.801172 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.801138 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 07:51:36.802391 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.802367 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 07:51:36.802486 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.802401 2566 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 07:51:36.802486 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.802422 2566 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 07:51:36.802486 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.802432 2566 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 07:51:36.802640 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:36.802528 2566 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 07:51:36.807019 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.807002 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:36.862817 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.862745 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:36.865255 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.865237 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:36.865355 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.865288 2566 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:36.865355 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.865301 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:36.865355 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.865322 2566 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-217.ec2.internal" Apr 17 07:51:36.874350 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.874330 2566 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-217.ec2.internal" Apr 17 07:51:36.874404 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:36.874354 2566 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-217.ec2.internal\": node \"ip-10-0-128-217.ec2.internal\" not found" Apr 17 07:51:36.899098 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:36.899073 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found" Apr 17 07:51:36.903369 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.903347 2566 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal"] Apr 17 07:51:36.903427 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.903414 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:36.904240 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.904225 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:36.904291 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.904256 2566 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:36.904291 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.904273 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:36.905737 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.905725 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:36.905889 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.905876 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" Apr 17 07:51:36.905922 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.905906 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:36.906435 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.906418 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:36.906527 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.906442 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:36.906527 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.906420 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:36.906527 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.906475 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:36.906527 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.906484 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" 
event="NodeHasSufficientPID" Apr 17 07:51:36.906527 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.906453 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:36.907745 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.907730 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal" Apr 17 07:51:36.907821 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.907754 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:36.908407 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.908389 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:36.908483 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.908421 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:36.908483 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:36.908434 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:36.939479 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:36.939452 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-217.ec2.internal\" not found" node="ip-10-0-128-217.ec2.internal" Apr 17 07:51:36.943793 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:36.943777 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-217.ec2.internal\" not found" node="ip-10-0-128-217.ec2.internal" Apr 17 07:51:36.999353 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:36.999327 2566 kubelet_node_status.go:515] "Error getting the current node 
from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found" Apr 17 07:51:37.003718 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.003683 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/57a323f22f4a50ec542cb175406e5b82-config\") pod \"kube-apiserver-proxy-ip-10-0-128-217.ec2.internal\" (UID: \"57a323f22f4a50ec542cb175406e5b82\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal" Apr 17 07:51:37.003774 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.003725 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2807b2563fb554c003c51001f381c040-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal\" (UID: \"2807b2563fb554c003c51001f381c040\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" Apr 17 07:51:37.003774 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.003745 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2807b2563fb554c003c51001f381c040-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal\" (UID: \"2807b2563fb554c003c51001f381c040\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" Apr 17 07:51:37.099883 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:37.099846 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found" Apr 17 07:51:37.104207 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.104188 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2807b2563fb554c003c51001f381c040-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal\" (UID: \"2807b2563fb554c003c51001f381c040\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" Apr 17 07:51:37.104255 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.104216 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2807b2563fb554c003c51001f381c040-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal\" (UID: \"2807b2563fb554c003c51001f381c040\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" Apr 17 07:51:37.104255 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.104242 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/57a323f22f4a50ec542cb175406e5b82-config\") pod \"kube-apiserver-proxy-ip-10-0-128-217.ec2.internal\" (UID: \"57a323f22f4a50ec542cb175406e5b82\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal" Apr 17 07:51:37.104316 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.104295 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2807b2563fb554c003c51001f381c040-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal\" (UID: \"2807b2563fb554c003c51001f381c040\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" Apr 17 07:51:37.104348 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.104322 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/57a323f22f4a50ec542cb175406e5b82-config\") pod \"kube-apiserver-proxy-ip-10-0-128-217.ec2.internal\" (UID: \"57a323f22f4a50ec542cb175406e5b82\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal" Apr 17 07:51:37.104348 
ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.104295 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2807b2563fb554c003c51001f381c040-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal\" (UID: \"2807b2563fb554c003c51001f381c040\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal"
Apr 17 07:51:37.200760 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:37.200659 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found"
Apr 17 07:51:37.241077 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.241053 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal"
Apr 17 07:51:37.246521 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.246497 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal"
Apr 17 07:51:37.301335 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:37.301299 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found"
Apr 17 07:51:37.401870 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:37.401842 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found"
Apr 17 07:51:37.502464 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:37.502369 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found"
Apr 17 07:51:37.597006 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.596972 2566 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 07:51:37.597649 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.597131 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 07:51:37.603222 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:37.603199 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found"
Apr 17 07:51:37.699461 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.699433 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 07:51:37.703322 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:37.703303 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found"
Apr 17 07:51:37.711264 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.711230 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 07:46:36 +0000 UTC" deadline="2027-10-23 12:28:40.641350561 +0000 UTC"
Apr 17 07:51:37.711264 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.711264 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13300h37m2.930091049s"
Apr 17 07:51:37.713362 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.713346 2566 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:51:37.716437 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.716420 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 07:51:37.736975 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.736949 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-q7wbg"
Apr 17 07:51:37.746737 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.746711 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-q7wbg"
Apr 17 07:51:37.759426 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:37.759397 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2807b2563fb554c003c51001f381c040.slice/crio-925c2db664d22f7993d4b9d79ac1059a090034ab5a323b09e3e977ef0adc8013 WatchSource:0}: Error finding container 925c2db664d22f7993d4b9d79ac1059a090034ab5a323b09e3e977ef0adc8013: Status 404 returned error can't find the container with id 925c2db664d22f7993d4b9d79ac1059a090034ab5a323b09e3e977ef0adc8013
Apr 17 07:51:37.759712 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:37.759673 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57a323f22f4a50ec542cb175406e5b82.slice/crio-d245aae6c388378ec8957ea6b9b6b2f22ca2147cbb742eaf1a532cca43921f3a WatchSource:0}: Error finding container d245aae6c388378ec8957ea6b9b6b2f22ca2147cbb742eaf1a532cca43921f3a: Status 404 returned error can't find the container with id d245aae6c388378ec8957ea6b9b6b2f22ca2147cbb742eaf1a532cca43921f3a
Apr 17 07:51:37.766306 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.766290 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 07:51:37.800208 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.800165 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal"
Apr 17 07:51:37.805587 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.805546 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal" event={"ID":"57a323f22f4a50ec542cb175406e5b82","Type":"ContainerStarted","Data":"d245aae6c388378ec8957ea6b9b6b2f22ca2147cbb742eaf1a532cca43921f3a"}
Apr 17 07:51:37.806507 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.806474 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" event={"ID":"2807b2563fb554c003c51001f381c040","Type":"ContainerStarted","Data":"925c2db664d22f7993d4b9d79ac1059a090034ab5a323b09e3e977ef0adc8013"}
Apr 17 07:51:37.807845 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.807827 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:51:37.814239 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.814225 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 07:51:37.815008 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.814996 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal"
Apr 17 07:51:37.822002 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.821989 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 07:51:37.888878 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:37.888853 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:51:38.473332 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.473291 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:51:38.678733 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.678684 2566 apiserver.go:52] "Watching apiserver"
Apr 17 07:51:38.686493 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.686467 2566 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 07:51:38.687776 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.687751 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-42f72","openshift-cluster-node-tuning-operator/tuned-jpmsl","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal","openshift-multus/multus-additional-cni-plugins-b24kh","openshift-multus/multus-fr7nk","openshift-network-operator/iptables-alerter-jbj5m","openshift-ovn-kubernetes/ovnkube-node-f6vls","kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj","openshift-dns/node-resolver-qndp2","openshift-image-registry/node-ca-s2dk9","openshift-multus/network-metrics-daemon-jt4rj","openshift-network-diagnostics/network-check-target-vbl24"]
Apr 17 07:51:38.690269 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.690243 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qndp2"
Apr 17 07:51:38.690385 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.690350 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b24kh"
Apr 17 07:51:38.691592 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.691566 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.693087 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.692986 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 07:51:38.693087 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.693022 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 07:51:38.693087 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.693031 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 07:51:38.693536 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.693513 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-hsx9w\""
Apr 17 07:51:38.693641 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.693581 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 07:51:38.693753 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.693738 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 07:51:38.694092 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.692989 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 07:51:38.696633 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.694560 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 07:51:38.696633 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.694997 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jbj5m"
Apr 17 07:51:38.696633 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.695170 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 07:51:38.696633 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.696221 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-pqf5d\""
Apr 17 07:51:38.696883 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.696826 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:51:38.697272 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:38.697016 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jt4rj" podUID="4fbea707-d5c2-4c45-82e5-089d272aa922"
Apr 17 07:51:38.697557 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.697451 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 07:51:38.697557 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.697544 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-cqlwn\""
Apr 17 07:51:38.697934 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.697918 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xmqmw\""
Apr 17 07:51:38.698236 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.698213 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 07:51:38.698721 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.698229 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:51:38.698721 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.698597 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.699605 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.699589 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.700958 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.700773 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj"
Apr 17 07:51:38.701771 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.701527 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 07:51:38.701851 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.701542 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-sdr4g\""
Apr 17 07:51:38.701851 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.701559 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:51:38.702147 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.702128 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 07:51:38.702238 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.702190 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-42f72"
Apr 17 07:51:38.702238 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.702209 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 07:51:38.702341 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.702267 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 07:51:38.702341 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.702317 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 07:51:38.702572 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.702554 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 07:51:38.702691 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.702671 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-c9jv7\""
Apr 17 07:51:38.702882 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.702855 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 07:51:38.703074 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.703053 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 07:51:38.703171 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.703126 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qxnxl\""
Apr 17 07:51:38.703229 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.703173 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 07:51:38.703488 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.703457 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 07:51:38.704240 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.704223 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 07:51:38.704655 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.704635 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 07:51:38.704770 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.704743 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-mxk9b\""
Apr 17 07:51:38.705020 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.705001 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-s2dk9"
Apr 17 07:51:38.705115 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.705008 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:51:38.705171 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:38.705149 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vbl24" podUID="648c562e-66a8-4487-995e-2e06a13a92a5"
Apr 17 07:51:38.706888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.706869 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 07:51:38.707268 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.707252 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-tmwwc\""
Apr 17 07:51:38.707360 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.707286 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 07:51:38.707421 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.707283 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 07:51:38.712225 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712201 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/595aad98-ad8c-469e-ae13-798099e8e67b-system-cni-dir\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh"
Apr 17 07:51:38.712316 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712241 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/595aad98-ad8c-469e-ae13-798099e8e67b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh"
Apr 17 07:51:38.712316 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712271 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-kubelet\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.712316 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712295 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/69d806e6-4ecb-42c4-b3a9-57107400f8d5-ovnkube-config\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.712459 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712321 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7ccb8775-fa41-46b2-9f1e-0aa964d80116-registration-dir\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj"
Apr 17 07:51:38.712459 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712347 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7ccb8775-fa41-46b2-9f1e-0aa964d80116-sys-fs\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj"
Apr 17 07:51:38.712459 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712372 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7886ccc8-69b5-457b-8e19-3ac5c6d1153c-konnectivity-ca\") pod \"konnectivity-agent-42f72\" (UID: \"7886ccc8-69b5-457b-8e19-3ac5c6d1153c\") " pod="kube-system/konnectivity-agent-42f72"
Apr 17 07:51:38.712459 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712396 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-multus-conf-dir\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.712459 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712435 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-host-var-lib-cni-multus\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.712688 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712460 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-host-run-multus-certs\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.712688 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712501 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ccrz\" (UniqueName: \"kubernetes.io/projected/c9597c98-928c-4e9e-9f6a-20399532f672-kube-api-access-9ccrz\") pod \"node-resolver-qndp2\" (UID: \"c9597c98-928c-4e9e-9f6a-20399532f672\") " pod="openshift-dns/node-resolver-qndp2"
Apr 17 07:51:38.712688 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712526 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/595aad98-ad8c-469e-ae13-798099e8e67b-os-release\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh"
Apr 17 07:51:38.712688 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712590 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch4lk\" (UniqueName: \"kubernetes.io/projected/595aad98-ad8c-469e-ae13-798099e8e67b-kube-api-access-ch4lk\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh"
Apr 17 07:51:38.712688 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712629 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-log-socket\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.712688 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712657 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-system-cni-dir\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.712986 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712686 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-multus-socket-dir-parent\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.712986 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712727 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/af6392d4-cfa0-4d92-a9a8-14dc562724bf-host-slash\") pod \"iptables-alerter-jbj5m\" (UID: \"af6392d4-cfa0-4d92-a9a8-14dc562724bf\") " pod="openshift-network-operator/iptables-alerter-jbj5m"
Apr 17 07:51:38.712986 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712753 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-etc-sysctl-conf\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.712986 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712777 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/595aad98-ad8c-469e-ae13-798099e8e67b-cni-binary-copy\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh"
Apr 17 07:51:38.712986 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712801 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-run-netns\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.712986 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712823 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7886ccc8-69b5-457b-8e19-3ac5c6d1153c-agent-certs\") pod \"konnectivity-agent-42f72\" (UID: \"7886ccc8-69b5-457b-8e19-3ac5c6d1153c\") " pod="kube-system/konnectivity-agent-42f72"
Apr 17 07:51:38.712986 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712848 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-etc-kubernetes\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.712986 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712872 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-var-lib-kubelet\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.712986 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712897 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c9597c98-928c-4e9e-9f6a-20399532f672-hosts-file\") pod \"node-resolver-qndp2\" (UID: \"c9597c98-928c-4e9e-9f6a-20399532f672\") " pod="openshift-dns/node-resolver-qndp2"
Apr 17 07:51:38.712986 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712925 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ccb8775-fa41-46b2-9f1e-0aa964d80116-kubelet-dir\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj"
Apr 17 07:51:38.712986 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712967 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-os-release\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.713368 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.712997 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-host-var-lib-cni-bin\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.713368 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713023 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/af6392d4-cfa0-4d92-a9a8-14dc562724bf-iptables-alerter-script\") pod \"iptables-alerter-jbj5m\" (UID: \"af6392d4-cfa0-4d92-a9a8-14dc562724bf\") " pod="openshift-network-operator/iptables-alerter-jbj5m"
Apr 17 07:51:38.713368 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713070 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-etc-modprobe-d\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.713368 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713111 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-host\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.713368 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713141 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-hostroot\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.713368 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713167 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqx95\" (UniqueName: \"kubernetes.io/projected/4fbea707-d5c2-4c45-82e5-089d272aa922-kube-api-access-hqx95\") pod \"network-metrics-daemon-jt4rj\" (UID: \"4fbea707-d5c2-4c45-82e5-089d272aa922\") " pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:51:38.713368 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713183 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-lib-modules\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.713368 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713217 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-etc-openvswitch\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.713368 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713247 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7ccb8775-fa41-46b2-9f1e-0aa964d80116-socket-dir\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj"
Apr 17 07:51:38.713368 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713296 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-etc-systemd\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.713368 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713331 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/595aad98-ad8c-469e-ae13-798099e8e67b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh"
Apr 17 07:51:38.713368 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713359 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/49cea292-6e01-4497-a4a8-a9cdff76850e-multus-daemon-config\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.713828 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713382 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dc99\" (UniqueName: \"kubernetes.io/projected/49cea292-6e01-4497-a4a8-a9cdff76850e-kube-api-access-4dc99\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.713828 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713404 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-etc-sysctl-d\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.713828 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713437 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-run\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.713828 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713461 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/69d806e6-4ecb-42c4-b3a9-57107400f8d5-ovnkube-script-lib\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.713828 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713483 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f847aabc-a956-47e9-91e9-a380ac142ed4-serviceca\") pod \"node-ca-s2dk9\" (UID: \"f847aabc-a956-47e9-91e9-a380ac142ed4\") " pod="openshift-image-registry/node-ca-s2dk9"
Apr 17 07:51:38.713828 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713505 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-multus-cni-dir\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.713828 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713530 2566 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/49cea292-6e01-4497-a4a8-a9cdff76850e-cni-binary-copy\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk" Apr 17 07:51:38.713828 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713558 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-host-run-netns\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk" Apr 17 07:51:38.713828 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713579 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-etc-kubernetes\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl" Apr 17 07:51:38.713828 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713604 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-var-lib-openvswitch\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.713828 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713627 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/69d806e6-4ecb-42c4-b3a9-57107400f8d5-env-overrides\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.713828 
ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713653 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/69d806e6-4ecb-42c4-b3a9-57107400f8d5-ovn-node-metrics-cert\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.713828 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713676 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk6m2\" (UniqueName: \"kubernetes.io/projected/f847aabc-a956-47e9-91e9-a380ac142ed4-kube-api-access-zk6m2\") pod \"node-ca-s2dk9\" (UID: \"f847aabc-a956-47e9-91e9-a380ac142ed4\") " pod="openshift-image-registry/node-ca-s2dk9" Apr 17 07:51:38.713828 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713729 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-etc-sysconfig\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl" Apr 17 07:51:38.713828 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713752 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-systemd-units\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.713828 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713770 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/595aad98-ad8c-469e-ae13-798099e8e67b-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh" Apr 17 07:51:38.713828 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713791 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-host-run-k8s-cni-cncf-io\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk" Apr 17 07:51:38.714408 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713815 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs\") pod \"network-metrics-daemon-jt4rj\" (UID: \"4fbea707-d5c2-4c45-82e5-089d272aa922\") " pod="openshift-multus/network-metrics-daemon-jt4rj" Apr 17 07:51:38.714408 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713862 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/595aad98-ad8c-469e-ae13-798099e8e67b-cnibin\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh" Apr 17 07:51:38.714408 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713906 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-slash\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.714408 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713933 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.714408 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713958 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzc5p\" (UniqueName: \"kubernetes.io/projected/69d806e6-4ecb-42c4-b3a9-57107400f8d5-kube-api-access-gzc5p\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.714408 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.713994 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7ccb8775-fa41-46b2-9f1e-0aa964d80116-etc-selinux\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj" Apr 17 07:51:38.714408 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.714025 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sp2q\" (UniqueName: \"kubernetes.io/projected/7ccb8775-fa41-46b2-9f1e-0aa964d80116-kube-api-access-5sp2q\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj" Apr 17 07:51:38.714408 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.714068 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-node-log\") pod 
\"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.714408 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.714087 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-cni-netd\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.714408 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.714101 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7ccb8775-fa41-46b2-9f1e-0aa964d80116-device-dir\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj" Apr 17 07:51:38.714408 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.714115 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-cnibin\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk" Apr 17 07:51:38.714408 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.714128 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-host-var-lib-kubelet\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk" Apr 17 07:51:38.714408 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.714143 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5nls\" 
(UniqueName: \"kubernetes.io/projected/af6392d4-cfa0-4d92-a9a8-14dc562724bf-kube-api-access-t5nls\") pod \"iptables-alerter-jbj5m\" (UID: \"af6392d4-cfa0-4d92-a9a8-14dc562724bf\") " pod="openshift-network-operator/iptables-alerter-jbj5m" Apr 17 07:51:38.714408 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.714157 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-run-openvswitch\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.714408 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.714171 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f847aabc-a956-47e9-91e9-a380ac142ed4-host\") pod \"node-ca-s2dk9\" (UID: \"f847aabc-a956-47e9-91e9-a380ac142ed4\") " pod="openshift-image-registry/node-ca-s2dk9" Apr 17 07:51:38.714408 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.714193 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-sys\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl" Apr 17 07:51:38.714949 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.714216 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-cni-bin\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.714949 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.714255 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-run-systemd\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.714949 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.714283 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-run-ovn\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.714949 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.714305 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/211b08a9-435f-4d0f-8132-af6810956cb8-tmp\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl" Apr 17 07:51:38.714949 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.714328 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-run-ovn-kubernetes\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.714949 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.714355 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/211b08a9-435f-4d0f-8132-af6810956cb8-etc-tuned\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl" Apr 17 07:51:38.714949 
ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.714392 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7hh8\" (UniqueName: \"kubernetes.io/projected/211b08a9-435f-4d0f-8132-af6810956cb8-kube-api-access-b7hh8\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl" Apr 17 07:51:38.714949 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.714427 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c9597c98-928c-4e9e-9f6a-20399532f672-tmp-dir\") pod \"node-resolver-qndp2\" (UID: \"c9597c98-928c-4e9e-9f6a-20399532f672\") " pod="openshift-dns/node-resolver-qndp2" Apr 17 07:51:38.747610 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.747286 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:46:37 +0000 UTC" deadline="2027-09-26 22:33:57.593946266 +0000 UTC" Apr 17 07:51:38.747610 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.747320 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12662h42m18.846629411s" Apr 17 07:51:38.804034 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.804001 2566 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 07:51:38.815337 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815301 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-cni-bin\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.815337 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815341 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-run-systemd\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.815564 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815367 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-run-ovn\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.815564 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815394 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/211b08a9-435f-4d0f-8132-af6810956cb8-tmp\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl" Apr 17 07:51:38.815564 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815417 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-run-ovn-kubernetes\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.815564 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815434 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/211b08a9-435f-4d0f-8132-af6810956cb8-etc-tuned\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl" Apr 17 07:51:38.815564 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815457 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7hh8\" (UniqueName: \"kubernetes.io/projected/211b08a9-435f-4d0f-8132-af6810956cb8-kube-api-access-b7hh8\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl" Apr 17 07:51:38.815564 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815482 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c9597c98-928c-4e9e-9f6a-20399532f672-tmp-dir\") pod \"node-resolver-qndp2\" (UID: \"c9597c98-928c-4e9e-9f6a-20399532f672\") " pod="openshift-dns/node-resolver-qndp2" Apr 17 07:51:38.815564 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815508 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/595aad98-ad8c-469e-ae13-798099e8e67b-system-cni-dir\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh" Apr 17 07:51:38.815954 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815587 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/595aad98-ad8c-469e-ae13-798099e8e67b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh" Apr 17 07:51:38.815954 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815594 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-run-systemd\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 
07:51:38.815954 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815618 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-kubelet\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.815954 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815638 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/69d806e6-4ecb-42c4-b3a9-57107400f8d5-ovnkube-config\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.815954 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815663 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7ccb8775-fa41-46b2-9f1e-0aa964d80116-registration-dir\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj" Apr 17 07:51:38.815954 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815689 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7ccb8775-fa41-46b2-9f1e-0aa964d80116-sys-fs\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj" Apr 17 07:51:38.815954 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815730 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7886ccc8-69b5-457b-8e19-3ac5c6d1153c-konnectivity-ca\") pod \"konnectivity-agent-42f72\" (UID: 
\"7886ccc8-69b5-457b-8e19-3ac5c6d1153c\") " pod="kube-system/konnectivity-agent-42f72" Apr 17 07:51:38.815954 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815752 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-multus-conf-dir\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk" Apr 17 07:51:38.815954 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815753 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/595aad98-ad8c-469e-ae13-798099e8e67b-system-cni-dir\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh" Apr 17 07:51:38.815954 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815764 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-run-ovn-kubernetes\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.815954 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815776 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-host-var-lib-cni-multus\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk" Apr 17 07:51:38.815954 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815815 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-run-ovn\") pod \"ovnkube-node-f6vls\" (UID: 
\"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.815954 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815818 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-host-run-multus-certs\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk" Apr 17 07:51:38.815954 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815826 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-host-var-lib-cni-multus\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk" Apr 17 07:51:38.815954 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815851 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ccrz\" (UniqueName: \"kubernetes.io/projected/c9597c98-928c-4e9e-9f6a-20399532f672-kube-api-access-9ccrz\") pod \"node-resolver-qndp2\" (UID: \"c9597c98-928c-4e9e-9f6a-20399532f672\") " pod="openshift-dns/node-resolver-qndp2" Apr 17 07:51:38.815954 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815856 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c9597c98-928c-4e9e-9f6a-20399532f672-tmp-dir\") pod \"node-resolver-qndp2\" (UID: \"c9597c98-928c-4e9e-9f6a-20399532f672\") " pod="openshift-dns/node-resolver-qndp2" Apr 17 07:51:38.815954 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815895 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-multus-conf-dir\") pod \"multus-fr7nk\" (UID: 
\"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.815954 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815920 2566 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 07:51:38.816755 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815932 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-host-run-multus-certs\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.816755 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815928 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-kubelet\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.816755 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815954 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7ccb8775-fa41-46b2-9f1e-0aa964d80116-registration-dir\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj"
Apr 17 07:51:38.816755 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815956 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/595aad98-ad8c-469e-ae13-798099e8e67b-os-release\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh"
Apr 17 07:51:38.816755 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.815984 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7ccb8775-fa41-46b2-9f1e-0aa964d80116-sys-fs\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj"
Apr 17 07:51:38.816755 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816004 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ch4lk\" (UniqueName: \"kubernetes.io/projected/595aad98-ad8c-469e-ae13-798099e8e67b-kube-api-access-ch4lk\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh"
Apr 17 07:51:38.816755 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816016 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/595aad98-ad8c-469e-ae13-798099e8e67b-os-release\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh"
Apr 17 07:51:38.816755 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816031 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-log-socket\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.816755 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816054 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-cni-bin\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.816755 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816055 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-system-cni-dir\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.816755 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816097 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-multus-socket-dir-parent\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.816755 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816121 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/af6392d4-cfa0-4d92-a9a8-14dc562724bf-host-slash\") pod \"iptables-alerter-jbj5m\" (UID: \"af6392d4-cfa0-4d92-a9a8-14dc562724bf\") " pod="openshift-network-operator/iptables-alerter-jbj5m"
Apr 17 07:51:38.816755 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816155 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-etc-sysctl-conf\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.816755 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816181 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/595aad98-ad8c-469e-ae13-798099e8e67b-cni-binary-copy\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh"
Apr 17 07:51:38.816755 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816224 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-run-netns\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.816755 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816229 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-multus-socket-dir-parent\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.816755 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816248 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7886ccc8-69b5-457b-8e19-3ac5c6d1153c-agent-certs\") pod \"konnectivity-agent-42f72\" (UID: \"7886ccc8-69b5-457b-8e19-3ac5c6d1153c\") " pod="kube-system/konnectivity-agent-42f72"
Apr 17 07:51:38.817541 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816257 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/595aad98-ad8c-469e-ae13-798099e8e67b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh"
Apr 17 07:51:38.817541 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816272 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-etc-kubernetes\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.817541 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816297 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-var-lib-kubelet\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.817541 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816335 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c9597c98-928c-4e9e-9f6a-20399532f672-hosts-file\") pod \"node-resolver-qndp2\" (UID: \"c9597c98-928c-4e9e-9f6a-20399532f672\") " pod="openshift-dns/node-resolver-qndp2"
Apr 17 07:51:38.817541 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816341 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-var-lib-kubelet\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.817541 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816361 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-etc-sysctl-conf\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.817541 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816367 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ccb8775-fa41-46b2-9f1e-0aa964d80116-kubelet-dir\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj"
Apr 17 07:51:38.817541 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816393 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-os-release\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.817541 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816403 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/af6392d4-cfa0-4d92-a9a8-14dc562724bf-host-slash\") pod \"iptables-alerter-jbj5m\" (UID: \"af6392d4-cfa0-4d92-a9a8-14dc562724bf\") " pod="openshift-network-operator/iptables-alerter-jbj5m"
Apr 17 07:51:38.817541 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816419 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-host-var-lib-cni-bin\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.817541 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816424 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/69d806e6-4ecb-42c4-b3a9-57107400f8d5-ovnkube-config\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.817541 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816445 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/af6392d4-cfa0-4d92-a9a8-14dc562724bf-iptables-alerter-script\") pod \"iptables-alerter-jbj5m\" (UID: \"af6392d4-cfa0-4d92-a9a8-14dc562724bf\") " pod="openshift-network-operator/iptables-alerter-jbj5m"
Apr 17 07:51:38.817541 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816480 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-log-socket\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.817541 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816481 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-etc-modprobe-d\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.817541 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816515 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-host\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.817541 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816540 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-hostroot\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.817541 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816570 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqx95\" (UniqueName: \"kubernetes.io/projected/4fbea707-d5c2-4c45-82e5-089d272aa922-kube-api-access-hqx95\") pod \"network-metrics-daemon-jt4rj\" (UID: \"4fbea707-d5c2-4c45-82e5-089d272aa922\") " pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:51:38.818415 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816582 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-etc-modprobe-d\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.818415 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816597 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-lib-modules\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.818415 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816603 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7886ccc8-69b5-457b-8e19-3ac5c6d1153c-konnectivity-ca\") pod \"konnectivity-agent-42f72\" (UID: \"7886ccc8-69b5-457b-8e19-3ac5c6d1153c\") " pod="kube-system/konnectivity-agent-42f72"
Apr 17 07:51:38.818415 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816621 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-etc-openvswitch\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.818415 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816637 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c9597c98-928c-4e9e-9f6a-20399532f672-hosts-file\") pod \"node-resolver-qndp2\" (UID: \"c9597c98-928c-4e9e-9f6a-20399532f672\") " pod="openshift-dns/node-resolver-qndp2"
Apr 17 07:51:38.818415 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816648 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7ccb8775-fa41-46b2-9f1e-0aa964d80116-socket-dir\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj"
Apr 17 07:51:38.818415 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816666 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-system-cni-dir\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.818415 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816673 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-etc-systemd\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.818415 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816685 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ccb8775-fa41-46b2-9f1e-0aa964d80116-kubelet-dir\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj"
Apr 17 07:51:38.818415 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816717 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/595aad98-ad8c-469e-ae13-798099e8e67b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh"
Apr 17 07:51:38.818415 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816744 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/49cea292-6e01-4497-a4a8-a9cdff76850e-multus-daemon-config\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.818415 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816753 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-host\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.818415 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816772 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dc99\" (UniqueName: \"kubernetes.io/projected/49cea292-6e01-4497-a4a8-a9cdff76850e-kube-api-access-4dc99\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.818415 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816777 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-hostroot\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.818415 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816785 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/595aad98-ad8c-469e-ae13-798099e8e67b-cni-binary-copy\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh"
Apr 17 07:51:38.818415 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816795 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-etc-sysctl-d\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.818415 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816823 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-run\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.818415 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816834 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-run-netns\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.819238 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816851 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/69d806e6-4ecb-42c4-b3a9-57107400f8d5-ovnkube-script-lib\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.819238 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816880 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f847aabc-a956-47e9-91e9-a380ac142ed4-serviceca\") pod \"node-ca-s2dk9\" (UID: \"f847aabc-a956-47e9-91e9-a380ac142ed4\") " pod="openshift-image-registry/node-ca-s2dk9"
Apr 17 07:51:38.819238 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816885 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-etc-sysctl-d\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.819238 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816914 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-lib-modules\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.819238 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816918 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-multus-cni-dir\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.819238 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816948 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/49cea292-6e01-4497-a4a8-a9cdff76850e-cni-binary-copy\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.819238 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816983 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-host-run-netns\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.819238 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817010 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-etc-kubernetes\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.819238 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817042 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhcnb\" (UniqueName: \"kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb\") pod \"network-check-target-vbl24\" (UID: \"648c562e-66a8-4487-995e-2e06a13a92a5\") " pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:51:38.819238 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817072 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-var-lib-openvswitch\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.819238 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817093 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7ccb8775-fa41-46b2-9f1e-0aa964d80116-socket-dir\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj"
Apr 17 07:51:38.819238 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817097 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/69d806e6-4ecb-42c4-b3a9-57107400f8d5-env-overrides\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.819238 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816744 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-host-var-lib-cni-bin\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.819238 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817149 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/69d806e6-4ecb-42c4-b3a9-57107400f8d5-ovn-node-metrics-cert\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.819238 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817182 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zk6m2\" (UniqueName: \"kubernetes.io/projected/f847aabc-a956-47e9-91e9-a380ac142ed4-kube-api-access-zk6m2\") pod \"node-ca-s2dk9\" (UID: \"f847aabc-a956-47e9-91e9-a380ac142ed4\") " pod="openshift-image-registry/node-ca-s2dk9"
Apr 17 07:51:38.819238 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817206 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-etc-sysconfig\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.819238 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817230 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-systemd-units\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.820115 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816983 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-etc-kubernetes\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.820115 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817252 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f847aabc-a956-47e9-91e9-a380ac142ed4-serviceca\") pod \"node-ca-s2dk9\" (UID: \"f847aabc-a956-47e9-91e9-a380ac142ed4\") " pod="openshift-image-registry/node-ca-s2dk9"
Apr 17 07:51:38.820115 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817316 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-os-release\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.820115 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817319 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-multus-cni-dir\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.820115 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817256 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/595aad98-ad8c-469e-ae13-798099e8e67b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh"
Apr 17 07:51:38.820115 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817366 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-etc-kubernetes\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.820115 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.816985 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-run\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.820115 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817398 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-host-run-k8s-cni-cncf-io\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.820115 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817439 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-host-run-k8s-cni-cncf-io\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.820115 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817438 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs\") pod \"network-metrics-daemon-jt4rj\" (UID: \"4fbea707-d5c2-4c45-82e5-089d272aa922\") " pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:51:38.820115 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817476 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/595aad98-ad8c-469e-ae13-798099e8e67b-cnibin\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh"
Apr 17 07:51:38.820115 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:38.817511 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:38.820115 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817531 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-slash\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.820115 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817559 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.820115 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817597 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.820115 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:38.817614 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs podName:4fbea707-d5c2-4c45-82e5-089d272aa922 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:39.317567609 +0000 UTC m=+3.105346209 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs") pod "network-metrics-daemon-jt4rj" (UID: "4fbea707-d5c2-4c45-82e5-089d272aa922") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:38.820115 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817643 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-etc-systemd\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl"
Apr 17 07:51:38.820962 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817752 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/595aad98-ad8c-469e-ae13-798099e8e67b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh"
Apr 17 07:51:38.820962 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817760 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-var-lib-openvswitch\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.820962 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817784 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/595aad98-ad8c-469e-ae13-798099e8e67b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh"
Apr 17 07:51:38.820962 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817828 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/595aad98-ad8c-469e-ae13-798099e8e67b-cnibin\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh"
Apr 17 07:51:38.820962 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817859 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/49cea292-6e01-4497-a4a8-a9cdff76850e-cni-binary-copy\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.820962 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817866 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-slash\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.820962 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817913 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-host-run-netns\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk"
Apr 17 07:51:38.820962 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.817986 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzc5p\" (UniqueName: \"kubernetes.io/projected/69d806e6-4ecb-42c4-b3a9-57107400f8d5-kube-api-access-gzc5p\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.820962 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818019 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7ccb8775-fa41-46b2-9f1e-0aa964d80116-etc-selinux\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj"
Apr 17 07:51:38.820962 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818047 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5sp2q\" (UniqueName: \"kubernetes.io/projected/7ccb8775-fa41-46b2-9f1e-0aa964d80116-kube-api-access-5sp2q\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj"
Apr 17 07:51:38.820962 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818071 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-node-log\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.820962 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818096 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-cni-netd\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.820962 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818124 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7ccb8775-fa41-46b2-9f1e-0aa964d80116-device-dir\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj"
Apr 17 07:51:38.820962 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818172 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-etc-openvswitch\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.820962 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818193 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/69d806e6-4ecb-42c4-b3a9-57107400f8d5-env-overrides\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:51:38.820962 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818224 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7ccb8775-fa41-46b2-9f1e-0aa964d80116-device-dir\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj"
Apr 17 07:51:38.820962 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818253
2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-cnibin\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk" Apr 17 07:51:38.821792 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818284 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-host-var-lib-kubelet\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk" Apr 17 07:51:38.821792 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818311 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5nls\" (UniqueName: \"kubernetes.io/projected/af6392d4-cfa0-4d92-a9a8-14dc562724bf-kube-api-access-t5nls\") pod \"iptables-alerter-jbj5m\" (UID: \"af6392d4-cfa0-4d92-a9a8-14dc562724bf\") " pod="openshift-network-operator/iptables-alerter-jbj5m" Apr 17 07:51:38.821792 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818338 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-run-openvswitch\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.821792 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818363 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f847aabc-a956-47e9-91e9-a380ac142ed4-host\") pod \"node-ca-s2dk9\" (UID: \"f847aabc-a956-47e9-91e9-a380ac142ed4\") " pod="openshift-image-registry/node-ca-s2dk9" Apr 17 07:51:38.821792 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818388 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-sys\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl" Apr 17 07:51:38.821792 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818425 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7ccb8775-fa41-46b2-9f1e-0aa964d80116-etc-selinux\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj" Apr 17 07:51:38.821792 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818444 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/69d806e6-4ecb-42c4-b3a9-57107400f8d5-ovnkube-script-lib\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.821792 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818463 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-sys\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl" Apr 17 07:51:38.821792 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818482 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/af6392d4-cfa0-4d92-a9a8-14dc562724bf-iptables-alerter-script\") pod \"iptables-alerter-jbj5m\" (UID: \"af6392d4-cfa0-4d92-a9a8-14dc562724bf\") " pod="openshift-network-operator/iptables-alerter-jbj5m" Apr 17 07:51:38.821792 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818489 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-node-log\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.821792 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818498 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-cnibin\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk" Apr 17 07:51:38.821792 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818511 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-host-cni-netd\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.821792 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818532 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-run-openvswitch\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.821792 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818544 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/49cea292-6e01-4497-a4a8-a9cdff76850e-host-var-lib-kubelet\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk" Apr 17 07:51:38.821792 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818582 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/f847aabc-a956-47e9-91e9-a380ac142ed4-host\") pod \"node-ca-s2dk9\" (UID: \"f847aabc-a956-47e9-91e9-a380ac142ed4\") " pod="openshift-image-registry/node-ca-s2dk9" Apr 17 07:51:38.821792 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818627 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/211b08a9-435f-4d0f-8132-af6810956cb8-etc-sysconfig\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl" Apr 17 07:51:38.821792 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818662 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/69d806e6-4ecb-42c4-b3a9-57107400f8d5-systemd-units\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.821792 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.818939 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/211b08a9-435f-4d0f-8132-af6810956cb8-etc-tuned\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl" Apr 17 07:51:38.822667 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.819269 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/49cea292-6e01-4497-a4a8-a9cdff76850e-multus-daemon-config\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk" Apr 17 07:51:38.822667 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.819293 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/211b08a9-435f-4d0f-8132-af6810956cb8-tmp\") pod 
\"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl" Apr 17 07:51:38.822667 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.820757 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7886ccc8-69b5-457b-8e19-3ac5c6d1153c-agent-certs\") pod \"konnectivity-agent-42f72\" (UID: \"7886ccc8-69b5-457b-8e19-3ac5c6d1153c\") " pod="kube-system/konnectivity-agent-42f72" Apr 17 07:51:38.822667 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.822012 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/69d806e6-4ecb-42c4-b3a9-57107400f8d5-ovn-node-metrics-cert\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.824792 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.824766 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ccrz\" (UniqueName: \"kubernetes.io/projected/c9597c98-928c-4e9e-9f6a-20399532f672-kube-api-access-9ccrz\") pod \"node-resolver-qndp2\" (UID: \"c9597c98-928c-4e9e-9f6a-20399532f672\") " pod="openshift-dns/node-resolver-qndp2" Apr 17 07:51:38.825058 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.825034 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7hh8\" (UniqueName: \"kubernetes.io/projected/211b08a9-435f-4d0f-8132-af6810956cb8-kube-api-access-b7hh8\") pod \"tuned-jpmsl\" (UID: \"211b08a9-435f-4d0f-8132-af6810956cb8\") " pod="openshift-cluster-node-tuning-operator/tuned-jpmsl" Apr 17 07:51:38.826058 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.826035 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqx95\" (UniqueName: 
\"kubernetes.io/projected/4fbea707-d5c2-4c45-82e5-089d272aa922-kube-api-access-hqx95\") pod \"network-metrics-daemon-jt4rj\" (UID: \"4fbea707-d5c2-4c45-82e5-089d272aa922\") " pod="openshift-multus/network-metrics-daemon-jt4rj" Apr 17 07:51:38.826440 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.826418 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dc99\" (UniqueName: \"kubernetes.io/projected/49cea292-6e01-4497-a4a8-a9cdff76850e-kube-api-access-4dc99\") pod \"multus-fr7nk\" (UID: \"49cea292-6e01-4497-a4a8-a9cdff76850e\") " pod="openshift-multus/multus-fr7nk" Apr 17 07:51:38.829134 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.829113 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk6m2\" (UniqueName: \"kubernetes.io/projected/f847aabc-a956-47e9-91e9-a380ac142ed4-kube-api-access-zk6m2\") pod \"node-ca-s2dk9\" (UID: \"f847aabc-a956-47e9-91e9-a380ac142ed4\") " pod="openshift-image-registry/node-ca-s2dk9" Apr 17 07:51:38.829273 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.829251 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sp2q\" (UniqueName: \"kubernetes.io/projected/7ccb8775-fa41-46b2-9f1e-0aa964d80116-kube-api-access-5sp2q\") pod \"aws-ebs-csi-driver-node-99ffj\" (UID: \"7ccb8775-fa41-46b2-9f1e-0aa964d80116\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj" Apr 17 07:51:38.830035 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.830011 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5nls\" (UniqueName: \"kubernetes.io/projected/af6392d4-cfa0-4d92-a9a8-14dc562724bf-kube-api-access-t5nls\") pod \"iptables-alerter-jbj5m\" (UID: \"af6392d4-cfa0-4d92-a9a8-14dc562724bf\") " pod="openshift-network-operator/iptables-alerter-jbj5m" Apr 17 07:51:38.830522 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.830503 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gzc5p\" (UniqueName: \"kubernetes.io/projected/69d806e6-4ecb-42c4-b3a9-57107400f8d5-kube-api-access-gzc5p\") pod \"ovnkube-node-f6vls\" (UID: \"69d806e6-4ecb-42c4-b3a9-57107400f8d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:38.830662 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.830641 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch4lk\" (UniqueName: \"kubernetes.io/projected/595aad98-ad8c-469e-ae13-798099e8e67b-kube-api-access-ch4lk\") pod \"multus-additional-cni-plugins-b24kh\" (UID: \"595aad98-ad8c-469e-ae13-798099e8e67b\") " pod="openshift-multus/multus-additional-cni-plugins-b24kh" Apr 17 07:51:38.918795 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:38.918740 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhcnb\" (UniqueName: \"kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb\") pod \"network-check-target-vbl24\" (UID: \"648c562e-66a8-4487-995e-2e06a13a92a5\") " pod="openshift-network-diagnostics/network-check-target-vbl24" Apr 17 07:51:38.924923 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:38.924898 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:38.924923 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:38.924925 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:38.925127 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:38.924937 2566 projected.go:194] Error preparing data for projected volume kube-api-access-vhcnb for pod openshift-network-diagnostics/network-check-target-vbl24: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:38.925127 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:38.925002 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb podName:648c562e-66a8-4487-995e-2e06a13a92a5 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:39.424987865 +0000 UTC m=+3.212766445 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-vhcnb" (UniqueName: "kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb") pod "network-check-target-vbl24" (UID: "648c562e-66a8-4487-995e-2e06a13a92a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:39.004581 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.004493 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qndp2" Apr 17 07:51:39.012346 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.012319 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b24kh" Apr 17 07:51:39.021156 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.021130 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fr7nk" Apr 17 07:51:39.026124 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.026095 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jbj5m" Apr 17 07:51:39.032745 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.032722 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jpmsl" Apr 17 07:51:39.040382 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.040358 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:51:39.053056 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.053032 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj" Apr 17 07:51:39.057608 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.057588 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-42f72" Apr 17 07:51:39.062141 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.062122 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-s2dk9" Apr 17 07:51:39.321117 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.321082 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs\") pod \"network-metrics-daemon-jt4rj\" (UID: \"4fbea707-d5c2-4c45-82e5-089d272aa922\") " pod="openshift-multus/network-metrics-daemon-jt4rj" Apr 17 07:51:39.321286 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:39.321224 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:39.321355 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:39.321293 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs podName:4fbea707-d5c2-4c45-82e5-089d272aa922 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:40.321275051 +0000 UTC m=+4.109053643 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs") pod "network-metrics-daemon-jt4rj" (UID: "4fbea707-d5c2-4c45-82e5-089d272aa922") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:39.360278 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:39.360246 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf6392d4_cfa0_4d92_a9a8_14dc562724bf.slice/crio-77db480d2afeb3959fdcf240c571acb4bfa14aa423c85770f4c9bb689305c34d WatchSource:0}: Error finding container 77db480d2afeb3959fdcf240c571acb4bfa14aa423c85770f4c9bb689305c34d: Status 404 returned error can't find the container with id 77db480d2afeb3959fdcf240c571acb4bfa14aa423c85770f4c9bb689305c34d Apr 17 07:51:39.361525 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:39.361492 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod211b08a9_435f_4d0f_8132_af6810956cb8.slice/crio-bab5999421b5bd19a5262b2e4b6ad53ceb4e4e47ae007b3ce56f5db496dbc6fc WatchSource:0}: Error finding container bab5999421b5bd19a5262b2e4b6ad53ceb4e4e47ae007b3ce56f5db496dbc6fc: Status 404 returned error can't find the container with id bab5999421b5bd19a5262b2e4b6ad53ceb4e4e47ae007b3ce56f5db496dbc6fc Apr 17 07:51:39.363924 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:39.363854 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod595aad98_ad8c_469e_ae13_798099e8e67b.slice/crio-32e7f8c3fba811284626c5f7bdc1c7cbf5779e895e51367e7907d5b1dc3c63c8 WatchSource:0}: Error finding container 32e7f8c3fba811284626c5f7bdc1c7cbf5779e895e51367e7907d5b1dc3c63c8: Status 404 returned error can't find the container with id 32e7f8c3fba811284626c5f7bdc1c7cbf5779e895e51367e7907d5b1dc3c63c8 Apr 17 07:51:39.366649 
ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:39.366627 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69d806e6_4ecb_42c4_b3a9_57107400f8d5.slice/crio-9f935b4929aa393021001d64075d6e7d37b1d401e9cf3ddcf89de12f18bc1372 WatchSource:0}: Error finding container 9f935b4929aa393021001d64075d6e7d37b1d401e9cf3ddcf89de12f18bc1372: Status 404 returned error can't find the container with id 9f935b4929aa393021001d64075d6e7d37b1d401e9cf3ddcf89de12f18bc1372 Apr 17 07:51:39.367385 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:39.367360 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49cea292_6e01_4497_a4a8_a9cdff76850e.slice/crio-ff9224b655d69aed51a6ec44f217fe397f35f0a42e68306c67d5aee34a34ceb0 WatchSource:0}: Error finding container ff9224b655d69aed51a6ec44f217fe397f35f0a42e68306c67d5aee34a34ceb0: Status 404 returned error can't find the container with id ff9224b655d69aed51a6ec44f217fe397f35f0a42e68306c67d5aee34a34ceb0 Apr 17 07:51:39.368393 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:39.368367 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7886ccc8_69b5_457b_8e19_3ac5c6d1153c.slice/crio-d3675944b668844311ae327b274659bc94b1569e0444624527ea5a13db58eeb7 WatchSource:0}: Error finding container d3675944b668844311ae327b274659bc94b1569e0444624527ea5a13db58eeb7: Status 404 returned error can't find the container with id d3675944b668844311ae327b274659bc94b1569e0444624527ea5a13db58eeb7 Apr 17 07:51:39.369785 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:39.369481 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf847aabc_a956_47e9_91e9_a380ac142ed4.slice/crio-b8f6e83d4d1b33e4adcbe107e2735f800dac9b84c09b511b9d93f779729bf831 WatchSource:0}: 
Error finding container b8f6e83d4d1b33e4adcbe107e2735f800dac9b84c09b511b9d93f779729bf831: Status 404 returned error can't find the container with id b8f6e83d4d1b33e4adcbe107e2735f800dac9b84c09b511b9d93f779729bf831 Apr 17 07:51:39.370988 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:39.370788 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ccb8775_fa41_46b2_9f1e_0aa964d80116.slice/crio-6aad2cb165ebd61654f46cafb885819748a0f4c36f026b9c24236fbf21093b00 WatchSource:0}: Error finding container 6aad2cb165ebd61654f46cafb885819748a0f4c36f026b9c24236fbf21093b00: Status 404 returned error can't find the container with id 6aad2cb165ebd61654f46cafb885819748a0f4c36f026b9c24236fbf21093b00 Apr 17 07:51:39.371213 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:51:39.371144 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9597c98_928c_4e9e_9f6a_20399532f672.slice/crio-ccf88927262c71df6a7a3b542f8b13f17f066a2b2f04108a9e14f9a47ea9e6fd WatchSource:0}: Error finding container ccf88927262c71df6a7a3b542f8b13f17f066a2b2f04108a9e14f9a47ea9e6fd: Status 404 returned error can't find the container with id ccf88927262c71df6a7a3b542f8b13f17f066a2b2f04108a9e14f9a47ea9e6fd Apr 17 07:51:39.522433 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.522394 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhcnb\" (UniqueName: \"kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb\") pod \"network-check-target-vbl24\" (UID: \"648c562e-66a8-4487-995e-2e06a13a92a5\") " pod="openshift-network-diagnostics/network-check-target-vbl24" Apr 17 07:51:39.522577 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:39.522556 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:39.522630 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:39.522581 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:39.522630 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:39.522591 2566 projected.go:194] Error preparing data for projected volume kube-api-access-vhcnb for pod openshift-network-diagnostics/network-check-target-vbl24: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:39.522716 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:39.522645 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb podName:648c562e-66a8-4487-995e-2e06a13a92a5 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:40.522629896 +0000 UTC m=+4.310408478 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-vhcnb" (UniqueName: "kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb") pod "network-check-target-vbl24" (UID: "648c562e-66a8-4487-995e-2e06a13a92a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:39.748187 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.748027 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:46:37 +0000 UTC" deadline="2028-01-13 19:15:43.333646808 +0000 UTC"
Apr 17 07:51:39.748187 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.748060 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15275h24m3.585590077s"
Apr 17 07:51:39.803814 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.803312 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:51:39.803814 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:39.803447 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vbl24" podUID="648c562e-66a8-4487-995e-2e06a13a92a5"
Apr 17 07:51:39.821534 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.821496 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal" event={"ID":"57a323f22f4a50ec542cb175406e5b82","Type":"ContainerStarted","Data":"b66fb719f6dcfd37d83d259194da3ab62b18981d5509b043c79329d848e74e58"}
Apr 17 07:51:39.824306 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.824068 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qndp2" event={"ID":"c9597c98-928c-4e9e-9f6a-20399532f672","Type":"ContainerStarted","Data":"ccf88927262c71df6a7a3b542f8b13f17f066a2b2f04108a9e14f9a47ea9e6fd"}
Apr 17 07:51:39.827902 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.827843 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj" event={"ID":"7ccb8775-fa41-46b2-9f1e-0aa964d80116","Type":"ContainerStarted","Data":"6aad2cb165ebd61654f46cafb885819748a0f4c36f026b9c24236fbf21093b00"}
Apr 17 07:51:39.830281 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.830234 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-s2dk9" event={"ID":"f847aabc-a956-47e9-91e9-a380ac142ed4","Type":"ContainerStarted","Data":"b8f6e83d4d1b33e4adcbe107e2735f800dac9b84c09b511b9d93f779729bf831"}
Apr 17 07:51:39.832817 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.832527 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-42f72" event={"ID":"7886ccc8-69b5-457b-8e19-3ac5c6d1153c","Type":"ContainerStarted","Data":"d3675944b668844311ae327b274659bc94b1569e0444624527ea5a13db58eeb7"}
Apr 17 07:51:39.835072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.835021 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b24kh" event={"ID":"595aad98-ad8c-469e-ae13-798099e8e67b","Type":"ContainerStarted","Data":"32e7f8c3fba811284626c5f7bdc1c7cbf5779e895e51367e7907d5b1dc3c63c8"}
Apr 17 07:51:39.837615 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.837374 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jpmsl" event={"ID":"211b08a9-435f-4d0f-8132-af6810956cb8","Type":"ContainerStarted","Data":"bab5999421b5bd19a5262b2e4b6ad53ceb4e4e47ae007b3ce56f5db496dbc6fc"}
Apr 17 07:51:39.841045 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.840992 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fr7nk" event={"ID":"49cea292-6e01-4497-a4a8-a9cdff76850e","Type":"ContainerStarted","Data":"ff9224b655d69aed51a6ec44f217fe397f35f0a42e68306c67d5aee34a34ceb0"}
Apr 17 07:51:39.842740 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.842679 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" event={"ID":"69d806e6-4ecb-42c4-b3a9-57107400f8d5","Type":"ContainerStarted","Data":"9f935b4929aa393021001d64075d6e7d37b1d401e9cf3ddcf89de12f18bc1372"}
Apr 17 07:51:39.847966 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:39.847943 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jbj5m" event={"ID":"af6392d4-cfa0-4d92-a9a8-14dc562724bf","Type":"ContainerStarted","Data":"77db480d2afeb3959fdcf240c571acb4bfa14aa423c85770f4c9bb689305c34d"}
Apr 17 07:51:40.328021 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:40.327499 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs\") pod \"network-metrics-daemon-jt4rj\" (UID: \"4fbea707-d5c2-4c45-82e5-089d272aa922\") " pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:51:40.328021 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:40.327632 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:40.328021 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:40.327710 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs podName:4fbea707-d5c2-4c45-82e5-089d272aa922 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:42.327675447 +0000 UTC m=+6.115454034 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs") pod "network-metrics-daemon-jt4rj" (UID: "4fbea707-d5c2-4c45-82e5-089d272aa922") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:40.529578 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:40.529533 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhcnb\" (UniqueName: \"kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb\") pod \"network-check-target-vbl24\" (UID: \"648c562e-66a8-4487-995e-2e06a13a92a5\") " pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:51:40.530180 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:40.530153 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:51:40.530180 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:40.530181 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:51:40.530335 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:40.530196 2566 projected.go:194] Error preparing data for projected volume kube-api-access-vhcnb for pod openshift-network-diagnostics/network-check-target-vbl24: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:40.530335 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:40.530260 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb podName:648c562e-66a8-4487-995e-2e06a13a92a5 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:42.53024157 +0000 UTC m=+6.318020164 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-vhcnb" (UniqueName: "kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb") pod "network-check-target-vbl24" (UID: "648c562e-66a8-4487-995e-2e06a13a92a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:40.803840 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:40.803251 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:51:40.803840 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:40.803401 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jt4rj" podUID="4fbea707-d5c2-4c45-82e5-089d272aa922"
Apr 17 07:51:40.862171 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:40.862130 2566 generic.go:358] "Generic (PLEG): container finished" podID="2807b2563fb554c003c51001f381c040" containerID="a0947139cbdfeae5188e88960644cd9abdb86b5bf1394a94b560f81f547b665e" exitCode=0
Apr 17 07:51:40.862771 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:40.862746 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" event={"ID":"2807b2563fb554c003c51001f381c040","Type":"ContainerDied","Data":"a0947139cbdfeae5188e88960644cd9abdb86b5bf1394a94b560f81f547b665e"}
Apr 17 07:51:40.880670 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:40.880206 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal" podStartSLOduration=3.8801855290000002 podStartE2EDuration="3.880185529s" podCreationTimestamp="2026-04-17 07:51:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:51:39.837509305 +0000 UTC m=+3.625287909" watchObservedRunningTime="2026-04-17 07:51:40.880185529 +0000 UTC m=+4.667964134"
Apr 17 07:51:41.804059 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:41.803523 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:51:41.804059 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:41.803655 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vbl24" podUID="648c562e-66a8-4487-995e-2e06a13a92a5"
Apr 17 07:51:41.869021 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:41.868415 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" event={"ID":"2807b2563fb554c003c51001f381c040","Type":"ContainerStarted","Data":"575f25b3ef1ca8e11c036f1a5575710c1572c6003f7169f7849ae4bd5f9bd284"}
Apr 17 07:51:42.343369 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:42.343266 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs\") pod \"network-metrics-daemon-jt4rj\" (UID: \"4fbea707-d5c2-4c45-82e5-089d272aa922\") " pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:51:42.343560 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:42.343414 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:42.343560 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:42.343495 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs podName:4fbea707-d5c2-4c45-82e5-089d272aa922 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:46.343474958 +0000 UTC m=+10.131253553 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs") pod "network-metrics-daemon-jt4rj" (UID: "4fbea707-d5c2-4c45-82e5-089d272aa922") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:42.544466 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:42.544416 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhcnb\" (UniqueName: \"kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb\") pod \"network-check-target-vbl24\" (UID: \"648c562e-66a8-4487-995e-2e06a13a92a5\") " pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:51:42.544649 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:42.544596 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:51:42.544649 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:42.544614 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:51:42.544649 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:42.544626 2566 projected.go:194] Error preparing data for projected volume kube-api-access-vhcnb for pod openshift-network-diagnostics/network-check-target-vbl24: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:42.544876 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:42.544684 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb podName:648c562e-66a8-4487-995e-2e06a13a92a5 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:46.544664878 +0000 UTC m=+10.332443463 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-vhcnb" (UniqueName: "kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb") pod "network-check-target-vbl24" (UID: "648c562e-66a8-4487-995e-2e06a13a92a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:42.803004 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:42.802967 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:51:42.803163 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:42.803116 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jt4rj" podUID="4fbea707-d5c2-4c45-82e5-089d272aa922"
Apr 17 07:51:43.803150 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:43.802669 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:51:43.803150 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:43.802806 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vbl24" podUID="648c562e-66a8-4487-995e-2e06a13a92a5"
Apr 17 07:51:44.803102 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:44.803066 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:51:44.803389 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:44.803218 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jt4rj" podUID="4fbea707-d5c2-4c45-82e5-089d272aa922"
Apr 17 07:51:45.803231 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:45.803196 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:51:45.803422 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:45.803321 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vbl24" podUID="648c562e-66a8-4487-995e-2e06a13a92a5"
Apr 17 07:51:46.376562 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:46.376345 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs\") pod \"network-metrics-daemon-jt4rj\" (UID: \"4fbea707-d5c2-4c45-82e5-089d272aa922\") " pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:51:46.376562 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:46.376513 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:46.376841 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:46.376578 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs podName:4fbea707-d5c2-4c45-82e5-089d272aa922 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:54.37656 +0000 UTC m=+18.164338585 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs") pod "network-metrics-daemon-jt4rj" (UID: "4fbea707-d5c2-4c45-82e5-089d272aa922") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:46.578380 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:46.578339 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhcnb\" (UniqueName: \"kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb\") pod \"network-check-target-vbl24\" (UID: \"648c562e-66a8-4487-995e-2e06a13a92a5\") " pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:51:46.578559 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:46.578516 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:51:46.578559 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:46.578530 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:51:46.578559 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:46.578539 2566 projected.go:194] Error preparing data for projected volume kube-api-access-vhcnb for pod openshift-network-diagnostics/network-check-target-vbl24: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:46.578738 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:46.578589 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb podName:648c562e-66a8-4487-995e-2e06a13a92a5 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:54.578576741 +0000 UTC m=+18.366355327 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-vhcnb" (UniqueName: "kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb") pod "network-check-target-vbl24" (UID: "648c562e-66a8-4487-995e-2e06a13a92a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:46.804047 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:46.804012 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:51:46.804511 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:46.804160 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jt4rj" podUID="4fbea707-d5c2-4c45-82e5-089d272aa922"
Apr 17 07:51:47.802900 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:47.802865 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:51:47.803091 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:47.802987 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vbl24" podUID="648c562e-66a8-4487-995e-2e06a13a92a5"
Apr 17 07:51:48.802870 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:48.802841 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:51:48.803303 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:48.802977 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jt4rj" podUID="4fbea707-d5c2-4c45-82e5-089d272aa922"
Apr 17 07:51:49.803623 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:49.803593 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:51:49.804079 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:49.803727 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vbl24" podUID="648c562e-66a8-4487-995e-2e06a13a92a5"
Apr 17 07:51:50.803245 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:50.803212 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:51:50.803411 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:50.803349 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jt4rj" podUID="4fbea707-d5c2-4c45-82e5-089d272aa922"
Apr 17 07:51:51.803552 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:51.803517 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:51:51.804012 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:51.803623 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vbl24" podUID="648c562e-66a8-4487-995e-2e06a13a92a5"
Apr 17 07:51:52.803529 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:52.803494 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:51:52.803732 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:52.803649 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jt4rj" podUID="4fbea707-d5c2-4c45-82e5-089d272aa922"
Apr 17 07:51:53.803450 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:53.803413 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:51:53.803678 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:53.803543 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vbl24" podUID="648c562e-66a8-4487-995e-2e06a13a92a5"
Apr 17 07:51:54.427859 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:54.427822 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs\") pod \"network-metrics-daemon-jt4rj\" (UID: \"4fbea707-d5c2-4c45-82e5-089d272aa922\") " pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:51:54.428317 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:54.427978 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:54.428317 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:54.428049 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs podName:4fbea707-d5c2-4c45-82e5-089d272aa922 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:10.428030266 +0000 UTC m=+34.215808850 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs") pod "network-metrics-daemon-jt4rj" (UID: "4fbea707-d5c2-4c45-82e5-089d272aa922") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:54.629174 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:54.629137 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhcnb\" (UniqueName: \"kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb\") pod \"network-check-target-vbl24\" (UID: \"648c562e-66a8-4487-995e-2e06a13a92a5\") " pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:51:54.629326 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:54.629305 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:51:54.629395 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:54.629329 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:51:54.629395 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:54.629340 2566 projected.go:194] Error preparing data for projected volume kube-api-access-vhcnb for pod openshift-network-diagnostics/network-check-target-vbl24: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:54.629395 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:54.629394 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb podName:648c562e-66a8-4487-995e-2e06a13a92a5 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:10.629376693 +0000 UTC m=+34.417155273 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-vhcnb" (UniqueName: "kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb") pod "network-check-target-vbl24" (UID: "648c562e-66a8-4487-995e-2e06a13a92a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:54.803667 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:54.803632 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:51:54.803845 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:54.803785 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jt4rj" podUID="4fbea707-d5c2-4c45-82e5-089d272aa922"
Apr 17 07:51:55.802971 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:55.802932 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:51:55.803471 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:55.803061 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vbl24" podUID="648c562e-66a8-4487-995e-2e06a13a92a5"
Apr 17 07:51:56.806630 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:56.806607 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:51:56.807368 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:56.806810 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jt4rj" podUID="4fbea707-d5c2-4c45-82e5-089d272aa922"
Apr 17 07:51:56.894532 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:56.894499 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qndp2" event={"ID":"c9597c98-928c-4e9e-9f6a-20399532f672","Type":"ContainerStarted","Data":"58ef5d242cb189b5c8982e8fd51eb9d10a4bc823d566ed7b2c48da7f50c2622c"}
Apr 17 07:51:56.895630 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:56.895603 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj" event={"ID":"7ccb8775-fa41-46b2-9f1e-0aa964d80116","Type":"ContainerStarted","Data":"b23d0799284bb1a936e7526ba324e8408e4b98935e7ba774b04d991c69eb536f"}
Apr 17 07:51:56.896872 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:56.896850 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-s2dk9" event={"ID":"f847aabc-a956-47e9-91e9-a380ac142ed4","Type":"ContainerStarted","Data":"6159b29c6a47922c87d3d9220dae31e299a51a8941a45833c5085d53e834ae1f"}
Apr 17 07:51:56.897926 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:56.897903 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-42f72" event={"ID":"7886ccc8-69b5-457b-8e19-3ac5c6d1153c","Type":"ContainerStarted","Data":"b1f2fe65fd51334e53ac9c9959e0ac42d47e7f594c4d447b9f66856ac0f99054"}
Apr 17 07:51:56.899028 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:56.899006 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b24kh" event={"ID":"595aad98-ad8c-469e-ae13-798099e8e67b","Type":"ContainerStarted","Data":"1c905a15bd5f84238462f0340e92320a58d3d69e7ddd78fd04252ac8be7844fb"}
Apr 17 07:51:56.900194 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:56.900172 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jpmsl" event={"ID":"211b08a9-435f-4d0f-8132-af6810956cb8","Type":"ContainerStarted","Data":"e70daa7636f7f8617c46c18e6a9e9762066e00894164dd83360767ed7b310d3e"}
Apr 17 07:51:56.901297 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:56.901277 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fr7nk" event={"ID":"49cea292-6e01-4497-a4a8-a9cdff76850e","Type":"ContainerStarted","Data":"744b0ff63d66c9ee2e2b3df293e287f8d6b02e0de7cce48d634924992c9f9dde"}
Apr 17 07:51:56.902639 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:56.902620 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" event={"ID":"69d806e6-4ecb-42c4-b3a9-57107400f8d5","Type":"ContainerStarted","Data":"dea6657c9e467c55d89a083b32f01fb62c8e51fc48f1c88142822880cf1afa13"}
Apr 17 07:51:56.902762 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:56.902646 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" event={"ID":"69d806e6-4ecb-42c4-b3a9-57107400f8d5","Type":"ContainerStarted","Data":"3a60b677569a39d8f199a85f1b127f05431e1cd7e55445ba5ef55f58f701ba0d"}
Apr 17 07:51:56.935100 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:56.934923 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" podStartSLOduration=19.934908969 podStartE2EDuration="19.934908969s" podCreationTimestamp="2026-04-17 07:51:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:51:41.882499823 +0000 UTC m=+5.670278427" watchObservedRunningTime="2026-04-17 07:51:56.934908969 +0000 UTC m=+20.722687602"
Apr 17 07:51:56.958085 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:56.958045 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jpmsl" podStartSLOduration=2.877633513 podStartE2EDuration="19.958031482s" podCreationTimestamp="2026-04-17 07:51:37 +0000 UTC" firstStartedPulling="2026-04-17 07:51:39.364146375 +0000 UTC m=+3.151924968" lastFinishedPulling="2026-04-17 07:51:56.444544344 +0000 UTC m=+20.232322937" observedRunningTime="2026-04-17 07:51:56.957669675 +0000 UTC m=+20.745448275" watchObservedRunningTime="2026-04-17 07:51:56.958031482 +0000 UTC m=+20.745810084"
Apr 17 07:51:56.991979 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:56.991936 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-fr7nk" podStartSLOduration=3.916602716 podStartE2EDuration="20.991921552s" podCreationTimestamp="2026-04-17 07:51:36 +0000 UTC" firstStartedPulling="2026-04-17 07:51:39.369715946 +0000 UTC m=+3.157494533" lastFinishedPulling="2026-04-17 07:51:56.445034774 +0000 UTC m=+20.232813369" observedRunningTime="2026-04-17 07:51:56.991801276 +0000 UTC m=+20.779579890" watchObservedRunningTime="2026-04-17 07:51:56.991921552 +0000 UTC m=+20.779700154"
Apr 17 07:51:57.027585 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.027545 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-42f72" podStartSLOduration=11.064435193 podStartE2EDuration="20.027529127s" podCreationTimestamp="2026-04-17 07:51:37 +0000 UTC" firstStartedPulling="2026-04-17 07:51:39.370714223 +0000 UTC m=+3.158492807" lastFinishedPulling="2026-04-17 07:51:48.333808146 +0000 UTC m=+12.121586741" observedRunningTime="2026-04-17 07:51:57.012171891 +0000 UTC m=+20.799950494" watchObservedRunningTime="2026-04-17 07:51:57.027529127 +0000 UTC m=+20.815307707"
Apr 17 07:51:57.028263 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.028245 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-5rcmt"]
Apr 17 07:51:57.044282 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.044256 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5rcmt"
Apr 17 07:51:57.044374 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:57.044348 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-5rcmt" podUID="792796b1-b737-4fd3-820e-247120d1de83" Apr 17 07:51:57.079908 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.079864 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qndp2" podStartSLOduration=4.035407629 podStartE2EDuration="21.079849269s" podCreationTimestamp="2026-04-17 07:51:36 +0000 UTC" firstStartedPulling="2026-04-17 07:51:39.372723364 +0000 UTC m=+3.160501944" lastFinishedPulling="2026-04-17 07:51:56.417164991 +0000 UTC m=+20.204943584" observedRunningTime="2026-04-17 07:51:57.079740619 +0000 UTC m=+20.867519223" watchObservedRunningTime="2026-04-17 07:51:57.079849269 +0000 UTC m=+20.867627871" Apr 17 07:51:57.080023 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.079933 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-s2dk9" podStartSLOduration=3.034731463 podStartE2EDuration="20.079929457s" podCreationTimestamp="2026-04-17 07:51:37 +0000 UTC" firstStartedPulling="2026-04-17 07:51:39.372160898 +0000 UTC m=+3.159939485" lastFinishedPulling="2026-04-17 07:51:56.417358884 +0000 UTC m=+20.205137479" observedRunningTime="2026-04-17 07:51:57.044271803 +0000 UTC m=+20.832050406" watchObservedRunningTime="2026-04-17 07:51:57.079929457 +0000 UTC m=+20.867708060" Apr 17 07:51:57.148579 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.148550 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/792796b1-b737-4fd3-820e-247120d1de83-kubelet-config\") pod \"global-pull-secret-syncer-5rcmt\" (UID: \"792796b1-b737-4fd3-820e-247120d1de83\") " pod="kube-system/global-pull-secret-syncer-5rcmt" Apr 17 07:51:57.148710 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.148652 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" 
(UniqueName: \"kubernetes.io/host-path/792796b1-b737-4fd3-820e-247120d1de83-dbus\") pod \"global-pull-secret-syncer-5rcmt\" (UID: \"792796b1-b737-4fd3-820e-247120d1de83\") " pod="kube-system/global-pull-secret-syncer-5rcmt" Apr 17 07:51:57.148775 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.148723 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/792796b1-b737-4fd3-820e-247120d1de83-original-pull-secret\") pod \"global-pull-secret-syncer-5rcmt\" (UID: \"792796b1-b737-4fd3-820e-247120d1de83\") " pod="kube-system/global-pull-secret-syncer-5rcmt" Apr 17 07:51:57.249728 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.249684 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/792796b1-b737-4fd3-820e-247120d1de83-dbus\") pod \"global-pull-secret-syncer-5rcmt\" (UID: \"792796b1-b737-4fd3-820e-247120d1de83\") " pod="kube-system/global-pull-secret-syncer-5rcmt" Apr 17 07:51:57.249879 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.249761 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/792796b1-b737-4fd3-820e-247120d1de83-original-pull-secret\") pod \"global-pull-secret-syncer-5rcmt\" (UID: \"792796b1-b737-4fd3-820e-247120d1de83\") " pod="kube-system/global-pull-secret-syncer-5rcmt" Apr 17 07:51:57.249879 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.249796 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/792796b1-b737-4fd3-820e-247120d1de83-kubelet-config\") pod \"global-pull-secret-syncer-5rcmt\" (UID: \"792796b1-b737-4fd3-820e-247120d1de83\") " pod="kube-system/global-pull-secret-syncer-5rcmt" Apr 17 07:51:57.249971 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.249875 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/792796b1-b737-4fd3-820e-247120d1de83-kubelet-config\") pod \"global-pull-secret-syncer-5rcmt\" (UID: \"792796b1-b737-4fd3-820e-247120d1de83\") " pod="kube-system/global-pull-secret-syncer-5rcmt" Apr 17 07:51:57.249971 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.249878 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/792796b1-b737-4fd3-820e-247120d1de83-dbus\") pod \"global-pull-secret-syncer-5rcmt\" (UID: \"792796b1-b737-4fd3-820e-247120d1de83\") " pod="kube-system/global-pull-secret-syncer-5rcmt" Apr 17 07:51:57.249971 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:57.249953 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:51:57.250067 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:57.250019 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/792796b1-b737-4fd3-820e-247120d1de83-original-pull-secret podName:792796b1-b737-4fd3-820e-247120d1de83 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:57.75000002 +0000 UTC m=+21.537778617 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/792796b1-b737-4fd3-820e-247120d1de83-original-pull-secret") pod "global-pull-secret-syncer-5rcmt" (UID: "792796b1-b737-4fd3-820e-247120d1de83") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:51:57.698396 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.698367 2566 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 07:51:57.753222 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.753190 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/792796b1-b737-4fd3-820e-247120d1de83-original-pull-secret\") pod \"global-pull-secret-syncer-5rcmt\" (UID: \"792796b1-b737-4fd3-820e-247120d1de83\") " pod="kube-system/global-pull-secret-syncer-5rcmt" Apr 17 07:51:57.753393 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:57.753296 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:51:57.753393 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:57.753348 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/792796b1-b737-4fd3-820e-247120d1de83-original-pull-secret podName:792796b1-b737-4fd3-820e-247120d1de83 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:58.753334214 +0000 UTC m=+22.541112796 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/792796b1-b737-4fd3-820e-247120d1de83-original-pull-secret") pod "global-pull-secret-syncer-5rcmt" (UID: "792796b1-b737-4fd3-820e-247120d1de83") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:51:57.773730 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.773619 2566 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T07:51:57.698388308Z","UUID":"b3943516-3247-459b-89d6-62b02fb53325","Handler":null,"Name":"","Endpoint":""} Apr 17 07:51:57.775403 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.775237 2566 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 07:51:57.775403 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.775408 2566 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 07:51:57.803066 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.803032 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vbl24" Apr 17 07:51:57.803227 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:57.803122 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vbl24" podUID="648c562e-66a8-4487-995e-2e06a13a92a5" Apr 17 07:51:57.906135 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.906098 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj" event={"ID":"7ccb8775-fa41-46b2-9f1e-0aa964d80116","Type":"ContainerStarted","Data":"fbea872fbb31fff64458f986c13a5f605e8b3c3779da03349236f9d23542884c"} Apr 17 07:51:57.907384 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.907359 2566 generic.go:358] "Generic (PLEG): container finished" podID="595aad98-ad8c-469e-ae13-798099e8e67b" containerID="1c905a15bd5f84238462f0340e92320a58d3d69e7ddd78fd04252ac8be7844fb" exitCode=0 Apr 17 07:51:57.907488 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.907430 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b24kh" event={"ID":"595aad98-ad8c-469e-ae13-798099e8e67b","Type":"ContainerDied","Data":"1c905a15bd5f84238462f0340e92320a58d3d69e7ddd78fd04252ac8be7844fb"} Apr 17 07:51:57.909857 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.909838 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/ovn-acl-logging/0.log" Apr 17 07:51:57.910149 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.910128 2566 generic.go:358] "Generic (PLEG): container finished" podID="69d806e6-4ecb-42c4-b3a9-57107400f8d5" containerID="dea6657c9e467c55d89a083b32f01fb62c8e51fc48f1c88142822880cf1afa13" exitCode=1 Apr 17 07:51:57.910237 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.910154 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" event={"ID":"69d806e6-4ecb-42c4-b3a9-57107400f8d5","Type":"ContainerDied","Data":"dea6657c9e467c55d89a083b32f01fb62c8e51fc48f1c88142822880cf1afa13"} Apr 17 07:51:57.910237 ip-10-0-128-217 
kubenswrapper[2566]: I0417 07:51:57.910177 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" event={"ID":"69d806e6-4ecb-42c4-b3a9-57107400f8d5","Type":"ContainerStarted","Data":"1d3fafddcab0176a5bcf068d8a53bde7e300a4e74c02b7610690698e38eee2c6"} Apr 17 07:51:57.910237 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.910186 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" event={"ID":"69d806e6-4ecb-42c4-b3a9-57107400f8d5","Type":"ContainerStarted","Data":"05474988f9b0af920544bf5fb4f53a3ea0bf142361867cea3914561e96fff058"} Apr 17 07:51:57.910237 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.910194 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" event={"ID":"69d806e6-4ecb-42c4-b3a9-57107400f8d5","Type":"ContainerStarted","Data":"c4d81da94b2922107438c88a3a10a9479870a6416d6c951ba89f4552912e1db6"} Apr 17 07:51:57.910237 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.910203 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" event={"ID":"69d806e6-4ecb-42c4-b3a9-57107400f8d5","Type":"ContainerStarted","Data":"33c1889295e26b42bd38ff6575e06aedf1b53fcce6382002088f8be8e066bad6"} Apr 17 07:51:57.911550 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.911527 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jbj5m" event={"ID":"af6392d4-cfa0-4d92-a9a8-14dc562724bf","Type":"ContainerStarted","Data":"0cc01d2b5898d79bf4fa8eda7dbe76996ce6c1526693106228ec74bc1ab32c69"} Apr 17 07:51:57.948204 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:57.948159 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-jbj5m" podStartSLOduration=4.893673007 podStartE2EDuration="21.948145086s" podCreationTimestamp="2026-04-17 07:51:36 +0000 UTC" 
firstStartedPulling="2026-04-17 07:51:39.36273958 +0000 UTC m=+3.150518161" lastFinishedPulling="2026-04-17 07:51:56.417211654 +0000 UTC m=+20.204990240" observedRunningTime="2026-04-17 07:51:57.94798087 +0000 UTC m=+21.735759473" watchObservedRunningTime="2026-04-17 07:51:57.948145086 +0000 UTC m=+21.735923689" Apr 17 07:51:58.760901 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:58.760858 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/792796b1-b737-4fd3-820e-247120d1de83-original-pull-secret\") pod \"global-pull-secret-syncer-5rcmt\" (UID: \"792796b1-b737-4fd3-820e-247120d1de83\") " pod="kube-system/global-pull-secret-syncer-5rcmt" Apr 17 07:51:58.761062 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:58.760968 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:51:58.761062 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:58.761022 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/792796b1-b737-4fd3-820e-247120d1de83-original-pull-secret podName:792796b1-b737-4fd3-820e-247120d1de83 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:00.761006136 +0000 UTC m=+24.548784721 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/792796b1-b737-4fd3-820e-247120d1de83-original-pull-secret") pod "global-pull-secret-syncer-5rcmt" (UID: "792796b1-b737-4fd3-820e-247120d1de83") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:51:58.807060 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:58.807026 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5rcmt" Apr 17 07:51:58.807237 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:58.807034 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jt4rj" Apr 17 07:51:58.807237 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:58.807134 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5rcmt" podUID="792796b1-b737-4fd3-820e-247120d1de83" Apr 17 07:51:58.807237 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:58.807211 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jt4rj" podUID="4fbea707-d5c2-4c45-82e5-089d272aa922" Apr 17 07:51:58.915288 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:58.915200 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj" event={"ID":"7ccb8775-fa41-46b2-9f1e-0aa964d80116","Type":"ContainerStarted","Data":"e132a329124890b348c1a5273eb391ba330cfe5cc29ee33403cc408a65ed3996"} Apr 17 07:51:58.944297 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:58.944241 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-99ffj" podStartSLOduration=2.745463012 podStartE2EDuration="21.944228235s" podCreationTimestamp="2026-04-17 07:51:37 +0000 UTC" firstStartedPulling="2026-04-17 07:51:39.372374403 +0000 UTC m=+3.160152989" lastFinishedPulling="2026-04-17 07:51:58.571139618 +0000 UTC m=+22.358918212" observedRunningTime="2026-04-17 07:51:58.943901671 +0000 UTC m=+22.731680276" watchObservedRunningTime="2026-04-17 
07:51:58.944228235 +0000 UTC m=+22.732006836" Apr 17 07:51:59.227224 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:59.227138 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-42f72" Apr 17 07:51:59.227867 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:59.227832 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-42f72" Apr 17 07:51:59.803160 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:59.803123 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vbl24" Apr 17 07:51:59.803334 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:51:59.803245 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vbl24" podUID="648c562e-66a8-4487-995e-2e06a13a92a5" Apr 17 07:51:59.920491 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:59.920459 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/ovn-acl-logging/0.log" Apr 17 07:51:59.921021 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:59.920952 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" event={"ID":"69d806e6-4ecb-42c4-b3a9-57107400f8d5","Type":"ContainerStarted","Data":"781ba14764a3c26e0b93ad8de3346ecd4327c2a6f8d7275e0cdfb079c120632f"} Apr 17 07:51:59.921472 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:59.921450 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-42f72" Apr 17 07:51:59.921939 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:51:59.921923 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-42f72" Apr 17 07:52:00.773931 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:00.773893 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/792796b1-b737-4fd3-820e-247120d1de83-original-pull-secret\") pod \"global-pull-secret-syncer-5rcmt\" (UID: \"792796b1-b737-4fd3-820e-247120d1de83\") " pod="kube-system/global-pull-secret-syncer-5rcmt" Apr 17 07:52:00.774109 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:00.774024 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:00.774109 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:00.774080 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/792796b1-b737-4fd3-820e-247120d1de83-original-pull-secret 
podName:792796b1-b737-4fd3-820e-247120d1de83 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:04.774063818 +0000 UTC m=+28.561842399 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/792796b1-b737-4fd3-820e-247120d1de83-original-pull-secret") pod "global-pull-secret-syncer-5rcmt" (UID: "792796b1-b737-4fd3-820e-247120d1de83") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:00.806422 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:00.806382 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5rcmt" Apr 17 07:52:00.806591 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:00.806383 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jt4rj" Apr 17 07:52:00.806591 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:00.806512 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5rcmt" podUID="792796b1-b737-4fd3-820e-247120d1de83" Apr 17 07:52:00.806716 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:00.806625 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jt4rj" podUID="4fbea707-d5c2-4c45-82e5-089d272aa922" Apr 17 07:52:01.802779 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:01.802755 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vbl24" Apr 17 07:52:01.803215 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:01.802846 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vbl24" podUID="648c562e-66a8-4487-995e-2e06a13a92a5" Apr 17 07:52:01.929355 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:01.928980 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/ovn-acl-logging/0.log" Apr 17 07:52:01.929666 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:01.929615 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" event={"ID":"69d806e6-4ecb-42c4-b3a9-57107400f8d5","Type":"ContainerStarted","Data":"ca3cde3b8d9b8b82e93188a6ee1f40e3423baaee626512d98daf5c8da83bff09"} Apr 17 07:52:01.930557 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:01.930112 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:52:01.930557 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:01.930200 2566 scope.go:117] "RemoveContainer" containerID="dea6657c9e467c55d89a083b32f01fb62c8e51fc48f1c88142822880cf1afa13" Apr 17 07:52:01.955334 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:01.954953 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:52:02.805428 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:02.805404 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5rcmt" Apr 17 07:52:02.805831 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:02.805404 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jt4rj" Apr 17 07:52:02.805831 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:02.805492 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5rcmt" podUID="792796b1-b737-4fd3-820e-247120d1de83" Apr 17 07:52:02.805831 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:02.805612 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jt4rj" podUID="4fbea707-d5c2-4c45-82e5-089d272aa922" Apr 17 07:52:02.933136 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:02.933103 2566 generic.go:358] "Generic (PLEG): container finished" podID="595aad98-ad8c-469e-ae13-798099e8e67b" containerID="618e156ec6af60a372fa195d444b5b44624eb55206b3b770d158c67dd7f14d0b" exitCode=0 Apr 17 07:52:02.933291 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:02.933181 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b24kh" event={"ID":"595aad98-ad8c-469e-ae13-798099e8e67b","Type":"ContainerDied","Data":"618e156ec6af60a372fa195d444b5b44624eb55206b3b770d158c67dd7f14d0b"} Apr 17 07:52:02.936554 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:02.936503 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/ovn-acl-logging/0.log" Apr 17 07:52:02.936854 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:02.936832 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" event={"ID":"69d806e6-4ecb-42c4-b3a9-57107400f8d5","Type":"ContainerStarted","Data":"2de0641f64ac082d1382d469c5bc4467910300ab45083bea186312d5dd14325f"} Apr 17 07:52:02.937112 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:02.937097 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:52:02.937195 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:02.937114 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:52:02.951691 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:02.951667 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" Apr 17 07:52:02.987212 ip-10-0-128-217 kubenswrapper[2566]: I0417 
07:52:02.987166 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-f6vls" podStartSLOduration=8.858589088 podStartE2EDuration="25.987153402s" podCreationTimestamp="2026-04-17 07:51:37 +0000 UTC" firstStartedPulling="2026-04-17 07:51:39.368569769 +0000 UTC m=+3.156348352" lastFinishedPulling="2026-04-17 07:51:56.497134074 +0000 UTC m=+20.284912666" observedRunningTime="2026-04-17 07:52:02.98605928 +0000 UTC m=+26.773837882" watchObservedRunningTime="2026-04-17 07:52:02.987153402 +0000 UTC m=+26.774931983" Apr 17 07:52:03.803140 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:03.802953 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vbl24" Apr 17 07:52:03.803140 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:03.803085 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vbl24" podUID="648c562e-66a8-4487-995e-2e06a13a92a5" Apr 17 07:52:03.808157 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:03.808126 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jt4rj"] Apr 17 07:52:03.808494 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:03.808240 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jt4rj" Apr 17 07:52:03.808494 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:03.808323 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jt4rj" podUID="4fbea707-d5c2-4c45-82e5-089d272aa922" Apr 17 07:52:03.811036 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:03.810985 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5rcmt"] Apr 17 07:52:03.811158 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:03.811072 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5rcmt" Apr 17 07:52:03.811208 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:03.811165 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5rcmt" podUID="792796b1-b737-4fd3-820e-247120d1de83" Apr 17 07:52:03.811605 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:03.811587 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vbl24"] Apr 17 07:52:03.940388 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:03.940301 2566 generic.go:358] "Generic (PLEG): container finished" podID="595aad98-ad8c-469e-ae13-798099e8e67b" containerID="ab46f2df066d241f95dfda323cc7a1cac63b2ba1c85f7382455239f6a7899a5b" exitCode=0 Apr 17 07:52:03.940535 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:03.940389 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b24kh" event={"ID":"595aad98-ad8c-469e-ae13-798099e8e67b","Type":"ContainerDied","Data":"ab46f2df066d241f95dfda323cc7a1cac63b2ba1c85f7382455239f6a7899a5b"} Apr 17 07:52:03.940535 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:03.940453 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vbl24" Apr 17 07:52:03.940655 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:03.940634 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vbl24" podUID="648c562e-66a8-4487-995e-2e06a13a92a5" Apr 17 07:52:04.805893 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:04.805867 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/792796b1-b737-4fd3-820e-247120d1de83-original-pull-secret\") pod \"global-pull-secret-syncer-5rcmt\" (UID: \"792796b1-b737-4fd3-820e-247120d1de83\") " pod="kube-system/global-pull-secret-syncer-5rcmt" Apr 17 07:52:04.806034 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:04.806003 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:04.806069 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:04.806062 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/792796b1-b737-4fd3-820e-247120d1de83-original-pull-secret podName:792796b1-b737-4fd3-820e-247120d1de83 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:12.806044442 +0000 UTC m=+36.593823036 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/792796b1-b737-4fd3-820e-247120d1de83-original-pull-secret") pod "global-pull-secret-syncer-5rcmt" (UID: "792796b1-b737-4fd3-820e-247120d1de83") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:04.944525 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:04.944494 2566 generic.go:358] "Generic (PLEG): container finished" podID="595aad98-ad8c-469e-ae13-798099e8e67b" containerID="0c9e8879f5d699e8a04035717487e55ae50705dd2cf4e1600fa586bb05f92a80" exitCode=0 Apr 17 07:52:04.944916 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:04.944586 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b24kh" event={"ID":"595aad98-ad8c-469e-ae13-798099e8e67b","Type":"ContainerDied","Data":"0c9e8879f5d699e8a04035717487e55ae50705dd2cf4e1600fa586bb05f92a80"} Apr 17 07:52:05.803138 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:05.803101 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5rcmt" Apr 17 07:52:05.803306 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:05.803101 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jt4rj" Apr 17 07:52:05.803306 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:05.803235 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5rcmt" podUID="792796b1-b737-4fd3-820e-247120d1de83" Apr 17 07:52:05.803306 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:05.803289 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vbl24" Apr 17 07:52:05.803444 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:05.803407 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jt4rj" podUID="4fbea707-d5c2-4c45-82e5-089d272aa922" Apr 17 07:52:05.803507 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:05.803488 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vbl24" podUID="648c562e-66a8-4487-995e-2e06a13a92a5" Apr 17 07:52:07.803275 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:07.803233 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5rcmt" Apr 17 07:52:07.803275 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:07.803269 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vbl24" Apr 17 07:52:07.804126 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:07.803233 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jt4rj" Apr 17 07:52:07.804126 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:07.803357 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5rcmt" podUID="792796b1-b737-4fd3-820e-247120d1de83" Apr 17 07:52:07.804126 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:07.803457 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jt4rj" podUID="4fbea707-d5c2-4c45-82e5-089d272aa922" Apr 17 07:52:07.804126 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:07.803537 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vbl24" podUID="648c562e-66a8-4487-995e-2e06a13a92a5" Apr 17 07:52:09.525207 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.525129 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeReady" Apr 17 07:52:09.525685 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.525284 2566 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 07:52:09.556458 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.556426 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-68954cc549-2czzq"] Apr 17 07:52:09.560496 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.560470 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.563115 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.563088 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 07:52:09.563252 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.563143 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 07:52:09.563252 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.563239 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 07:52:09.563521 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.563492 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-njxn8\"" Apr 17 07:52:09.569138 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.569112 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 
07:52:09.570549 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.570524 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rltvj"] Apr 17 07:52:09.574366 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.574346 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rltvj" Apr 17 07:52:09.575814 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.575788 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-68954cc549-2czzq"] Apr 17 07:52:09.576654 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.576488 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qvwww"] Apr 17 07:52:09.577201 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.577182 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xs2k6\"" Apr 17 07:52:09.577361 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.577341 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 07:52:09.577556 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.577537 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 07:52:09.579502 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.579484 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qvwww" Apr 17 07:52:09.583209 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.582678 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 07:52:09.583209 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.582929 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wmmwx\"" Apr 17 07:52:09.583209 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.582954 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 07:52:09.583209 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.583122 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 07:52:09.583440 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.583348 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rltvj"] Apr 17 07:52:09.591854 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.591829 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qvwww"] Apr 17 07:52:09.646602 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.646563 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-certificates\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.646602 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.646605 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-bound-sa-token\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.646877 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.646660 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/78985eab-173e-4c9e-82d1-2bc5d78fa58f-image-registry-private-configuration\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.646877 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.646723 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dlk5\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-kube-api-access-8dlk5\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.646877 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.646755 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b863c142-c069-46ab-9031-2f50beeb3f53-tmp-dir\") pod \"dns-default-rltvj\" (UID: \"b863c142-c069-46ab-9031-2f50beeb3f53\") " pod="openshift-dns/dns-default-rltvj" Apr 17 07:52:09.646877 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.646781 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95csq\" (UniqueName: \"kubernetes.io/projected/2354a35f-6d75-4e6e-a614-0e68c4002cb7-kube-api-access-95csq\") pod \"ingress-canary-qvwww\" (UID: \"2354a35f-6d75-4e6e-a614-0e68c4002cb7\") " 
pod="openshift-ingress-canary/ingress-canary-qvwww" Apr 17 07:52:09.647063 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.646865 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.647063 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.646912 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b863c142-c069-46ab-9031-2f50beeb3f53-config-volume\") pod \"dns-default-rltvj\" (UID: \"b863c142-c069-46ab-9031-2f50beeb3f53\") " pod="openshift-dns/dns-default-rltvj" Apr 17 07:52:09.647063 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.646933 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls\") pod \"dns-default-rltvj\" (UID: \"b863c142-c069-46ab-9031-2f50beeb3f53\") " pod="openshift-dns/dns-default-rltvj" Apr 17 07:52:09.647063 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.646961 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngrmr\" (UniqueName: \"kubernetes.io/projected/b863c142-c069-46ab-9031-2f50beeb3f53-kube-api-access-ngrmr\") pod \"dns-default-rltvj\" (UID: \"b863c142-c069-46ab-9031-2f50beeb3f53\") " pod="openshift-dns/dns-default-rltvj" Apr 17 07:52:09.647063 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.646978 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/78985eab-173e-4c9e-82d1-2bc5d78fa58f-ca-trust-extracted\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.647063 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.646993 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/78985eab-173e-4c9e-82d1-2bc5d78fa58f-installation-pull-secrets\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.647548 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.647080 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert\") pod \"ingress-canary-qvwww\" (UID: \"2354a35f-6d75-4e6e-a614-0e68c4002cb7\") " pod="openshift-ingress-canary/ingress-canary-qvwww" Apr 17 07:52:09.647548 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.647114 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78985eab-173e-4c9e-82d1-2bc5d78fa58f-trusted-ca\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.747903 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.747864 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert\") pod \"ingress-canary-qvwww\" (UID: \"2354a35f-6d75-4e6e-a614-0e68c4002cb7\") " pod="openshift-ingress-canary/ingress-canary-qvwww" Apr 17 07:52:09.747903 ip-10-0-128-217 kubenswrapper[2566]: I0417 
07:52:09.747909 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78985eab-173e-4c9e-82d1-2bc5d78fa58f-trusted-ca\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.748157 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.747945 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-certificates\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.748157 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.747969 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-bound-sa-token\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.748157 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:09.748005 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:52:09.748157 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:09.748080 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert podName:2354a35f-6d75-4e6e-a614-0e68c4002cb7 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:10.248058041 +0000 UTC m=+34.035836636 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert") pod "ingress-canary-qvwww" (UID: "2354a35f-6d75-4e6e-a614-0e68c4002cb7") : secret "canary-serving-cert" not found Apr 17 07:52:09.748360 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.748010 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/78985eab-173e-4c9e-82d1-2bc5d78fa58f-image-registry-private-configuration\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.748360 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.748288 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dlk5\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-kube-api-access-8dlk5\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.748360 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.748330 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b863c142-c069-46ab-9031-2f50beeb3f53-tmp-dir\") pod \"dns-default-rltvj\" (UID: \"b863c142-c069-46ab-9031-2f50beeb3f53\") " pod="openshift-dns/dns-default-rltvj" Apr 17 07:52:09.748509 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.748361 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95csq\" (UniqueName: \"kubernetes.io/projected/2354a35f-6d75-4e6e-a614-0e68c4002cb7-kube-api-access-95csq\") pod \"ingress-canary-qvwww\" (UID: \"2354a35f-6d75-4e6e-a614-0e68c4002cb7\") " pod="openshift-ingress-canary/ingress-canary-qvwww" Apr 17 07:52:09.748509 ip-10-0-128-217 
kubenswrapper[2566]: I0417 07:52:09.748390 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.748509 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.748418 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b863c142-c069-46ab-9031-2f50beeb3f53-config-volume\") pod \"dns-default-rltvj\" (UID: \"b863c142-c069-46ab-9031-2f50beeb3f53\") " pod="openshift-dns/dns-default-rltvj" Apr 17 07:52:09.748509 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.748442 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls\") pod \"dns-default-rltvj\" (UID: \"b863c142-c069-46ab-9031-2f50beeb3f53\") " pod="openshift-dns/dns-default-rltvj" Apr 17 07:52:09.748509 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.748479 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngrmr\" (UniqueName: \"kubernetes.io/projected/b863c142-c069-46ab-9031-2f50beeb3f53-kube-api-access-ngrmr\") pod \"dns-default-rltvj\" (UID: \"b863c142-c069-46ab-9031-2f50beeb3f53\") " pod="openshift-dns/dns-default-rltvj" Apr 17 07:52:09.748509 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.748504 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/78985eab-173e-4c9e-82d1-2bc5d78fa58f-ca-trust-extracted\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 
17 07:52:09.748821 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.748530 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/78985eab-173e-4c9e-82d1-2bc5d78fa58f-installation-pull-secrets\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.748821 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.748622 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-certificates\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.748907 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.748885 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78985eab-173e-4c9e-82d1-2bc5d78fa58f-trusted-ca\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.749000 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:09.748970 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:52:09.749000 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:09.748989 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68954cc549-2czzq: secret "image-registry-tls" not found Apr 17 07:52:09.749000 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:09.749000 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:52:09.749228 
ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:09.749035 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls podName:78985eab-173e-4c9e-82d1-2bc5d78fa58f nodeName:}" failed. No retries permitted until 2026-04-17 07:52:10.24901944 +0000 UTC m=+34.036798020 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls") pod "image-registry-68954cc549-2czzq" (UID: "78985eab-173e-4c9e-82d1-2bc5d78fa58f") : secret "image-registry-tls" not found Apr 17 07:52:09.749228 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:09.749051 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls podName:b863c142-c069-46ab-9031-2f50beeb3f53 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:10.249043512 +0000 UTC m=+34.036822093 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls") pod "dns-default-rltvj" (UID: "b863c142-c069-46ab-9031-2f50beeb3f53") : secret "dns-default-metrics-tls" not found Apr 17 07:52:09.749228 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.749074 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b863c142-c069-46ab-9031-2f50beeb3f53-tmp-dir\") pod \"dns-default-rltvj\" (UID: \"b863c142-c069-46ab-9031-2f50beeb3f53\") " pod="openshift-dns/dns-default-rltvj" Apr 17 07:52:09.749228 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.749090 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b863c142-c069-46ab-9031-2f50beeb3f53-config-volume\") pod \"dns-default-rltvj\" (UID: \"b863c142-c069-46ab-9031-2f50beeb3f53\") " pod="openshift-dns/dns-default-rltvj" Apr 17 07:52:09.749453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.749342 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/78985eab-173e-4c9e-82d1-2bc5d78fa58f-ca-trust-extracted\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.753044 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.753021 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/78985eab-173e-4c9e-82d1-2bc5d78fa58f-image-registry-private-configuration\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.753162 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.753052 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/78985eab-173e-4c9e-82d1-2bc5d78fa58f-installation-pull-secrets\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.761825 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.761800 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-bound-sa-token\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.762013 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.761980 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dlk5\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-kube-api-access-8dlk5\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:52:09.762090 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.761984 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95csq\" (UniqueName: \"kubernetes.io/projected/2354a35f-6d75-4e6e-a614-0e68c4002cb7-kube-api-access-95csq\") pod \"ingress-canary-qvwww\" (UID: \"2354a35f-6d75-4e6e-a614-0e68c4002cb7\") " pod="openshift-ingress-canary/ingress-canary-qvwww" Apr 17 07:52:09.763341 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.763312 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngrmr\" (UniqueName: \"kubernetes.io/projected/b863c142-c069-46ab-9031-2f50beeb3f53-kube-api-access-ngrmr\") pod \"dns-default-rltvj\" (UID: \"b863c142-c069-46ab-9031-2f50beeb3f53\") " 
pod="openshift-dns/dns-default-rltvj"
Apr 17 07:52:09.803509 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.803471 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:52:09.803714 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.803471 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:52:09.803714 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.803471 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5rcmt"
Apr 17 07:52:09.806162 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.806052 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 07:52:09.806162 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.806138 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 07:52:09.806162 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.806165 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ctqdx\""
Apr 17 07:52:09.806761 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.806737 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-z4m56\""
Apr 17 07:52:09.806761 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.806750 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 07:52:09.806761 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:09.806763 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 07:52:10.252555 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:10.252466 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq"
Apr 17 07:52:10.252555 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:10.252516 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls\") pod \"dns-default-rltvj\" (UID: \"b863c142-c069-46ab-9031-2f50beeb3f53\") " pod="openshift-dns/dns-default-rltvj"
Apr 17 07:52:10.252555 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:10.252553 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert\") pod \"ingress-canary-qvwww\" (UID: \"2354a35f-6d75-4e6e-a614-0e68c4002cb7\") " pod="openshift-ingress-canary/ingress-canary-qvwww"
Apr 17 07:52:10.252847 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:10.252638 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 07:52:10.252847 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:10.252666 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68954cc549-2czzq: secret "image-registry-tls" not found
Apr 17 07:52:10.252847 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:10.252678 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:10.252847 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:10.252719 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:10.252847 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:10.252759 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls podName:78985eab-173e-4c9e-82d1-2bc5d78fa58f nodeName:}" failed. No retries permitted until 2026-04-17 07:52:11.252740128 +0000 UTC m=+35.040518724 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls") pod "image-registry-68954cc549-2czzq" (UID: "78985eab-173e-4c9e-82d1-2bc5d78fa58f") : secret "image-registry-tls" not found
Apr 17 07:52:10.252847 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:10.252775 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert podName:2354a35f-6d75-4e6e-a614-0e68c4002cb7 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:11.252768832 +0000 UTC m=+35.040547413 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert") pod "ingress-canary-qvwww" (UID: "2354a35f-6d75-4e6e-a614-0e68c4002cb7") : secret "canary-serving-cert" not found
Apr 17 07:52:10.252847 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:10.252789 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls podName:b863c142-c069-46ab-9031-2f50beeb3f53 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:11.252781627 +0000 UTC m=+35.040560214 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls") pod "dns-default-rltvj" (UID: "b863c142-c069-46ab-9031-2f50beeb3f53") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:10.454741 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:10.454690 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs\") pod \"network-metrics-daemon-jt4rj\" (UID: \"4fbea707-d5c2-4c45-82e5-089d272aa922\") " pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:52:10.454871 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:10.454842 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 07:52:10.454917 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:10.454904 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs podName:4fbea707-d5c2-4c45-82e5-089d272aa922 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:42.45488822 +0000 UTC m=+66.242666805 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs") pod "network-metrics-daemon-jt4rj" (UID: "4fbea707-d5c2-4c45-82e5-089d272aa922") : secret "metrics-daemon-secret" not found
Apr 17 07:52:10.656309 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:10.656276 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhcnb\" (UniqueName: \"kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb\") pod \"network-check-target-vbl24\" (UID: \"648c562e-66a8-4487-995e-2e06a13a92a5\") " pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:52:10.670376 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:10.670344 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhcnb\" (UniqueName: \"kubernetes.io/projected/648c562e-66a8-4487-995e-2e06a13a92a5-kube-api-access-vhcnb\") pod \"network-check-target-vbl24\" (UID: \"648c562e-66a8-4487-995e-2e06a13a92a5\") " pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:52:10.722195 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:10.722171 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:52:10.874226 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:10.874195 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vbl24"]
Apr 17 07:52:10.920519 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:52:10.920483 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod648c562e_66a8_4487_995e_2e06a13a92a5.slice/crio-9eedc34c5b9c3f765cbf3a2213726b73662dd85d30077755023071671f31e29d WatchSource:0}: Error finding container 9eedc34c5b9c3f765cbf3a2213726b73662dd85d30077755023071671f31e29d: Status 404 returned error can't find the container with id 9eedc34c5b9c3f765cbf3a2213726b73662dd85d30077755023071671f31e29d
Apr 17 07:52:10.959233 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:10.959202 2566 generic.go:358] "Generic (PLEG): container finished" podID="595aad98-ad8c-469e-ae13-798099e8e67b" containerID="9572f9209c3fc3de04991d75666e7a3a4fe551f155d07cb7b79de5dbdf2037c6" exitCode=0
Apr 17 07:52:10.959404 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:10.959272 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b24kh" event={"ID":"595aad98-ad8c-469e-ae13-798099e8e67b","Type":"ContainerDied","Data":"9572f9209c3fc3de04991d75666e7a3a4fe551f155d07cb7b79de5dbdf2037c6"}
Apr 17 07:52:10.960295 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:10.960273 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vbl24" event={"ID":"648c562e-66a8-4487-995e-2e06a13a92a5","Type":"ContainerStarted","Data":"9eedc34c5b9c3f765cbf3a2213726b73662dd85d30077755023071671f31e29d"}
Apr 17 07:52:11.261219 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:11.261179 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq"
Apr 17 07:52:11.261219 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:11.261221 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls\") pod \"dns-default-rltvj\" (UID: \"b863c142-c069-46ab-9031-2f50beeb3f53\") " pod="openshift-dns/dns-default-rltvj"
Apr 17 07:52:11.261409 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:11.261248 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert\") pod \"ingress-canary-qvwww\" (UID: \"2354a35f-6d75-4e6e-a614-0e68c4002cb7\") " pod="openshift-ingress-canary/ingress-canary-qvwww"
Apr 17 07:52:11.261409 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:11.261336 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:11.261409 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:11.261343 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:11.261409 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:11.261382 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert podName:2354a35f-6d75-4e6e-a614-0e68c4002cb7 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:13.261368974 +0000 UTC m=+37.049147555 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert") pod "ingress-canary-qvwww" (UID: "2354a35f-6d75-4e6e-a614-0e68c4002cb7") : secret "canary-serving-cert" not found
Apr 17 07:52:11.261537 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:11.261411 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls podName:b863c142-c069-46ab-9031-2f50beeb3f53 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:13.261393458 +0000 UTC m=+37.049172060 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls") pod "dns-default-rltvj" (UID: "b863c142-c069-46ab-9031-2f50beeb3f53") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:11.261537 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:11.261443 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 07:52:11.261537 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:11.261460 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68954cc549-2czzq: secret "image-registry-tls" not found
Apr 17 07:52:11.261537 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:11.261499 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls podName:78985eab-173e-4c9e-82d1-2bc5d78fa58f nodeName:}" failed. No retries permitted until 2026-04-17 07:52:13.261488221 +0000 UTC m=+37.049266806 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls") pod "image-registry-68954cc549-2czzq" (UID: "78985eab-173e-4c9e-82d1-2bc5d78fa58f") : secret "image-registry-tls" not found
Apr 17 07:52:11.964491 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:11.964459 2566 generic.go:358] "Generic (PLEG): container finished" podID="595aad98-ad8c-469e-ae13-798099e8e67b" containerID="19de2525fa208e2f4ba3d225f3d4b697f581b22c5974af5860b6fa59409c957e" exitCode=0
Apr 17 07:52:11.964864 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:11.964525 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b24kh" event={"ID":"595aad98-ad8c-469e-ae13-798099e8e67b","Type":"ContainerDied","Data":"19de2525fa208e2f4ba3d225f3d4b697f581b22c5974af5860b6fa59409c957e"}
Apr 17 07:52:12.875650 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:12.875420 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/792796b1-b737-4fd3-820e-247120d1de83-original-pull-secret\") pod \"global-pull-secret-syncer-5rcmt\" (UID: \"792796b1-b737-4fd3-820e-247120d1de83\") " pod="kube-system/global-pull-secret-syncer-5rcmt"
Apr 17 07:52:12.880396 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:12.880345 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/792796b1-b737-4fd3-820e-247120d1de83-original-pull-secret\") pod \"global-pull-secret-syncer-5rcmt\" (UID: \"792796b1-b737-4fd3-820e-247120d1de83\") " pod="kube-system/global-pull-secret-syncer-5rcmt"
Apr 17 07:52:12.969256 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:12.969225 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b24kh" event={"ID":"595aad98-ad8c-469e-ae13-798099e8e67b","Type":"ContainerStarted","Data":"564baf8dad4d030cc7a8571210daa82bcb74f22ec318f492e0b0145d4501f25e"}
Apr 17 07:52:12.997309 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:12.997248 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-b24kh" podStartSLOduration=5.763614372 podStartE2EDuration="36.997233432s" podCreationTimestamp="2026-04-17 07:51:36 +0000 UTC" firstStartedPulling="2026-04-17 07:51:39.365769668 +0000 UTC m=+3.153548263" lastFinishedPulling="2026-04-17 07:52:10.599388742 +0000 UTC m=+34.387167323" observedRunningTime="2026-04-17 07:52:12.996903019 +0000 UTC m=+36.784681646" watchObservedRunningTime="2026-04-17 07:52:12.997233432 +0000 UTC m=+36.785012035"
Apr 17 07:52:13.127756 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:13.127658 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5rcmt"
Apr 17 07:52:13.253050 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:13.253019 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5rcmt"]
Apr 17 07:52:13.256204 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:52:13.256172 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod792796b1_b737_4fd3_820e_247120d1de83.slice/crio-4928eb59e1185d735c700958f13bd53041bb275bd335b4fdbd7ff01c4e06eabb WatchSource:0}: Error finding container 4928eb59e1185d735c700958f13bd53041bb275bd335b4fdbd7ff01c4e06eabb: Status 404 returned error can't find the container with id 4928eb59e1185d735c700958f13bd53041bb275bd335b4fdbd7ff01c4e06eabb
Apr 17 07:52:13.279304 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:13.279283 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert\") pod \"ingress-canary-qvwww\" (UID: \"2354a35f-6d75-4e6e-a614-0e68c4002cb7\") " pod="openshift-ingress-canary/ingress-canary-qvwww"
Apr 17 07:52:13.279407 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:13.279336 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq"
Apr 17 07:52:13.279407 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:13.279359 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls\") pod \"dns-default-rltvj\" (UID: \"b863c142-c069-46ab-9031-2f50beeb3f53\") " pod="openshift-dns/dns-default-rltvj"
Apr 17 07:52:13.279478 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:13.279430 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:13.279511 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:13.279483 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 07:52:13.279511 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:13.279496 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert podName:2354a35f-6d75-4e6e-a614-0e68c4002cb7 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:17.279476632 +0000 UTC m=+41.067255215 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert") pod "ingress-canary-qvwww" (UID: "2354a35f-6d75-4e6e-a614-0e68c4002cb7") : secret "canary-serving-cert" not found
Apr 17 07:52:13.279511 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:13.279497 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68954cc549-2czzq: secret "image-registry-tls" not found
Apr 17 07:52:13.279604 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:13.279437 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:13.279604 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:13.279533 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls podName:78985eab-173e-4c9e-82d1-2bc5d78fa58f nodeName:}" failed. No retries permitted until 2026-04-17 07:52:17.279523445 +0000 UTC m=+41.067302031 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls") pod "image-registry-68954cc549-2czzq" (UID: "78985eab-173e-4c9e-82d1-2bc5d78fa58f") : secret "image-registry-tls" not found
Apr 17 07:52:13.279604 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:13.279556 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls podName:b863c142-c069-46ab-9031-2f50beeb3f53 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:17.279542209 +0000 UTC m=+41.067320792 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls") pod "dns-default-rltvj" (UID: "b863c142-c069-46ab-9031-2f50beeb3f53") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:13.975010 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:13.974971 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5rcmt" event={"ID":"792796b1-b737-4fd3-820e-247120d1de83","Type":"ContainerStarted","Data":"4928eb59e1185d735c700958f13bd53041bb275bd335b4fdbd7ff01c4e06eabb"}
Apr 17 07:52:14.978225 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:14.978185 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vbl24" event={"ID":"648c562e-66a8-4487-995e-2e06a13a92a5","Type":"ContainerStarted","Data":"3e340ec16531eabb8a9a02982fbb24a285d39147f8b60a682094d6a707a526b3"}
Apr 17 07:52:14.978847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:14.978338 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:52:14.993098 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:14.993056 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-vbl24" podStartSLOduration=34.38640482 podStartE2EDuration="37.993041389s" podCreationTimestamp="2026-04-17 07:51:37 +0000 UTC" firstStartedPulling="2026-04-17 07:52:10.922459366 +0000 UTC m=+34.710237946" lastFinishedPulling="2026-04-17 07:52:14.529095933 +0000 UTC m=+38.316874515" observedRunningTime="2026-04-17 07:52:14.992151308 +0000 UTC m=+38.779929912" watchObservedRunningTime="2026-04-17 07:52:14.993041389 +0000 UTC m=+38.780820015"
Apr 17 07:52:17.312056 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:17.312016 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq"
Apr 17 07:52:17.312523 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:17.312071 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls\") pod \"dns-default-rltvj\" (UID: \"b863c142-c069-46ab-9031-2f50beeb3f53\") " pod="openshift-dns/dns-default-rltvj"
Apr 17 07:52:17.312523 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:17.312122 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert\") pod \"ingress-canary-qvwww\" (UID: \"2354a35f-6d75-4e6e-a614-0e68c4002cb7\") " pod="openshift-ingress-canary/ingress-canary-qvwww"
Apr 17 07:52:17.312523 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:17.312195 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:17.312523 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:17.312222 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:17.312523 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:17.312200 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 07:52:17.312523 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:17.312268 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls podName:b863c142-c069-46ab-9031-2f50beeb3f53 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:25.312247874 +0000 UTC m=+49.100026458 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls") pod "dns-default-rltvj" (UID: "b863c142-c069-46ab-9031-2f50beeb3f53") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:17.312523 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:17.312302 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert podName:2354a35f-6d75-4e6e-a614-0e68c4002cb7 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:25.312291693 +0000 UTC m=+49.100070274 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert") pod "ingress-canary-qvwww" (UID: "2354a35f-6d75-4e6e-a614-0e68c4002cb7") : secret "canary-serving-cert" not found
Apr 17 07:52:17.312523 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:17.312275 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68954cc549-2czzq: secret "image-registry-tls" not found
Apr 17 07:52:17.312523 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:17.312344 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls podName:78985eab-173e-4c9e-82d1-2bc5d78fa58f nodeName:}" failed. No retries permitted until 2026-04-17 07:52:25.312338229 +0000 UTC m=+49.100116809 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls") pod "image-registry-68954cc549-2czzq" (UID: "78985eab-173e-4c9e-82d1-2bc5d78fa58f") : secret "image-registry-tls" not found
Apr 17 07:52:18.987885 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:18.987794 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5rcmt" event={"ID":"792796b1-b737-4fd3-820e-247120d1de83","Type":"ContainerStarted","Data":"e86fe1fc628a8c5cc0d9524de6d4c8e95a2e45ed8a8e7c2e27a6b9c12fb9d3ff"}
Apr 17 07:52:19.002485 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:19.002437 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-5rcmt" podStartSLOduration=16.678795391 podStartE2EDuration="22.002423298s" podCreationTimestamp="2026-04-17 07:51:57 +0000 UTC" firstStartedPulling="2026-04-17 07:52:13.25800578 +0000 UTC m=+37.045784361" lastFinishedPulling="2026-04-17 07:52:18.581633686 +0000 UTC m=+42.369412268" observedRunningTime="2026-04-17 07:52:19.001726402 +0000 UTC m=+42.789505006" watchObservedRunningTime="2026-04-17 07:52:19.002423298 +0000 UTC m=+42.790201900"
Apr 17 07:52:25.375564 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:25.375520 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq"
Apr 17 07:52:25.375564 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:25.375564 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls\") pod \"dns-default-rltvj\" (UID: \"b863c142-c069-46ab-9031-2f50beeb3f53\") " pod="openshift-dns/dns-default-rltvj"
Apr 17 07:52:25.376155 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:25.375673 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 07:52:25.376155 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:25.375708 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68954cc549-2czzq: secret "image-registry-tls" not found
Apr 17 07:52:25.376155 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:25.375734 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:25.376155 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:25.375762 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls podName:78985eab-173e-4c9e-82d1-2bc5d78fa58f nodeName:}" failed. No retries permitted until 2026-04-17 07:52:41.375745723 +0000 UTC m=+65.163524322 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls") pod "image-registry-68954cc549-2czzq" (UID: "78985eab-173e-4c9e-82d1-2bc5d78fa58f") : secret "image-registry-tls" not found
Apr 17 07:52:25.376155 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:25.375776 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert\") pod \"ingress-canary-qvwww\" (UID: \"2354a35f-6d75-4e6e-a614-0e68c4002cb7\") " pod="openshift-ingress-canary/ingress-canary-qvwww"
Apr 17 07:52:25.376155 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:25.375817 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls podName:b863c142-c069-46ab-9031-2f50beeb3f53 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:41.375806351 +0000 UTC m=+65.163584933 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls") pod "dns-default-rltvj" (UID: "b863c142-c069-46ab-9031-2f50beeb3f53") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:25.376155 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:25.375879 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:25.376155 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:25.375935 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert podName:2354a35f-6d75-4e6e-a614-0e68c4002cb7 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:41.375921609 +0000 UTC m=+65.163700194 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert") pod "ingress-canary-qvwww" (UID: "2354a35f-6d75-4e6e-a614-0e68c4002cb7") : secret "canary-serving-cert" not found
Apr 17 07:52:34.955896 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:34.955863 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6vls"
Apr 17 07:52:41.390468 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:41.390406 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq"
Apr 17 07:52:41.390468 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:41.390471 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls\") pod \"dns-default-rltvj\" (UID: \"b863c142-c069-46ab-9031-2f50beeb3f53\") " pod="openshift-dns/dns-default-rltvj"
Apr 17 07:52:41.391001 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:41.390513 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert\") pod \"ingress-canary-qvwww\" (UID: \"2354a35f-6d75-4e6e-a614-0e68c4002cb7\") " pod="openshift-ingress-canary/ingress-canary-qvwww"
Apr 17 07:52:41.391001 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:41.390551 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 07:52:41.391001 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:41.390573 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68954cc549-2czzq: secret "image-registry-tls" not found
Apr 17 07:52:41.391001 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:41.390617 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:41.391001 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:41.390638 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls podName:78985eab-173e-4c9e-82d1-2bc5d78fa58f nodeName:}" failed. No retries permitted until 2026-04-17 07:53:13.390623139 +0000 UTC m=+97.178401720 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls") pod "image-registry-68954cc549-2czzq" (UID: "78985eab-173e-4c9e-82d1-2bc5d78fa58f") : secret "image-registry-tls" not found
Apr 17 07:52:41.391001 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:41.390675 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls podName:b863c142-c069-46ab-9031-2f50beeb3f53 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:13.390661681 +0000 UTC m=+97.178440262 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls") pod "dns-default-rltvj" (UID: "b863c142-c069-46ab-9031-2f50beeb3f53") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:41.391001 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:41.390619 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:41.391001 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:41.390724 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert podName:2354a35f-6d75-4e6e-a614-0e68c4002cb7 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:13.390714688 +0000 UTC m=+97.178493276 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert") pod "ingress-canary-qvwww" (UID: "2354a35f-6d75-4e6e-a614-0e68c4002cb7") : secret "canary-serving-cert" not found
Apr 17 07:52:42.499109 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:42.499072 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs\") pod \"network-metrics-daemon-jt4rj\" (UID: \"4fbea707-d5c2-4c45-82e5-089d272aa922\") " pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:52:42.499653 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:42.499256 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 07:52:42.499653 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:52:42.499357 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs podName:4fbea707-d5c2-4c45-82e5-089d272aa922 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:46.499335645 +0000 UTC m=+130.287114229 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs") pod "network-metrics-daemon-jt4rj" (UID: "4fbea707-d5c2-4c45-82e5-089d272aa922") : secret "metrics-daemon-secret" not found
Apr 17 07:52:45.982546 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:52:45.982432 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-vbl24"
Apr 17 07:53:13.409585 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:13.409533 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq"
Apr 17 07:53:13.409585 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:13.409591 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls\") pod \"dns-default-rltvj\" (UID: \"b863c142-c069-46ab-9031-2f50beeb3f53\") " pod="openshift-dns/dns-default-rltvj"
Apr 17 07:53:13.410079 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:13.409619 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert\") pod \"ingress-canary-qvwww\" (UID: \"2354a35f-6d75-4e6e-a614-0e68c4002cb7\") " pod="openshift-ingress-canary/ingress-canary-qvwww"
Apr 17 07:53:13.410079 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:13.409725 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:53:13.410079 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:13.409738 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 07:53:13.410079 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:13.409751 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:53:13.410079 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:13.409764 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68954cc549-2czzq: secret "image-registry-tls" not found Apr 17 07:53:13.410079 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:13.409781 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert podName:2354a35f-6d75-4e6e-a614-0e68c4002cb7 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:17.409766048 +0000 UTC m=+161.197544629 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert") pod "ingress-canary-qvwww" (UID: "2354a35f-6d75-4e6e-a614-0e68c4002cb7") : secret "canary-serving-cert" not found Apr 17 07:53:13.410079 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:13.409810 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls podName:b863c142-c069-46ab-9031-2f50beeb3f53 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:17.409796829 +0000 UTC m=+161.197575410 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls") pod "dns-default-rltvj" (UID: "b863c142-c069-46ab-9031-2f50beeb3f53") : secret "dns-default-metrics-tls" not found Apr 17 07:53:13.410079 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:13.409824 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls podName:78985eab-173e-4c9e-82d1-2bc5d78fa58f nodeName:}" failed. No retries permitted until 2026-04-17 07:54:17.409818305 +0000 UTC m=+161.197596886 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls") pod "image-registry-68954cc549-2czzq" (UID: "78985eab-173e-4c9e-82d1-2bc5d78fa58f") : secret "image-registry-tls" not found Apr 17 07:53:46.561209 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:46.561137 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs\") pod \"network-metrics-daemon-jt4rj\" (UID: \"4fbea707-d5c2-4c45-82e5-089d272aa922\") " pod="openshift-multus/network-metrics-daemon-jt4rj" Apr 17 07:53:46.561862 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:46.561348 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 07:53:46.563721 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:46.561968 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs podName:4fbea707-d5c2-4c45-82e5-089d272aa922 nodeName:}" failed. No retries permitted until 2026-04-17 07:55:48.561435965 +0000 UTC m=+252.349214560 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs") pod "network-metrics-daemon-jt4rj" (UID: "4fbea707-d5c2-4c45-82e5-089d272aa922") : secret "metrics-daemon-secret" not found Apr 17 07:53:58.105871 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.105834 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jsm28"] Apr 17 07:53:58.108486 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.108469 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jsm28" Apr 17 07:53:58.111665 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.111644 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 07:53:58.111976 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.111959 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 07:53:58.112555 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.112542 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-f4pm5\"" Apr 17 07:53:58.118732 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.118709 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jsm28"] Apr 17 07:53:58.218338 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.218308 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2f5c2"] Apr 17 07:53:58.221100 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.221077 2566 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxszq"] Apr 17 07:53:58.221240 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.221224 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2f5c2" Apr 17 07:53:58.224538 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.224509 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 07:53:58.225752 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.225728 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 07:53:58.225859 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.225781 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 07:53:58.225859 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.225795 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-9dg2p\"" Apr 17 07:53:58.226225 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.226210 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-kqzwd"] Apr 17 07:53:58.226376 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.226358 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxszq" Apr 17 07:53:58.228789 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.228774 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2blbh"] Apr 17 07:53:58.228963 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.228942 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-kqzwd" Apr 17 07:53:58.229082 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.229059 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 17 07:53:58.229258 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.229237 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-ls78l\"" Apr 17 07:53:58.229407 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.229318 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 17 07:53:58.229566 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.229437 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 17 07:53:58.229652 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.229572 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 17 07:53:58.231986 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.231967 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" Apr 17 07:53:58.232235 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.232213 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 07:53:58.232327 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.232285 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 17 07:53:58.232647 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.232631 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 07:53:58.232939 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.232925 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 17 07:53:58.233095 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.233081 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-wkh6f\"" Apr 17 07:53:58.237858 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.237836 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn5jk\" (UniqueName: \"kubernetes.io/projected/2d59fab7-6249-4bfb-844f-8cdca44acef2-kube-api-access-wn5jk\") pod \"volume-data-source-validator-7c6cbb6c87-jsm28\" (UID: \"2d59fab7-6249-4bfb-844f-8cdca44acef2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jsm28" Apr 17 07:53:58.238621 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.238569 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 17 07:53:58.238621 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.237932 2566 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 17 07:53:58.238898 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.238108 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 17 07:53:58.238952 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.238523 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-d76p2\"" Apr 17 07:53:58.239067 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.238030 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 17 07:53:58.240416 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.240396 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2f5c2"] Apr 17 07:53:58.241303 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.241284 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 17 07:53:58.241543 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.241529 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2blbh"] Apr 17 07:53:58.242210 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.242191 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-kqzwd"] Apr 17 07:53:58.243595 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.243573 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 07:53:58.249922 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.249904 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxszq"] Apr 17 07:53:58.326482 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.326446 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-78488888b4-7t88f"] Apr 17 07:53:58.329570 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.329549 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-78488888b4-7t88f" Apr 17 07:53:58.331983 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.331963 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 07:53:58.332215 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.332195 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 07:53:58.332215 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.332211 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 07:53:58.332348 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.332211 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-kv5rf\"" Apr 17 07:53:58.332525 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.332506 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 07:53:58.332575 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.332538 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 07:53:58.332575 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.332512 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 
07:53:58.339500 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.339477 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wn5jk\" (UniqueName: \"kubernetes.io/projected/2d59fab7-6249-4bfb-844f-8cdca44acef2-kube-api-access-wn5jk\") pod \"volume-data-source-validator-7c6cbb6c87-jsm28\" (UID: \"2d59fab7-6249-4bfb-844f-8cdca44acef2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jsm28" Apr 17 07:53:58.339615 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.339524 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86c75543-1a40-464a-bce1-5fe690add66f-serving-cert\") pod \"insights-operator-585dfdc468-kqzwd\" (UID: \"86c75543-1a40-464a-bce1-5fe690add66f\") " pod="openshift-insights/insights-operator-585dfdc468-kqzwd" Apr 17 07:53:58.339615 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.339555 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69ef651a-e48e-4389-a551-48cc6423bcd0-config\") pod \"console-operator-9d4b6777b-2blbh\" (UID: \"69ef651a-e48e-4389-a551-48cc6423bcd0\") " pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" Apr 17 07:53:58.339615 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.339596 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5648bed-82b5-4b80-8d08-2e781e7705fc-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-dxszq\" (UID: \"f5648bed-82b5-4b80-8d08-2e781e7705fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxszq" Apr 17 07:53:58.339785 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.339648 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86c75543-1a40-464a-bce1-5fe690add66f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-kqzwd\" (UID: \"86c75543-1a40-464a-bce1-5fe690add66f\") " pod="openshift-insights/insights-operator-585dfdc468-kqzwd" Apr 17 07:53:58.339785 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.339675 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5648bed-82b5-4b80-8d08-2e781e7705fc-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-dxszq\" (UID: \"f5648bed-82b5-4b80-8d08-2e781e7705fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxszq" Apr 17 07:53:58.339785 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.339733 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p26ph\" (UniqueName: \"kubernetes.io/projected/014f2810-c514-4886-84b9-38eb14825ce3-kube-api-access-p26ph\") pod \"cluster-samples-operator-6dc5bdb6b4-2f5c2\" (UID: \"014f2810-c514-4886-84b9-38eb14825ce3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2f5c2" Apr 17 07:53:58.339785 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.339782 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/86c75543-1a40-464a-bce1-5fe690add66f-tmp\") pod \"insights-operator-585dfdc468-kqzwd\" (UID: \"86c75543-1a40-464a-bce1-5fe690add66f\") " pod="openshift-insights/insights-operator-585dfdc468-kqzwd" Apr 17 07:53:58.339973 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.339809 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqwfz\" (UniqueName: 
\"kubernetes.io/projected/69ef651a-e48e-4389-a551-48cc6423bcd0-kube-api-access-vqwfz\") pod \"console-operator-9d4b6777b-2blbh\" (UID: \"69ef651a-e48e-4389-a551-48cc6423bcd0\") " pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" Apr 17 07:53:58.339973 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.339841 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79lpr\" (UniqueName: \"kubernetes.io/projected/f5648bed-82b5-4b80-8d08-2e781e7705fc-kube-api-access-79lpr\") pod \"kube-storage-version-migrator-operator-6769c5d45-dxszq\" (UID: \"f5648bed-82b5-4b80-8d08-2e781e7705fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxszq" Apr 17 07:53:58.339973 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.339882 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86c75543-1a40-464a-bce1-5fe690add66f-service-ca-bundle\") pod \"insights-operator-585dfdc468-kqzwd\" (UID: \"86c75543-1a40-464a-bce1-5fe690add66f\") " pod="openshift-insights/insights-operator-585dfdc468-kqzwd" Apr 17 07:53:58.339973 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.339907 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mjjx\" (UniqueName: \"kubernetes.io/projected/86c75543-1a40-464a-bce1-5fe690add66f-kube-api-access-5mjjx\") pod \"insights-operator-585dfdc468-kqzwd\" (UID: \"86c75543-1a40-464a-bce1-5fe690add66f\") " pod="openshift-insights/insights-operator-585dfdc468-kqzwd" Apr 17 07:53:58.339973 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.339936 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/014f2810-c514-4886-84b9-38eb14825ce3-samples-operator-tls\") 
pod \"cluster-samples-operator-6dc5bdb6b4-2f5c2\" (UID: \"014f2810-c514-4886-84b9-38eb14825ce3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2f5c2" Apr 17 07:53:58.340269 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.339974 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69ef651a-e48e-4389-a551-48cc6423bcd0-serving-cert\") pod \"console-operator-9d4b6777b-2blbh\" (UID: \"69ef651a-e48e-4389-a551-48cc6423bcd0\") " pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" Apr 17 07:53:58.340269 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.339996 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69ef651a-e48e-4389-a551-48cc6423bcd0-trusted-ca\") pod \"console-operator-9d4b6777b-2blbh\" (UID: \"69ef651a-e48e-4389-a551-48cc6423bcd0\") " pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" Apr 17 07:53:58.340269 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.340021 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/86c75543-1a40-464a-bce1-5fe690add66f-snapshots\") pod \"insights-operator-585dfdc468-kqzwd\" (UID: \"86c75543-1a40-464a-bce1-5fe690add66f\") " pod="openshift-insights/insights-operator-585dfdc468-kqzwd" Apr 17 07:53:58.340587 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.340571 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-78488888b4-7t88f"] Apr 17 07:53:58.350100 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.350078 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn5jk\" (UniqueName: \"kubernetes.io/projected/2d59fab7-6249-4bfb-844f-8cdca44acef2-kube-api-access-wn5jk\") pod 
\"volume-data-source-validator-7c6cbb6c87-jsm28\" (UID: \"2d59fab7-6249-4bfb-844f-8cdca44acef2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jsm28" Apr 17 07:53:58.416763 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.416653 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jsm28" Apr 17 07:53:58.440690 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.440656 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86c75543-1a40-464a-bce1-5fe690add66f-service-ca-bundle\") pod \"insights-operator-585dfdc468-kqzwd\" (UID: \"86c75543-1a40-464a-bce1-5fe690add66f\") " pod="openshift-insights/insights-operator-585dfdc468-kqzwd" Apr 17 07:53:58.440690 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.440714 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mjjx\" (UniqueName: \"kubernetes.io/projected/86c75543-1a40-464a-bce1-5fe690add66f-kube-api-access-5mjjx\") pod \"insights-operator-585dfdc468-kqzwd\" (UID: \"86c75543-1a40-464a-bce1-5fe690add66f\") " pod="openshift-insights/insights-operator-585dfdc468-kqzwd" Apr 17 07:53:58.440933 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.440746 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/014f2810-c514-4886-84b9-38eb14825ce3-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2f5c2\" (UID: \"014f2810-c514-4886-84b9-38eb14825ce3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2f5c2" Apr 17 07:53:58.440933 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.440777 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncnr4\" (UniqueName: 
\"kubernetes.io/projected/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-kube-api-access-ncnr4\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f" Apr 17 07:53:58.440933 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.440803 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-default-certificate\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f" Apr 17 07:53:58.440933 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.440826 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-stats-auth\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f" Apr 17 07:53:58.440933 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.440912 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69ef651a-e48e-4389-a551-48cc6423bcd0-serving-cert\") pod \"console-operator-9d4b6777b-2blbh\" (UID: \"69ef651a-e48e-4389-a551-48cc6423bcd0\") " pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" Apr 17 07:53:58.441150 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.440950 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69ef651a-e48e-4389-a551-48cc6423bcd0-trusted-ca\") pod \"console-operator-9d4b6777b-2blbh\" (UID: \"69ef651a-e48e-4389-a551-48cc6423bcd0\") " pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" Apr 17 07:53:58.441150 
ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:58.440917 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 07:53:58.441150 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.440982 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/86c75543-1a40-464a-bce1-5fe690add66f-snapshots\") pod \"insights-operator-585dfdc468-kqzwd\" (UID: \"86c75543-1a40-464a-bce1-5fe690add66f\") " pod="openshift-insights/insights-operator-585dfdc468-kqzwd" Apr 17 07:53:58.441150 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:58.441039 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/014f2810-c514-4886-84b9-38eb14825ce3-samples-operator-tls podName:014f2810-c514-4886-84b9-38eb14825ce3 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:58.941015263 +0000 UTC m=+142.728793860 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/014f2810-c514-4886-84b9-38eb14825ce3-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2f5c2" (UID: "014f2810-c514-4886-84b9-38eb14825ce3") : secret "samples-operator-tls" not found Apr 17 07:53:58.441340 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.441214 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-service-ca-bundle\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f" Apr 17 07:53:58.441340 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.441281 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86c75543-1a40-464a-bce1-5fe690add66f-serving-cert\") pod \"insights-operator-585dfdc468-kqzwd\" (UID: \"86c75543-1a40-464a-bce1-5fe690add66f\") " pod="openshift-insights/insights-operator-585dfdc468-kqzwd" Apr 17 07:53:58.441340 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.441289 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86c75543-1a40-464a-bce1-5fe690add66f-service-ca-bundle\") pod \"insights-operator-585dfdc468-kqzwd\" (UID: \"86c75543-1a40-464a-bce1-5fe690add66f\") " pod="openshift-insights/insights-operator-585dfdc468-kqzwd" Apr 17 07:53:58.441474 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.441352 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69ef651a-e48e-4389-a551-48cc6423bcd0-config\") pod \"console-operator-9d4b6777b-2blbh\" (UID: \"69ef651a-e48e-4389-a551-48cc6423bcd0\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" Apr 17 07:53:58.441474 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.441403 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5648bed-82b5-4b80-8d08-2e781e7705fc-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-dxszq\" (UID: \"f5648bed-82b5-4b80-8d08-2e781e7705fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxszq" Apr 17 07:53:58.441474 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.441442 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86c75543-1a40-464a-bce1-5fe690add66f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-kqzwd\" (UID: \"86c75543-1a40-464a-bce1-5fe690add66f\") " pod="openshift-insights/insights-operator-585dfdc468-kqzwd" Apr 17 07:53:58.441474 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.441464 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5648bed-82b5-4b80-8d08-2e781e7705fc-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-dxszq\" (UID: \"f5648bed-82b5-4b80-8d08-2e781e7705fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxszq" Apr 17 07:53:58.441670 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.441501 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p26ph\" (UniqueName: \"kubernetes.io/projected/014f2810-c514-4886-84b9-38eb14825ce3-kube-api-access-p26ph\") pod \"cluster-samples-operator-6dc5bdb6b4-2f5c2\" (UID: \"014f2810-c514-4886-84b9-38eb14825ce3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2f5c2" Apr 17 07:53:58.441670 ip-10-0-128-217 
kubenswrapper[2566]: I0417 07:53:58.441541 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-metrics-certs\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f" Apr 17 07:53:58.441670 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.441571 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/86c75543-1a40-464a-bce1-5fe690add66f-tmp\") pod \"insights-operator-585dfdc468-kqzwd\" (UID: \"86c75543-1a40-464a-bce1-5fe690add66f\") " pod="openshift-insights/insights-operator-585dfdc468-kqzwd" Apr 17 07:53:58.441670 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.441610 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/86c75543-1a40-464a-bce1-5fe690add66f-snapshots\") pod \"insights-operator-585dfdc468-kqzwd\" (UID: \"86c75543-1a40-464a-bce1-5fe690add66f\") " pod="openshift-insights/insights-operator-585dfdc468-kqzwd" Apr 17 07:53:58.441670 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.441614 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqwfz\" (UniqueName: \"kubernetes.io/projected/69ef651a-e48e-4389-a551-48cc6423bcd0-kube-api-access-vqwfz\") pod \"console-operator-9d4b6777b-2blbh\" (UID: \"69ef651a-e48e-4389-a551-48cc6423bcd0\") " pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" Apr 17 07:53:58.441670 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.441658 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79lpr\" (UniqueName: \"kubernetes.io/projected/f5648bed-82b5-4b80-8d08-2e781e7705fc-kube-api-access-79lpr\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-dxszq\" (UID: \"f5648bed-82b5-4b80-8d08-2e781e7705fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxszq" Apr 17 07:53:58.441997 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.441984 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69ef651a-e48e-4389-a551-48cc6423bcd0-trusted-ca\") pod \"console-operator-9d4b6777b-2blbh\" (UID: \"69ef651a-e48e-4389-a551-48cc6423bcd0\") " pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" Apr 17 07:53:58.442050 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.442018 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69ef651a-e48e-4389-a551-48cc6423bcd0-config\") pod \"console-operator-9d4b6777b-2blbh\" (UID: \"69ef651a-e48e-4389-a551-48cc6423bcd0\") " pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" Apr 17 07:53:58.442288 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.442265 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/86c75543-1a40-464a-bce1-5fe690add66f-tmp\") pod \"insights-operator-585dfdc468-kqzwd\" (UID: \"86c75543-1a40-464a-bce1-5fe690add66f\") " pod="openshift-insights/insights-operator-585dfdc468-kqzwd" Apr 17 07:53:58.442504 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.442479 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5648bed-82b5-4b80-8d08-2e781e7705fc-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-dxszq\" (UID: \"f5648bed-82b5-4b80-8d08-2e781e7705fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxszq" Apr 17 07:53:58.443221 ip-10-0-128-217 kubenswrapper[2566]: I0417 
07:53:58.443196 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86c75543-1a40-464a-bce1-5fe690add66f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-kqzwd\" (UID: \"86c75543-1a40-464a-bce1-5fe690add66f\") " pod="openshift-insights/insights-operator-585dfdc468-kqzwd" Apr 17 07:53:58.444016 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.443979 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86c75543-1a40-464a-bce1-5fe690add66f-serving-cert\") pod \"insights-operator-585dfdc468-kqzwd\" (UID: \"86c75543-1a40-464a-bce1-5fe690add66f\") " pod="openshift-insights/insights-operator-585dfdc468-kqzwd" Apr 17 07:53:58.444226 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.444205 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5648bed-82b5-4b80-8d08-2e781e7705fc-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-dxszq\" (UID: \"f5648bed-82b5-4b80-8d08-2e781e7705fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxszq" Apr 17 07:53:58.444332 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.444316 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69ef651a-e48e-4389-a551-48cc6423bcd0-serving-cert\") pod \"console-operator-9d4b6777b-2blbh\" (UID: \"69ef651a-e48e-4389-a551-48cc6423bcd0\") " pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" Apr 17 07:53:58.456744 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.452777 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mjjx\" (UniqueName: \"kubernetes.io/projected/86c75543-1a40-464a-bce1-5fe690add66f-kube-api-access-5mjjx\") pod 
\"insights-operator-585dfdc468-kqzwd\" (UID: \"86c75543-1a40-464a-bce1-5fe690add66f\") " pod="openshift-insights/insights-operator-585dfdc468-kqzwd" Apr 17 07:53:58.456744 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.454413 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79lpr\" (UniqueName: \"kubernetes.io/projected/f5648bed-82b5-4b80-8d08-2e781e7705fc-kube-api-access-79lpr\") pod \"kube-storage-version-migrator-operator-6769c5d45-dxszq\" (UID: \"f5648bed-82b5-4b80-8d08-2e781e7705fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxszq" Apr 17 07:53:58.457555 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.457470 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqwfz\" (UniqueName: \"kubernetes.io/projected/69ef651a-e48e-4389-a551-48cc6423bcd0-kube-api-access-vqwfz\") pod \"console-operator-9d4b6777b-2blbh\" (UID: \"69ef651a-e48e-4389-a551-48cc6423bcd0\") " pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" Apr 17 07:53:58.458612 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.458589 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p26ph\" (UniqueName: \"kubernetes.io/projected/014f2810-c514-4886-84b9-38eb14825ce3-kube-api-access-p26ph\") pod \"cluster-samples-operator-6dc5bdb6b4-2f5c2\" (UID: \"014f2810-c514-4886-84b9-38eb14825ce3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2f5c2" Apr 17 07:53:58.536249 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.536214 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jsm28"] Apr 17 07:53:58.539224 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:53:58.539199 2566 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d59fab7_6249_4bfb_844f_8cdca44acef2.slice/crio-92374db667b64b4f0b6ae752666d7d701ca9411acf474fc0c55b71e9f363b13c WatchSource:0}: Error finding container 92374db667b64b4f0b6ae752666d7d701ca9411acf474fc0c55b71e9f363b13c: Status 404 returned error can't find the container with id 92374db667b64b4f0b6ae752666d7d701ca9411acf474fc0c55b71e9f363b13c Apr 17 07:53:58.542524 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.542500 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-metrics-certs\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f" Apr 17 07:53:58.542647 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.542554 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncnr4\" (UniqueName: \"kubernetes.io/projected/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-kube-api-access-ncnr4\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f" Apr 17 07:53:58.542647 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.542582 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-default-certificate\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f" Apr 17 07:53:58.542647 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.542605 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-stats-auth\") pod \"router-default-78488888b4-7t88f\" 
(UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f" Apr 17 07:53:58.542647 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.542623 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxszq" Apr 17 07:53:58.542647 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.542636 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-service-ca-bundle\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f" Apr 17 07:53:58.542647 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:58.542639 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 07:53:58.542939 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:58.542720 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-metrics-certs podName:beafd2af-5cb4-49eb-994d-2cdebba6b9ea nodeName:}" failed. No retries permitted until 2026-04-17 07:53:59.042679021 +0000 UTC m=+142.830457621 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-metrics-certs") pod "router-default-78488888b4-7t88f" (UID: "beafd2af-5cb4-49eb-994d-2cdebba6b9ea") : secret "router-metrics-certs-default" not found Apr 17 07:53:58.542939 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:58.542799 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-service-ca-bundle podName:beafd2af-5cb4-49eb-994d-2cdebba6b9ea nodeName:}" failed. 
No retries permitted until 2026-04-17 07:53:59.042783212 +0000 UTC m=+142.830561795 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-service-ca-bundle") pod "router-default-78488888b4-7t88f" (UID: "beafd2af-5cb4-49eb-994d-2cdebba6b9ea") : configmap references non-existent config key: service-ca.crt Apr 17 07:53:58.545101 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.545078 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-default-certificate\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f" Apr 17 07:53:58.545487 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.545459 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-stats-auth\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f" Apr 17 07:53:58.549002 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.548980 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-kqzwd" Apr 17 07:53:58.549925 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.549909 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncnr4\" (UniqueName: \"kubernetes.io/projected/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-kube-api-access-ncnr4\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f" Apr 17 07:53:58.555674 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.555524 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" Apr 17 07:53:58.672091 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.672063 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxszq"] Apr 17 07:53:58.675288 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:53:58.675259 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5648bed_82b5_4b80_8d08_2e781e7705fc.slice/crio-110900ec0b854b7f3a5d76658ce6a5345585fa76e27f8ac1acce022e827ca491 WatchSource:0}: Error finding container 110900ec0b854b7f3a5d76658ce6a5345585fa76e27f8ac1acce022e827ca491: Status 404 returned error can't find the container with id 110900ec0b854b7f3a5d76658ce6a5345585fa76e27f8ac1acce022e827ca491 Apr 17 07:53:58.893344 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.893313 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2blbh"] Apr 17 07:53:58.896777 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.896666 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-kqzwd"] Apr 17 07:53:58.896994 ip-10-0-128-217 
kubenswrapper[2566]: W0417 07:53:58.896968 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69ef651a_e48e_4389_a551_48cc6423bcd0.slice/crio-b57b28b6e5ecdaa0ff6ab866dc09190c4a1d9137775df53e4c04d311bd0369ea WatchSource:0}: Error finding container b57b28b6e5ecdaa0ff6ab866dc09190c4a1d9137775df53e4c04d311bd0369ea: Status 404 returned error can't find the container with id b57b28b6e5ecdaa0ff6ab866dc09190c4a1d9137775df53e4c04d311bd0369ea Apr 17 07:53:58.899009 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:53:58.898981 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86c75543_1a40_464a_bce1_5fe690add66f.slice/crio-9fb4fb3b52769beb663eed74e0902ce856c7dd03ea684c404855fec403aa75ff WatchSource:0}: Error finding container 9fb4fb3b52769beb663eed74e0902ce856c7dd03ea684c404855fec403aa75ff: Status 404 returned error can't find the container with id 9fb4fb3b52769beb663eed74e0902ce856c7dd03ea684c404855fec403aa75ff Apr 17 07:53:58.945813 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:58.945741 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/014f2810-c514-4886-84b9-38eb14825ce3-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2f5c2\" (UID: \"014f2810-c514-4886-84b9-38eb14825ce3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2f5c2" Apr 17 07:53:58.945915 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:58.945891 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 07:53:58.945963 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:58.945952 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/014f2810-c514-4886-84b9-38eb14825ce3-samples-operator-tls 
podName:014f2810-c514-4886-84b9-38eb14825ce3 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:59.945933213 +0000 UTC m=+143.733711800 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/014f2810-c514-4886-84b9-38eb14825ce3-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2f5c2" (UID: "014f2810-c514-4886-84b9-38eb14825ce3") : secret "samples-operator-tls" not found Apr 17 07:53:59.046751 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:59.046720 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-metrics-certs\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f" Apr 17 07:53:59.046898 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:59.046868 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-service-ca-bundle\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f" Apr 17 07:53:59.046898 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:59.046872 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 07:53:59.046965 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:59.046932 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-metrics-certs podName:beafd2af-5cb4-49eb-994d-2cdebba6b9ea nodeName:}" failed. No retries permitted until 2026-04-17 07:54:00.046916751 +0000 UTC m=+143.834695332 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-metrics-certs") pod "router-default-78488888b4-7t88f" (UID: "beafd2af-5cb4-49eb-994d-2cdebba6b9ea") : secret "router-metrics-certs-default" not found Apr 17 07:53:59.047003 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:59.046991 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-service-ca-bundle podName:beafd2af-5cb4-49eb-994d-2cdebba6b9ea nodeName:}" failed. No retries permitted until 2026-04-17 07:54:00.046979196 +0000 UTC m=+143.834757778 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-service-ca-bundle") pod "router-default-78488888b4-7t88f" (UID: "beafd2af-5cb4-49eb-994d-2cdebba6b9ea") : configmap references non-existent config key: service-ca.crt Apr 17 07:53:59.179076 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:59.179032 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jsm28" event={"ID":"2d59fab7-6249-4bfb-844f-8cdca44acef2","Type":"ContainerStarted","Data":"92374db667b64b4f0b6ae752666d7d701ca9411acf474fc0c55b71e9f363b13c"} Apr 17 07:53:59.180126 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:59.180095 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxszq" event={"ID":"f5648bed-82b5-4b80-8d08-2e781e7705fc","Type":"ContainerStarted","Data":"110900ec0b854b7f3a5d76658ce6a5345585fa76e27f8ac1acce022e827ca491"} Apr 17 07:53:59.181521 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:59.181472 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" 
event={"ID":"69ef651a-e48e-4389-a551-48cc6423bcd0","Type":"ContainerStarted","Data":"b57b28b6e5ecdaa0ff6ab866dc09190c4a1d9137775df53e4c04d311bd0369ea"} Apr 17 07:53:59.182431 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:59.182398 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-kqzwd" event={"ID":"86c75543-1a40-464a-bce1-5fe690add66f","Type":"ContainerStarted","Data":"9fb4fb3b52769beb663eed74e0902ce856c7dd03ea684c404855fec403aa75ff"} Apr 17 07:53:59.957658 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:53:59.956835 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/014f2810-c514-4886-84b9-38eb14825ce3-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2f5c2\" (UID: \"014f2810-c514-4886-84b9-38eb14825ce3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2f5c2" Apr 17 07:53:59.957658 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:59.957068 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 07:53:59.957658 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:53:59.957151 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/014f2810-c514-4886-84b9-38eb14825ce3-samples-operator-tls podName:014f2810-c514-4886-84b9-38eb14825ce3 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:01.957128489 +0000 UTC m=+145.744907073 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/014f2810-c514-4886-84b9-38eb14825ce3-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2f5c2" (UID: "014f2810-c514-4886-84b9-38eb14825ce3") : secret "samples-operator-tls" not found Apr 17 07:54:00.057627 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:00.057599 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-service-ca-bundle\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f" Apr 17 07:54:00.057760 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:00.057718 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-metrics-certs\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f" Apr 17 07:54:00.057869 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:00.057841 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-service-ca-bundle podName:beafd2af-5cb4-49eb-994d-2cdebba6b9ea nodeName:}" failed. No retries permitted until 2026-04-17 07:54:02.057818947 +0000 UTC m=+145.845597544 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-service-ca-bundle") pod "router-default-78488888b4-7t88f" (UID: "beafd2af-5cb4-49eb-994d-2cdebba6b9ea") : configmap references non-existent config key: service-ca.crt Apr 17 07:54:00.057950 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:00.057880 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 07:54:00.057950 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:00.057940 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-metrics-certs podName:beafd2af-5cb4-49eb-994d-2cdebba6b9ea nodeName:}" failed. No retries permitted until 2026-04-17 07:54:02.057921647 +0000 UTC m=+145.845700228 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-metrics-certs") pod "router-default-78488888b4-7t88f" (UID: "beafd2af-5cb4-49eb-994d-2cdebba6b9ea") : secret "router-metrics-certs-default" not found Apr 17 07:54:00.185956 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:00.185912 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jsm28" event={"ID":"2d59fab7-6249-4bfb-844f-8cdca44acef2","Type":"ContainerStarted","Data":"476c23918e438ceca48a594ef1d6c048242c99698f7c5fcb7a4dfc464e533147"} Apr 17 07:54:00.203120 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:00.202762 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jsm28" podStartSLOduration=0.692199971 podStartE2EDuration="2.202748225s" podCreationTimestamp="2026-04-17 07:53:58 +0000 UTC" firstStartedPulling="2026-04-17 07:53:58.540959442 +0000 UTC 
m=+142.328738026" lastFinishedPulling="2026-04-17 07:54:00.051507695 +0000 UTC m=+143.839286280" observedRunningTime="2026-04-17 07:54:00.202426963 +0000 UTC m=+143.990205593" watchObservedRunningTime="2026-04-17 07:54:00.202748225 +0000 UTC m=+143.990526827"
Apr 17 07:54:01.976305 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:01.976198 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/014f2810-c514-4886-84b9-38eb14825ce3-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2f5c2\" (UID: \"014f2810-c514-4886-84b9-38eb14825ce3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2f5c2"
Apr 17 07:54:01.976778 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:01.976306 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 07:54:01.976778 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:01.976375 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/014f2810-c514-4886-84b9-38eb14825ce3-samples-operator-tls podName:014f2810-c514-4886-84b9-38eb14825ce3 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:05.976358152 +0000 UTC m=+149.764136732 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/014f2810-c514-4886-84b9-38eb14825ce3-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2f5c2" (UID: "014f2810-c514-4886-84b9-38eb14825ce3") : secret "samples-operator-tls" not found
Apr 17 07:54:02.077566 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:02.077529 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-service-ca-bundle\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f"
Apr 17 07:54:02.077760 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:02.077616 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-metrics-certs\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f"
Apr 17 07:54:02.077760 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:02.077738 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-service-ca-bundle podName:beafd2af-5cb4-49eb-994d-2cdebba6b9ea nodeName:}" failed. No retries permitted until 2026-04-17 07:54:06.077713932 +0000 UTC m=+149.865492513 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-service-ca-bundle") pod "router-default-78488888b4-7t88f" (UID: "beafd2af-5cb4-49eb-994d-2cdebba6b9ea") : configmap references non-existent config key: service-ca.crt
Apr 17 07:54:02.077898 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:02.077768 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 07:54:02.077898 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:02.077824 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-metrics-certs podName:beafd2af-5cb4-49eb-994d-2cdebba6b9ea nodeName:}" failed. No retries permitted until 2026-04-17 07:54:06.077810888 +0000 UTC m=+149.865589470 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-metrics-certs") pod "router-default-78488888b4-7t88f" (UID: "beafd2af-5cb4-49eb-994d-2cdebba6b9ea") : secret "router-metrics-certs-default" not found
Apr 17 07:54:02.196940 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:02.196896 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxszq" event={"ID":"f5648bed-82b5-4b80-8d08-2e781e7705fc","Type":"ContainerStarted","Data":"19cd6cacb21627411ce470cce820a1011a28e2c916fe55e239bdf878b282e0fe"}
Apr 17 07:54:02.198547 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:02.198522 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2blbh_69ef651a-e48e-4389-a551-48cc6423bcd0/console-operator/0.log"
Apr 17 07:54:02.198718 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:02.198558 2566 generic.go:358] "Generic (PLEG): container finished" podID="69ef651a-e48e-4389-a551-48cc6423bcd0" containerID="4826fc546b8e79a17b30a4a601b0a26193d3cd77b248e80309507fc3b47f15b4" exitCode=255
Apr 17 07:54:02.198718 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:02.198594 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" event={"ID":"69ef651a-e48e-4389-a551-48cc6423bcd0","Type":"ContainerDied","Data":"4826fc546b8e79a17b30a4a601b0a26193d3cd77b248e80309507fc3b47f15b4"}
Apr 17 07:54:02.198918 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:02.198896 2566 scope.go:117] "RemoveContainer" containerID="4826fc546b8e79a17b30a4a601b0a26193d3cd77b248e80309507fc3b47f15b4"
Apr 17 07:54:02.200103 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:02.200069 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-kqzwd" event={"ID":"86c75543-1a40-464a-bce1-5fe690add66f","Type":"ContainerStarted","Data":"7e353802a38d47cb18c916fabc18934211962bf1e0ce0a94f551d9a0b181c2b8"}
Apr 17 07:54:02.213344 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:02.213290 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxszq" podStartSLOduration=1.141082191 podStartE2EDuration="4.213272019s" podCreationTimestamp="2026-04-17 07:53:58 +0000 UTC" firstStartedPulling="2026-04-17 07:53:58.677002569 +0000 UTC m=+142.464781151" lastFinishedPulling="2026-04-17 07:54:01.749192398 +0000 UTC m=+145.536970979" observedRunningTime="2026-04-17 07:54:02.212632514 +0000 UTC m=+146.000411120" watchObservedRunningTime="2026-04-17 07:54:02.213272019 +0000 UTC m=+146.001050622"
Apr 17 07:54:02.254947 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:02.254893 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-kqzwd" podStartSLOduration=1.404632711 podStartE2EDuration="4.254872723s" podCreationTimestamp="2026-04-17 07:53:58 +0000 UTC" firstStartedPulling="2026-04-17 07:53:58.900868429 +0000 UTC m=+142.688647011" lastFinishedPulling="2026-04-17 07:54:01.751108438 +0000 UTC m=+145.538887023" observedRunningTime="2026-04-17 07:54:02.234371455 +0000 UTC m=+146.022150073" watchObservedRunningTime="2026-04-17 07:54:02.254872723 +0000 UTC m=+146.042651327"
Apr 17 07:54:03.204357 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:03.204328 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2blbh_69ef651a-e48e-4389-a551-48cc6423bcd0/console-operator/1.log"
Apr 17 07:54:03.204763 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:03.204720 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2blbh_69ef651a-e48e-4389-a551-48cc6423bcd0/console-operator/0.log"
Apr 17 07:54:03.204805 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:03.204758 2566 generic.go:358] "Generic (PLEG): container finished" podID="69ef651a-e48e-4389-a551-48cc6423bcd0" containerID="baa53091499c459f5c681905d7775517f4f2b2703b97d9f8c093bead473081e2" exitCode=255
Apr 17 07:54:03.204848 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:03.204791 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" event={"ID":"69ef651a-e48e-4389-a551-48cc6423bcd0","Type":"ContainerDied","Data":"baa53091499c459f5c681905d7775517f4f2b2703b97d9f8c093bead473081e2"}
Apr 17 07:54:03.204848 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:03.204840 2566 scope.go:117] "RemoveContainer" containerID="4826fc546b8e79a17b30a4a601b0a26193d3cd77b248e80309507fc3b47f15b4"
Apr 17 07:54:03.205131 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:03.205107 2566 scope.go:117] "RemoveContainer" containerID="baa53091499c459f5c681905d7775517f4f2b2703b97d9f8c093bead473081e2"
Apr 17 07:54:03.205345 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:03.205320 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2blbh_openshift-console-operator(69ef651a-e48e-4389-a551-48cc6423bcd0)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" podUID="69ef651a-e48e-4389-a551-48cc6423bcd0"
Apr 17 07:54:04.207849 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:04.207820 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2blbh_69ef651a-e48e-4389-a551-48cc6423bcd0/console-operator/1.log"
Apr 17 07:54:04.208364 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:04.208188 2566 scope.go:117] "RemoveContainer" containerID="baa53091499c459f5c681905d7775517f4f2b2703b97d9f8c093bead473081e2"
Apr 17 07:54:04.208364 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:04.208350 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2blbh_openshift-console-operator(69ef651a-e48e-4389-a551-48cc6423bcd0)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" podUID="69ef651a-e48e-4389-a551-48cc6423bcd0"
Apr 17 07:54:04.857332 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:04.857300 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qndp2_c9597c98-928c-4e9e-9f6a-20399532f672/dns-node-resolver/0.log"
Apr 17 07:54:06.010388 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:06.010343 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/014f2810-c514-4886-84b9-38eb14825ce3-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2f5c2\" (UID: \"014f2810-c514-4886-84b9-38eb14825ce3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2f5c2"
Apr 17 07:54:06.010794 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:06.010504 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 07:54:06.010794 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:06.010569 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/014f2810-c514-4886-84b9-38eb14825ce3-samples-operator-tls podName:014f2810-c514-4886-84b9-38eb14825ce3 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:14.010554274 +0000 UTC m=+157.798332859 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/014f2810-c514-4886-84b9-38eb14825ce3-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2f5c2" (UID: "014f2810-c514-4886-84b9-38eb14825ce3") : secret "samples-operator-tls" not found
Apr 17 07:54:06.057294 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:06.057265 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-s2dk9_f847aabc-a956-47e9-91e9-a380ac142ed4/node-ca/0.log"
Apr 17 07:54:06.111087 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:06.111054 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-metrics-certs\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f"
Apr 17 07:54:06.111209 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:06.111130 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-service-ca-bundle\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f"
Apr 17 07:54:06.111209 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:06.111190 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 07:54:06.111281 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:06.111251 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-metrics-certs podName:beafd2af-5cb4-49eb-994d-2cdebba6b9ea nodeName:}" failed. No retries permitted until 2026-04-17 07:54:14.111234906 +0000 UTC m=+157.899013487 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-metrics-certs") pod "router-default-78488888b4-7t88f" (UID: "beafd2af-5cb4-49eb-994d-2cdebba6b9ea") : secret "router-metrics-certs-default" not found
Apr 17 07:54:06.111281 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:06.111265 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-service-ca-bundle podName:beafd2af-5cb4-49eb-994d-2cdebba6b9ea nodeName:}" failed. No retries permitted until 2026-04-17 07:54:14.111259274 +0000 UTC m=+157.899037855 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-service-ca-bundle") pod "router-default-78488888b4-7t88f" (UID: "beafd2af-5cb4-49eb-994d-2cdebba6b9ea") : configmap references non-existent config key: service-ca.crt
Apr 17 07:54:08.556018 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:08.555982 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2blbh"
Apr 17 07:54:08.556018 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:08.556017 2566 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-2blbh"
Apr 17 07:54:08.556465 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:08.556377 2566 scope.go:117] "RemoveContainer" containerID="baa53091499c459f5c681905d7775517f4f2b2703b97d9f8c093bead473081e2"
Apr 17 07:54:08.556550 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:08.556533 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2blbh_openshift-console-operator(69ef651a-e48e-4389-a551-48cc6423bcd0)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" podUID="69ef651a-e48e-4389-a551-48cc6423bcd0"
Apr 17 07:54:12.572160 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:12.572110 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-68954cc549-2czzq" podUID="78985eab-173e-4c9e-82d1-2bc5d78fa58f"
Apr 17 07:54:12.587226 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:12.587193 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-rltvj" podUID="b863c142-c069-46ab-9031-2f50beeb3f53"
Apr 17 07:54:12.593348 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:12.593326 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-qvwww" podUID="2354a35f-6d75-4e6e-a614-0e68c4002cb7"
Apr 17 07:54:12.815158 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:12.815119 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-jt4rj" podUID="4fbea707-d5c2-4c45-82e5-089d272aa922"
Apr 17 07:54:13.229153 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:13.229124 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rltvj"
Apr 17 07:54:13.229356 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:13.229124 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qvwww"
Apr 17 07:54:13.229356 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:13.229124 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68954cc549-2czzq"
Apr 17 07:54:14.068233 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:14.068194 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/014f2810-c514-4886-84b9-38eb14825ce3-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2f5c2\" (UID: \"014f2810-c514-4886-84b9-38eb14825ce3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2f5c2"
Apr 17 07:54:14.070510 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:14.070480 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/014f2810-c514-4886-84b9-38eb14825ce3-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2f5c2\" (UID: \"014f2810-c514-4886-84b9-38eb14825ce3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2f5c2"
Apr 17 07:54:14.134279 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:14.134235 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2f5c2"
Apr 17 07:54:14.169505 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:14.169464 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-service-ca-bundle\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f"
Apr 17 07:54:14.169653 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:14.169571 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-metrics-certs\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f"
Apr 17 07:54:14.170194 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:14.170170 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-service-ca-bundle\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f"
Apr 17 07:54:14.172114 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:14.172088 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/beafd2af-5cb4-49eb-994d-2cdebba6b9ea-metrics-certs\") pod \"router-default-78488888b4-7t88f\" (UID: \"beafd2af-5cb4-49eb-994d-2cdebba6b9ea\") " pod="openshift-ingress/router-default-78488888b4-7t88f"
Apr 17 07:54:14.238262 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:14.238232 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-78488888b4-7t88f"
Apr 17 07:54:14.251788 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:14.251760 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2f5c2"]
Apr 17 07:54:14.365005 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:14.364972 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-78488888b4-7t88f"]
Apr 17 07:54:14.367581 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:54:14.367553 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeafd2af_5cb4_49eb_994d_2cdebba6b9ea.slice/crio-83c7d8ceeafcf86fb8dadaf4c47dcdd07a8ca0be8c9c2f049eac6caee601c623 WatchSource:0}: Error finding container 83c7d8ceeafcf86fb8dadaf4c47dcdd07a8ca0be8c9c2f049eac6caee601c623: Status 404 returned error can't find the container with id 83c7d8ceeafcf86fb8dadaf4c47dcdd07a8ca0be8c9c2f049eac6caee601c623
Apr 17 07:54:15.235725 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:15.235674 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-78488888b4-7t88f" event={"ID":"beafd2af-5cb4-49eb-994d-2cdebba6b9ea","Type":"ContainerStarted","Data":"56846070a85f0334fec144de91b234c6643551dedf86e186053ed0128f8cea24"}
Apr 17 07:54:15.236197 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:15.235733 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-78488888b4-7t88f" event={"ID":"beafd2af-5cb4-49eb-994d-2cdebba6b9ea","Type":"ContainerStarted","Data":"83c7d8ceeafcf86fb8dadaf4c47dcdd07a8ca0be8c9c2f049eac6caee601c623"}
Apr 17 07:54:15.236778 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:15.236743 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2f5c2" event={"ID":"014f2810-c514-4886-84b9-38eb14825ce3","Type":"ContainerStarted","Data":"48de5efade77d9208a5ccb1cf5cb1346f503ffbd2eda914d7d1227ca940251f5"}
Apr 17 07:54:15.239013 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:15.238989 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-78488888b4-7t88f"
Apr 17 07:54:15.241819 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:15.241797 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-78488888b4-7t88f"
Apr 17 07:54:15.253858 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:15.253814 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-78488888b4-7t88f" podStartSLOduration=17.253803204 podStartE2EDuration="17.253803204s" podCreationTimestamp="2026-04-17 07:53:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:54:15.252426845 +0000 UTC m=+159.040205449" watchObservedRunningTime="2026-04-17 07:54:15.253803204 +0000 UTC m=+159.041581807"
Apr 17 07:54:16.240725 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:16.240683 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2f5c2" event={"ID":"014f2810-c514-4886-84b9-38eb14825ce3","Type":"ContainerStarted","Data":"e3a29e8104ff795d0c482415b0507c8391f7eb0faad50ccd9ba4b7ff2d6839ba"}
Apr 17 07:54:16.241092 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:16.240977 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-78488888b4-7t88f"
Apr 17 07:54:16.242233 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:16.242213 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-78488888b4-7t88f"
Apr 17 07:54:17.244260 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:17.244219 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2f5c2" event={"ID":"014f2810-c514-4886-84b9-38eb14825ce3","Type":"ContainerStarted","Data":"b656248e628ef102219b57707ec9804dbaa0f1f2fd67cec30de31ca537ec9c68"}
Apr 17 07:54:17.261283 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:17.261236 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2f5c2" podStartSLOduration=17.41429224 podStartE2EDuration="19.261211325s" podCreationTimestamp="2026-04-17 07:53:58 +0000 UTC" firstStartedPulling="2026-04-17 07:54:14.287898905 +0000 UTC m=+158.075677486" lastFinishedPulling="2026-04-17 07:54:16.134817987 +0000 UTC m=+159.922596571" observedRunningTime="2026-04-17 07:54:17.260322231 +0000 UTC m=+161.048100847" watchObservedRunningTime="2026-04-17 07:54:17.261211325 +0000 UTC m=+161.048989927"
Apr 17 07:54:17.503544 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:17.503446 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls\") pod \"dns-default-rltvj\" (UID: \"b863c142-c069-46ab-9031-2f50beeb3f53\") " pod="openshift-dns/dns-default-rltvj"
Apr 17 07:54:17.503544 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:17.503495 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert\") pod \"ingress-canary-qvwww\" (UID: \"2354a35f-6d75-4e6e-a614-0e68c4002cb7\") " pod="openshift-ingress-canary/ingress-canary-qvwww"
Apr 17 07:54:17.503749 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:17.503612 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq"
Apr 17 07:54:17.506294 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:17.506272 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2354a35f-6d75-4e6e-a614-0e68c4002cb7-cert\") pod \"ingress-canary-qvwww\" (UID: \"2354a35f-6d75-4e6e-a614-0e68c4002cb7\") " pod="openshift-ingress-canary/ingress-canary-qvwww"
Apr 17 07:54:17.506382 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:17.506364 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b863c142-c069-46ab-9031-2f50beeb3f53-metrics-tls\") pod \"dns-default-rltvj\" (UID: \"b863c142-c069-46ab-9031-2f50beeb3f53\") " pod="openshift-dns/dns-default-rltvj"
Apr 17 07:54:17.506431 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:17.506419 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls\") pod \"image-registry-68954cc549-2czzq\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " pod="openshift-image-registry/image-registry-68954cc549-2czzq"
Apr 17 07:54:17.732609 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:17.732573 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-njxn8\""
Apr 17 07:54:17.732807 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:17.732727 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wmmwx\""
Apr 17 07:54:17.732854 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:17.732814 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xs2k6\""
Apr 17 07:54:17.740764 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:17.740742 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68954cc549-2czzq"
Apr 17 07:54:17.740874 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:17.740790 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qvwww"
Apr 17 07:54:17.740926 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:17.740900 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rltvj"
Apr 17 07:54:17.900214 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:17.900049 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-68954cc549-2czzq"]
Apr 17 07:54:17.903562 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:54:17.903529 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78985eab_173e_4c9e_82d1_2bc5d78fa58f.slice/crio-77834c86d59618d18f2a81755f023d75e4a203fae81c0b8979fe87a39478a3cf WatchSource:0}: Error finding container 77834c86d59618d18f2a81755f023d75e4a203fae81c0b8979fe87a39478a3cf: Status 404 returned error can't find the container with id 77834c86d59618d18f2a81755f023d75e4a203fae81c0b8979fe87a39478a3cf
Apr 17 07:54:18.137507 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:18.137391 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rltvj"]
Apr 17 07:54:18.139146 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:18.139047 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qvwww"]
Apr 17 07:54:18.141345 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:54:18.141316 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb863c142_c069_46ab_9031_2f50beeb3f53.slice/crio-9ebe764b5195b9f2817f8f4262e2fe585ecd0e45c9bdc2af8d979f9f8aa3bae6 WatchSource:0}: Error finding container 9ebe764b5195b9f2817f8f4262e2fe585ecd0e45c9bdc2af8d979f9f8aa3bae6: Status 404 returned error can't find the container with id 9ebe764b5195b9f2817f8f4262e2fe585ecd0e45c9bdc2af8d979f9f8aa3bae6
Apr 17 07:54:18.142098 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:54:18.142067 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2354a35f_6d75_4e6e_a614_0e68c4002cb7.slice/crio-efb7b7d59fa9faa64830207cabdd53ff05e24a3751f41c44394e242cce73165e WatchSource:0}: Error finding container efb7b7d59fa9faa64830207cabdd53ff05e24a3751f41c44394e242cce73165e: Status 404 returned error can't find the container with id efb7b7d59fa9faa64830207cabdd53ff05e24a3751f41c44394e242cce73165e
Apr 17 07:54:18.248351 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:18.248318 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rltvj" event={"ID":"b863c142-c069-46ab-9031-2f50beeb3f53","Type":"ContainerStarted","Data":"9ebe764b5195b9f2817f8f4262e2fe585ecd0e45c9bdc2af8d979f9f8aa3bae6"}
Apr 17 07:54:18.249421 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:18.249395 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qvwww" event={"ID":"2354a35f-6d75-4e6e-a614-0e68c4002cb7","Type":"ContainerStarted","Data":"efb7b7d59fa9faa64830207cabdd53ff05e24a3751f41c44394e242cce73165e"}
Apr 17 07:54:18.250585 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:18.250558 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68954cc549-2czzq" event={"ID":"78985eab-173e-4c9e-82d1-2bc5d78fa58f","Type":"ContainerStarted","Data":"1225a55414ae4c3398c130f9410a47e32a6477e6f0eb803b9a94162611826f9a"}
Apr 17 07:54:18.250681 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:18.250589 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68954cc549-2czzq" event={"ID":"78985eab-173e-4c9e-82d1-2bc5d78fa58f","Type":"ContainerStarted","Data":"77834c86d59618d18f2a81755f023d75e4a203fae81c0b8979fe87a39478a3cf"}
Apr 17 07:54:18.273750 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:18.273682 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-68954cc549-2czzq" podStartSLOduration=161.273664972 podStartE2EDuration="2m41.273664972s" podCreationTimestamp="2026-04-17 07:51:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:54:18.273013751 +0000 UTC m=+162.060792354" watchObservedRunningTime="2026-04-17 07:54:18.273664972 +0000 UTC m=+162.061443576"
Apr 17 07:54:19.253900 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:19.253864 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-68954cc549-2czzq"
Apr 17 07:54:19.803085 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:19.803050 2566 scope.go:117] "RemoveContainer" containerID="baa53091499c459f5c681905d7775517f4f2b2703b97d9f8c093bead473081e2"
Apr 17 07:54:21.259515 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:21.259466 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qvwww" event={"ID":"2354a35f-6d75-4e6e-a614-0e68c4002cb7","Type":"ContainerStarted","Data":"50c9c5a79628ddda120ba04de8e71d5db136c77317b07277071ec762e4b6054d"}
Apr 17 07:54:21.261052 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:21.261033 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2blbh_69ef651a-e48e-4389-a551-48cc6423bcd0/console-operator/1.log"
Apr 17 07:54:21.261161 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:21.261106 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" event={"ID":"69ef651a-e48e-4389-a551-48cc6423bcd0","Type":"ContainerStarted","Data":"63b46a53b81eb15743b378e45e18bcc86275940149dda0ed0cffdf58b499167b"}
Apr 17 07:54:21.261398 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:21.261371 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2blbh"
Apr 17 07:54:21.262576 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:21.262557 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rltvj" event={"ID":"b863c142-c069-46ab-9031-2f50beeb3f53","Type":"ContainerStarted","Data":"e8dbb79480d832694812d0a89e1b3ccf055878ef845b0ef0c85856caeb70feed"}
Apr 17 07:54:21.262674 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:21.262579 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rltvj" event={"ID":"b863c142-c069-46ab-9031-2f50beeb3f53","Type":"ContainerStarted","Data":"47371b85b32aaa5fec71909ab9eef8bf264805a0906131057f4adb730e55f46a"}
Apr 17 07:54:21.262770 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:21.262758 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-rltvj"
Apr 17 07:54:21.266431 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:21.266415 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-2blbh"
Apr 17 07:54:21.275129 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:21.275093 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qvwww" podStartSLOduration=130.088787588 podStartE2EDuration="2m12.275081833s" podCreationTimestamp="2026-04-17 07:52:09 +0000 UTC" firstStartedPulling="2026-04-17 07:54:18.144123252 +0000 UTC m=+161.931901847" lastFinishedPulling="2026-04-17 07:54:20.330417511 +0000 UTC m=+164.118196092" observedRunningTime="2026-04-17 07:54:21.274266485 +0000 UTC m=+165.062045088" watchObservedRunningTime="2026-04-17 07:54:21.275081833 +0000 UTC m=+165.062860430"
Apr 17 07:54:21.290612 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:21.290569 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rltvj" podStartSLOduration=130.11877827 podStartE2EDuration="2m12.290557436s" podCreationTimestamp="2026-04-17 07:52:09 +0000 UTC" firstStartedPulling="2026-04-17 07:54:18.143717609 +0000 UTC m=+161.931496205" lastFinishedPulling="2026-04-17 07:54:20.315496776 +0000 UTC m=+164.103275371" observedRunningTime="2026-04-17 07:54:21.289508148 +0000 UTC m=+165.077286773" watchObservedRunningTime="2026-04-17 07:54:21.290557436 +0000 UTC m=+165.078336105"
Apr 17 07:54:21.304774 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:21.304729 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-2blbh" podStartSLOduration=20.456999782 podStartE2EDuration="23.30471439s" podCreationTimestamp="2026-04-17 07:53:58 +0000 UTC" firstStartedPulling="2026-04-17 07:53:58.898818383 +0000 UTC m=+142.686596964" lastFinishedPulling="2026-04-17 07:54:01.746532992 +0000 UTC m=+145.534311572" observedRunningTime="2026-04-17 07:54:21.304066831 +0000 UTC m=+165.091845437" watchObservedRunningTime="2026-04-17 07:54:21.30471439 +0000 UTC m=+165.092492985"
Apr 17 07:54:23.803611 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:23.803561 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jt4rj"
Apr 17 07:54:24.009046 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.009016 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-m79hw"]
Apr 17 07:54:24.012057 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.012036 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-m79hw"
Apr 17 07:54:24.014798 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.014776 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 07:54:24.015196 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.015173 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 07:54:24.016194 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.016166 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-s2zn8\""
Apr 17 07:54:24.041713 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.040557 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-m79hw"]
Apr 17 07:54:24.045576 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.045546 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-z9zkf"]
Apr 17 07:54:24.049158 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.048161 2566 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-z9zkf" Apr 17 07:54:24.051647 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.051625 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-d7fzf\"" Apr 17 07:54:24.052867 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.052845 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 07:54:24.053529 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.053513 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 07:54:24.059287 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.059249 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/62634055-259b-4f42-804e-3c91faf18087-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m79hw\" (UID: \"62634055-259b-4f42-804e-3c91faf18087\") " pod="openshift-insights/insights-runtime-extractor-m79hw" Apr 17 07:54:24.059380 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.059303 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8gpw\" (UniqueName: \"kubernetes.io/projected/62634055-259b-4f42-804e-3c91faf18087-kube-api-access-h8gpw\") pod \"insights-runtime-extractor-m79hw\" (UID: \"62634055-259b-4f42-804e-3c91faf18087\") " pod="openshift-insights/insights-runtime-extractor-m79hw" Apr 17 07:54:24.059380 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.059348 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/62634055-259b-4f42-804e-3c91faf18087-crio-socket\") pod \"insights-runtime-extractor-m79hw\" (UID: \"62634055-259b-4f42-804e-3c91faf18087\") " 
pod="openshift-insights/insights-runtime-extractor-m79hw" Apr 17 07:54:24.059493 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.059433 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/62634055-259b-4f42-804e-3c91faf18087-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m79hw\" (UID: \"62634055-259b-4f42-804e-3c91faf18087\") " pod="openshift-insights/insights-runtime-extractor-m79hw" Apr 17 07:54:24.059493 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.059476 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/62634055-259b-4f42-804e-3c91faf18087-data-volume\") pod \"insights-runtime-extractor-m79hw\" (UID: \"62634055-259b-4f42-804e-3c91faf18087\") " pod="openshift-insights/insights-runtime-extractor-m79hw" Apr 17 07:54:24.064964 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.064940 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-z9zkf"] Apr 17 07:54:24.160635 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.160598 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/62634055-259b-4f42-804e-3c91faf18087-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m79hw\" (UID: \"62634055-259b-4f42-804e-3c91faf18087\") " pod="openshift-insights/insights-runtime-extractor-m79hw" Apr 17 07:54:24.160881 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.160657 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8gpw\" (UniqueName: \"kubernetes.io/projected/62634055-259b-4f42-804e-3c91faf18087-kube-api-access-h8gpw\") pod \"insights-runtime-extractor-m79hw\" (UID: \"62634055-259b-4f42-804e-3c91faf18087\") " 
pod="openshift-insights/insights-runtime-extractor-m79hw" Apr 17 07:54:24.160881 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.160722 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/62634055-259b-4f42-804e-3c91faf18087-crio-socket\") pod \"insights-runtime-extractor-m79hw\" (UID: \"62634055-259b-4f42-804e-3c91faf18087\") " pod="openshift-insights/insights-runtime-extractor-m79hw" Apr 17 07:54:24.160881 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.160788 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/62634055-259b-4f42-804e-3c91faf18087-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m79hw\" (UID: \"62634055-259b-4f42-804e-3c91faf18087\") " pod="openshift-insights/insights-runtime-extractor-m79hw" Apr 17 07:54:24.160881 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.160824 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/62634055-259b-4f42-804e-3c91faf18087-crio-socket\") pod \"insights-runtime-extractor-m79hw\" (UID: \"62634055-259b-4f42-804e-3c91faf18087\") " pod="openshift-insights/insights-runtime-extractor-m79hw" Apr 17 07:54:24.160881 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.160834 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw6cg\" (UniqueName: \"kubernetes.io/projected/ca4bdb34-07d3-4fee-8a96-eb085c419679-kube-api-access-fw6cg\") pod \"downloads-6bcc868b7-z9zkf\" (UID: \"ca4bdb34-07d3-4fee-8a96-eb085c419679\") " pod="openshift-console/downloads-6bcc868b7-z9zkf" Apr 17 07:54:24.161126 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.160907 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/62634055-259b-4f42-804e-3c91faf18087-data-volume\") pod \"insights-runtime-extractor-m79hw\" (UID: \"62634055-259b-4f42-804e-3c91faf18087\") " pod="openshift-insights/insights-runtime-extractor-m79hw" Apr 17 07:54:24.161447 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.161425 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/62634055-259b-4f42-804e-3c91faf18087-data-volume\") pod \"insights-runtime-extractor-m79hw\" (UID: \"62634055-259b-4f42-804e-3c91faf18087\") " pod="openshift-insights/insights-runtime-extractor-m79hw" Apr 17 07:54:24.161556 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.161537 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/62634055-259b-4f42-804e-3c91faf18087-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m79hw\" (UID: \"62634055-259b-4f42-804e-3c91faf18087\") " pod="openshift-insights/insights-runtime-extractor-m79hw" Apr 17 07:54:24.163594 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.163573 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/62634055-259b-4f42-804e-3c91faf18087-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m79hw\" (UID: \"62634055-259b-4f42-804e-3c91faf18087\") " pod="openshift-insights/insights-runtime-extractor-m79hw" Apr 17 07:54:24.169164 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.169140 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8gpw\" (UniqueName: \"kubernetes.io/projected/62634055-259b-4f42-804e-3c91faf18087-kube-api-access-h8gpw\") pod \"insights-runtime-extractor-m79hw\" (UID: \"62634055-259b-4f42-804e-3c91faf18087\") " pod="openshift-insights/insights-runtime-extractor-m79hw" Apr 17 07:54:24.261966 ip-10-0-128-217 kubenswrapper[2566]: 
I0417 07:54:24.261934 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fw6cg\" (UniqueName: \"kubernetes.io/projected/ca4bdb34-07d3-4fee-8a96-eb085c419679-kube-api-access-fw6cg\") pod \"downloads-6bcc868b7-z9zkf\" (UID: \"ca4bdb34-07d3-4fee-8a96-eb085c419679\") " pod="openshift-console/downloads-6bcc868b7-z9zkf" Apr 17 07:54:24.270025 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.269996 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw6cg\" (UniqueName: \"kubernetes.io/projected/ca4bdb34-07d3-4fee-8a96-eb085c419679-kube-api-access-fw6cg\") pod \"downloads-6bcc868b7-z9zkf\" (UID: \"ca4bdb34-07d3-4fee-8a96-eb085c419679\") " pod="openshift-console/downloads-6bcc868b7-z9zkf" Apr 17 07:54:24.321535 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.321444 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-m79hw" Apr 17 07:54:24.357941 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.357895 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-z9zkf" Apr 17 07:54:24.444222 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.444160 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-m79hw"] Apr 17 07:54:24.446689 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:54:24.446658 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62634055_259b_4f42_804e_3c91faf18087.slice/crio-dfb97b61bbe0fdd0a1c47c5d79847dea48a5a943a8aba66cd3a4352d4ed9203c WatchSource:0}: Error finding container dfb97b61bbe0fdd0a1c47c5d79847dea48a5a943a8aba66cd3a4352d4ed9203c: Status 404 returned error can't find the container with id dfb97b61bbe0fdd0a1c47c5d79847dea48a5a943a8aba66cd3a4352d4ed9203c Apr 17 07:54:24.482786 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:24.482760 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-z9zkf"] Apr 17 07:54:24.504801 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:54:24.504775 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca4bdb34_07d3_4fee_8a96_eb085c419679.slice/crio-64a7e9ccdcf1b08d1eb6a056c6c85ef79fb6803047489815f130eca2990f2117 WatchSource:0}: Error finding container 64a7e9ccdcf1b08d1eb6a056c6c85ef79fb6803047489815f130eca2990f2117: Status 404 returned error can't find the container with id 64a7e9ccdcf1b08d1eb6a056c6c85ef79fb6803047489815f130eca2990f2117 Apr 17 07:54:25.275920 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:25.275876 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-z9zkf" event={"ID":"ca4bdb34-07d3-4fee-8a96-eb085c419679","Type":"ContainerStarted","Data":"64a7e9ccdcf1b08d1eb6a056c6c85ef79fb6803047489815f130eca2990f2117"} Apr 17 07:54:25.277645 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:25.277619 
2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m79hw" event={"ID":"62634055-259b-4f42-804e-3c91faf18087","Type":"ContainerStarted","Data":"740f25511eb8e585a5548f27db6d898adffd240f7dfe12a2d4ba638f80a89215"} Apr 17 07:54:25.277778 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:25.277648 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m79hw" event={"ID":"62634055-259b-4f42-804e-3c91faf18087","Type":"ContainerStarted","Data":"2a30e40f25df56d6cace8eabe8f4355f5c63177d1b407c38906b1fceca68177c"} Apr 17 07:54:25.277778 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:25.277662 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m79hw" event={"ID":"62634055-259b-4f42-804e-3c91faf18087","Type":"ContainerStarted","Data":"dfb97b61bbe0fdd0a1c47c5d79847dea48a5a943a8aba66cd3a4352d4ed9203c"} Apr 17 07:54:27.286518 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:27.286471 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m79hw" event={"ID":"62634055-259b-4f42-804e-3c91faf18087","Type":"ContainerStarted","Data":"43527bdff364ff677fb46fbb217ac7de3562663aa6b91ea82b97e0305ef08f70"} Apr 17 07:54:27.334741 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:27.334663 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-m79hw" podStartSLOduration=2.331865411 podStartE2EDuration="4.334644929s" podCreationTimestamp="2026-04-17 07:54:23 +0000 UTC" firstStartedPulling="2026-04-17 07:54:24.521072187 +0000 UTC m=+168.308850768" lastFinishedPulling="2026-04-17 07:54:26.523851685 +0000 UTC m=+170.311630286" observedRunningTime="2026-04-17 07:54:27.333080854 +0000 UTC m=+171.120859458" watchObservedRunningTime="2026-04-17 07:54:27.334644929 +0000 UTC m=+171.122423534" Apr 17 07:54:31.269024 ip-10-0-128-217 
kubenswrapper[2566]: I0417 07:54:31.268989 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rltvj" Apr 17 07:54:35.737084 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:35.737048 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-26bv9"] Apr 17 07:54:35.741510 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:35.741485 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-26bv9" Apr 17 07:54:35.744178 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:35.744151 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 07:54:35.745459 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:35.745218 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 17 07:54:35.745459 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:35.745295 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-jsvsv\"" Apr 17 07:54:35.745459 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:35.745316 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 07:54:35.745459 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:35.745351 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 17 07:54:35.745459 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:35.745298 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 07:54:35.753052 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:35.753029 2566 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-26bv9"] Apr 17 07:54:35.861416 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:35.861376 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3a40a7c0-c66f-4136-8ca0-2d73c7171bd4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-26bv9\" (UID: \"3a40a7c0-c66f-4136-8ca0-2d73c7171bd4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26bv9" Apr 17 07:54:35.861605 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:35.861469 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbpqh\" (UniqueName: \"kubernetes.io/projected/3a40a7c0-c66f-4136-8ca0-2d73c7171bd4-kube-api-access-jbpqh\") pod \"prometheus-operator-5676c8c784-26bv9\" (UID: \"3a40a7c0-c66f-4136-8ca0-2d73c7171bd4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26bv9" Apr 17 07:54:35.861605 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:35.861490 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a40a7c0-c66f-4136-8ca0-2d73c7171bd4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-26bv9\" (UID: \"3a40a7c0-c66f-4136-8ca0-2d73c7171bd4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26bv9" Apr 17 07:54:35.861605 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:35.861519 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a40a7c0-c66f-4136-8ca0-2d73c7171bd4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-26bv9\" (UID: \"3a40a7c0-c66f-4136-8ca0-2d73c7171bd4\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-26bv9" Apr 17 07:54:35.962841 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:35.962801 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbpqh\" (UniqueName: \"kubernetes.io/projected/3a40a7c0-c66f-4136-8ca0-2d73c7171bd4-kube-api-access-jbpqh\") pod \"prometheus-operator-5676c8c784-26bv9\" (UID: \"3a40a7c0-c66f-4136-8ca0-2d73c7171bd4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26bv9" Apr 17 07:54:35.963024 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:35.962845 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a40a7c0-c66f-4136-8ca0-2d73c7171bd4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-26bv9\" (UID: \"3a40a7c0-c66f-4136-8ca0-2d73c7171bd4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26bv9" Apr 17 07:54:35.963024 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:35.962873 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a40a7c0-c66f-4136-8ca0-2d73c7171bd4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-26bv9\" (UID: \"3a40a7c0-c66f-4136-8ca0-2d73c7171bd4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26bv9" Apr 17 07:54:35.963024 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:35.962946 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3a40a7c0-c66f-4136-8ca0-2d73c7171bd4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-26bv9\" (UID: \"3a40a7c0-c66f-4136-8ca0-2d73c7171bd4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26bv9" Apr 17 07:54:35.963186 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:35.963041 2566 secret.go:189] 
Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 17 07:54:35.963186 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:54:35.963113 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a40a7c0-c66f-4136-8ca0-2d73c7171bd4-prometheus-operator-tls podName:3a40a7c0-c66f-4136-8ca0-2d73c7171bd4 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:36.463093117 +0000 UTC m=+180.250871700 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/3a40a7c0-c66f-4136-8ca0-2d73c7171bd4-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-26bv9" (UID: "3a40a7c0-c66f-4136-8ca0-2d73c7171bd4") : secret "prometheus-operator-tls" not found Apr 17 07:54:35.963639 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:35.963615 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a40a7c0-c66f-4136-8ca0-2d73c7171bd4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-26bv9\" (UID: \"3a40a7c0-c66f-4136-8ca0-2d73c7171bd4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26bv9" Apr 17 07:54:35.965660 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:35.965635 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3a40a7c0-c66f-4136-8ca0-2d73c7171bd4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-26bv9\" (UID: \"3a40a7c0-c66f-4136-8ca0-2d73c7171bd4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26bv9" Apr 17 07:54:35.973845 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:35.973817 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbpqh\" (UniqueName: 
\"kubernetes.io/projected/3a40a7c0-c66f-4136-8ca0-2d73c7171bd4-kube-api-access-jbpqh\") pod \"prometheus-operator-5676c8c784-26bv9\" (UID: \"3a40a7c0-c66f-4136-8ca0-2d73c7171bd4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26bv9" Apr 17 07:54:36.467310 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:36.467271 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a40a7c0-c66f-4136-8ca0-2d73c7171bd4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-26bv9\" (UID: \"3a40a7c0-c66f-4136-8ca0-2d73c7171bd4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26bv9" Apr 17 07:54:36.470003 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:36.469978 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a40a7c0-c66f-4136-8ca0-2d73c7171bd4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-26bv9\" (UID: \"3a40a7c0-c66f-4136-8ca0-2d73c7171bd4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-26bv9" Apr 17 07:54:36.652017 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:36.651987 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-26bv9" Apr 17 07:54:37.745807 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:37.745761 2566 patch_prober.go:28] interesting pod/image-registry-68954cc549-2czzq container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 07:54:37.746225 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:37.745817 2566 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-68954cc549-2czzq" podUID="78985eab-173e-4c9e-82d1-2bc5d78fa58f" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 07:54:39.928269 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:39.928244 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-26bv9"] Apr 17 07:54:39.931587 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:54:39.931562 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a40a7c0_c66f_4136_8ca0_2d73c7171bd4.slice/crio-d89a22d83f05af7100ec667787239848361f749afc9382287518dd4b7b92db1f WatchSource:0}: Error finding container d89a22d83f05af7100ec667787239848361f749afc9382287518dd4b7b92db1f: Status 404 returned error can't find the container with id d89a22d83f05af7100ec667787239848361f749afc9382287518dd4b7b92db1f Apr 17 07:54:40.262659 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.262583 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:54:40.326369 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.326318 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-5676c8c784-26bv9" event={"ID":"3a40a7c0-c66f-4136-8ca0-2d73c7171bd4","Type":"ContainerStarted","Data":"d89a22d83f05af7100ec667787239848361f749afc9382287518dd4b7b92db1f"} Apr 17 07:54:40.329715 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.329662 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-z9zkf" event={"ID":"ca4bdb34-07d3-4fee-8a96-eb085c419679","Type":"ContainerStarted","Data":"4e95b144c462da411aa31f5155ada042f692f1bfeec666d5b43dac897a5df447"} Apr 17 07:54:40.347406 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.347350 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-z9zkf" podStartSLOduration=0.941613177 podStartE2EDuration="16.347330736s" podCreationTimestamp="2026-04-17 07:54:24 +0000 UTC" firstStartedPulling="2026-04-17 07:54:24.506681006 +0000 UTC m=+168.294459591" lastFinishedPulling="2026-04-17 07:54:39.912398557 +0000 UTC m=+183.700177150" observedRunningTime="2026-04-17 07:54:40.346664391 +0000 UTC m=+184.134442995" watchObservedRunningTime="2026-04-17 07:54:40.347330736 +0000 UTC m=+184.135109341" Apr 17 07:54:40.403940 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.403901 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5fc57765d4-87l9f"] Apr 17 07:54:40.407304 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.407274 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.410254 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.410217 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 07:54:40.410758 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.410738 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 07:54:40.411271 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.410763 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jmjjr\""
Apr 17 07:54:40.411646 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.410782 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 07:54:40.411646 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.411107 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 07:54:40.411646 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.411184 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 07:54:40.417284 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.417257 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fc57765d4-87l9f"]
Apr 17 07:54:40.417524 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.417498 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 17 07:54:40.501107 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.501067 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-console-config\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.501329 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.501153 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc274a7b-7833-4adb-94c6-734b2ada29a4-console-serving-cert\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.501329 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.501232 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc274a7b-7833-4adb-94c6-734b2ada29a4-console-oauth-config\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.501329 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.501264 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-service-ca\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.501329 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.501298 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-trusted-ca-bundle\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.501553 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.501332 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2d94\" (UniqueName: \"kubernetes.io/projected/bc274a7b-7833-4adb-94c6-734b2ada29a4-kube-api-access-r2d94\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.501553 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.501404 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-oauth-serving-cert\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.602564 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.602440 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-console-config\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.602564 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.602516 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc274a7b-7833-4adb-94c6-734b2ada29a4-console-serving-cert\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.603305 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.602587 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc274a7b-7833-4adb-94c6-734b2ada29a4-console-oauth-config\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.603305 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.602619 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-service-ca\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.603305 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.602673 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-trusted-ca-bundle\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.603305 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.602729 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2d94\" (UniqueName: \"kubernetes.io/projected/bc274a7b-7833-4adb-94c6-734b2ada29a4-kube-api-access-r2d94\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.603305 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.602757 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-oauth-serving-cert\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.603586 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.603532 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-service-ca\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.603848 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.603790 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-console-config\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.604888 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.604845 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-trusted-ca-bundle\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.605219 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.605144 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-oauth-serving-cert\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.605919 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.605895 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc274a7b-7833-4adb-94c6-734b2ada29a4-console-serving-cert\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.606244 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.606220 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc274a7b-7833-4adb-94c6-734b2ada29a4-console-oauth-config\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.612139 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.612106 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2d94\" (UniqueName: \"kubernetes.io/projected/bc274a7b-7833-4adb-94c6-734b2ada29a4-kube-api-access-r2d94\") pod \"console-5fc57765d4-87l9f\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:40.722782 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:40.722680 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fc57765d4-87l9f"
Apr 17 07:54:41.127524 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:41.127417 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fc57765d4-87l9f"]
Apr 17 07:54:41.132118 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:54:41.132086 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc274a7b_7833_4adb_94c6_734b2ada29a4.slice/crio-acd1136e82d9aed481a15958fe0d983fad1d0c72210c17c0e68225c9b964d7f1 WatchSource:0}: Error finding container acd1136e82d9aed481a15958fe0d983fad1d0c72210c17c0e68225c9b964d7f1: Status 404 returned error can't find the container with id acd1136e82d9aed481a15958fe0d983fad1d0c72210c17c0e68225c9b964d7f1
Apr 17 07:54:41.334769 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:41.334732 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-26bv9" event={"ID":"3a40a7c0-c66f-4136-8ca0-2d73c7171bd4","Type":"ContainerStarted","Data":"076f6fa225d9c0e43b0531216b7f7f26d49ced22a5a1c658a99d81ad509fdb38"}
Apr 17 07:54:41.334969 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:41.334779 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-26bv9" event={"ID":"3a40a7c0-c66f-4136-8ca0-2d73c7171bd4","Type":"ContainerStarted","Data":"fb3e022aaeeb3050f00678bb5858612abdff89dd0d8bbfc0ec3c0c4ed10a4619"}
Apr 17 07:54:41.335939 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:41.335903 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fc57765d4-87l9f" event={"ID":"bc274a7b-7833-4adb-94c6-734b2ada29a4","Type":"ContainerStarted","Data":"acd1136e82d9aed481a15958fe0d983fad1d0c72210c17c0e68225c9b964d7f1"}
Apr 17 07:54:41.336139 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:41.336114 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-z9zkf"
Apr 17 07:54:41.352996 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:41.352934 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-26bv9" podStartSLOduration=5.25459086 podStartE2EDuration="6.352916016s" podCreationTimestamp="2026-04-17 07:54:35 +0000 UTC" firstStartedPulling="2026-04-17 07:54:39.933594876 +0000 UTC m=+183.721373457" lastFinishedPulling="2026-04-17 07:54:41.031920018 +0000 UTC m=+184.819698613" observedRunningTime="2026-04-17 07:54:41.352049716 +0000 UTC m=+185.139828321" watchObservedRunningTime="2026-04-17 07:54:41.352916016 +0000 UTC m=+185.140694621"
Apr 17 07:54:41.355738 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:41.355692 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-z9zkf"
Apr 17 07:54:43.298561 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.298521 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-pg89w"]
Apr 17 07:54:43.327527 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.327417 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.330123 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.330093 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 07:54:43.330498 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.330409 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-9c8dc\""
Apr 17 07:54:43.330622 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.330465 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 07:54:43.330784 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.330769 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 07:54:43.429464 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.429428 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.429761 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.429480 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-sys\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.429761 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.429508 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl2bz\" (UniqueName: \"kubernetes.io/projected/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-kube-api-access-zl2bz\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.429761 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.429549 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-node-exporter-textfile\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.429761 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.429616 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-root\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.429761 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.429650 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-node-exporter-wtmp\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.430928 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.429863 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-metrics-client-ca\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.430928 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.430265 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-node-exporter-tls\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.430928 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.430424 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-node-exporter-accelerators-collector-config\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.531759 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.531680 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-metrics-client-ca\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.532146 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.531945 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-node-exporter-tls\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.532146 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.531998 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-node-exporter-accelerators-collector-config\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.532146 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.532034 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.532354 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.532217 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-sys\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.532354 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.532256 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zl2bz\" (UniqueName: \"kubernetes.io/projected/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-kube-api-access-zl2bz\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.532354 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.532304 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-node-exporter-textfile\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.532682 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.532432 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-root\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.532682 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.532467 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-node-exporter-wtmp\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.532682 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.532532 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-root\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.532682 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.532430 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-sys\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.534352 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.532726 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-node-exporter-wtmp\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.534352 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.532959 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-metrics-client-ca\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.534352 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.533525 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-node-exporter-textfile\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.534352 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.533793 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-node-exporter-accelerators-collector-config\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.535907 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.535863 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.537285 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.537012 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-node-exporter-tls\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.553823 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.553733 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl2bz\" (UniqueName: \"kubernetes.io/projected/97ea7ab8-8bd4-4f02-af41-c58d6ec791ea-kube-api-access-zl2bz\") pod \"node-exporter-pg89w\" (UID: \"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea\") " pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:43.642164 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:43.642123 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pg89w"
Apr 17 07:54:44.206320 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.205314 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 07:54:44.234055 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.233781 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 07:54:44.234055 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.233990 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.237107 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.237029 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 17 07:54:44.237717 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.237559 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 17 07:54:44.237717 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.237613 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-hm8lj\""
Apr 17 07:54:44.237717 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.237620 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 17 07:54:44.237717 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.237561 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 07:54:44.237717 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.237559 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 07:54:44.237717 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.237568 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 17 07:54:44.238529 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.237927 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 17 07:54:44.238529 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.237997 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 17 07:54:44.239153 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.239097 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 17 07:54:44.340495 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.340457 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33f771cd-9a20-42f3-b071-e8907d147087-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.340940 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.340515 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-config-volume\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.340940 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.340572 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.340940 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.340619 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.340940 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.340656 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/33f771cd-9a20-42f3-b071-e8907d147087-tls-assets\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.340940 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.340683 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/33f771cd-9a20-42f3-b071-e8907d147087-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.340940 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.340732 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33f771cd-9a20-42f3-b071-e8907d147087-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.340940 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.340811 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr5v9\" (UniqueName: \"kubernetes.io/projected/33f771cd-9a20-42f3-b071-e8907d147087-kube-api-access-mr5v9\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.340940 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.340845 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.340940 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.340861 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.340940 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.340891 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/33f771cd-9a20-42f3-b071-e8907d147087-config-out\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.340940 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.340925 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.340940 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.340943 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-web-config\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.385378 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:54:44.385335 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97ea7ab8_8bd4_4f02_af41_c58d6ec791ea.slice/crio-0325ae3b461fb1d1422e009e47057040df4b77ee4439efd8beeaf4a71038f1ba WatchSource:0}: Error finding container 0325ae3b461fb1d1422e009e47057040df4b77ee4439efd8beeaf4a71038f1ba: Status 404 returned error can't find the container with id 0325ae3b461fb1d1422e009e47057040df4b77ee4439efd8beeaf4a71038f1ba
Apr 17 07:54:44.442379 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.442340 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33f771cd-9a20-42f3-b071-e8907d147087-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.442555 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.442406 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-config-volume\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.442659 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.442625 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.442786 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.442668 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.442786 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.442724 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/33f771cd-9a20-42f3-b071-e8907d147087-tls-assets\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.442786 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.442754 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/33f771cd-9a20-42f3-b071-e8907d147087-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.442786 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.442779 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33f771cd-9a20-42f3-b071-e8907d147087-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.442962 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.442836 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mr5v9\" (UniqueName: \"kubernetes.io/projected/33f771cd-9a20-42f3-b071-e8907d147087-kube-api-access-mr5v9\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:54:44.442962 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.442869 2566
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:44.443064 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.443045 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/33f771cd-9a20-42f3-b071-e8907d147087-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:44.443995 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.443212 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:44.443995 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.443294 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/33f771cd-9a20-42f3-b071-e8907d147087-config-out\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:44.443995 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.443345 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 
07:54:44.443995 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.443374 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-web-config\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:44.443995 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.443483 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33f771cd-9a20-42f3-b071-e8907d147087-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:44.446028 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.446000 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/33f771cd-9a20-42f3-b071-e8907d147087-tls-assets\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:44.446128 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.446089 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:44.446246 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.446220 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-config-volume\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:44.446549 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.446328 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/33f771cd-9a20-42f3-b071-e8907d147087-config-out\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:44.446722 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.446677 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:44.446856 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.446791 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:44.446917 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.446851 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:44.447013 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.446989 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-secret-alertmanager-kube-rbac-proxy\") pod 
\"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:44.447739 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.447720 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/33f771cd-9a20-42f3-b071-e8907d147087-web-config\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:44.449126 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.449104 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33f771cd-9a20-42f3-b071-e8907d147087-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:44.458089 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.458040 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr5v9\" (UniqueName: \"kubernetes.io/projected/33f771cd-9a20-42f3-b071-e8907d147087-kube-api-access-mr5v9\") pod \"alertmanager-main-0\" (UID: \"33f771cd-9a20-42f3-b071-e8907d147087\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:44.550320 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.550279 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:44.781745 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:44.781588 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 07:54:45.358656 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:45.358615 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fc57765d4-87l9f" event={"ID":"bc274a7b-7833-4adb-94c6-734b2ada29a4","Type":"ContainerStarted","Data":"bdab8e7ca70e720a24d85530e9af4efd21cebd62afd076071d2298f3d3db3dc4"} Apr 17 07:54:45.360031 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:45.359998 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"33f771cd-9a20-42f3-b071-e8907d147087","Type":"ContainerStarted","Data":"ac9beec3d6e6f0c04f399996c730b157a31d02d6e604d9bc860402fe33111448"} Apr 17 07:54:45.361369 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:45.361330 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pg89w" event={"ID":"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea","Type":"ContainerStarted","Data":"0325ae3b461fb1d1422e009e47057040df4b77ee4439efd8beeaf4a71038f1ba"} Apr 17 07:54:45.379538 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:45.379480 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5fc57765d4-87l9f" podStartSLOduration=1.880985045 podStartE2EDuration="5.379463445s" podCreationTimestamp="2026-04-17 07:54:40 +0000 UTC" firstStartedPulling="2026-04-17 07:54:41.134454278 +0000 UTC m=+184.922232874" lastFinishedPulling="2026-04-17 07:54:44.632932678 +0000 UTC m=+188.420711274" observedRunningTime="2026-04-17 07:54:45.377589448 +0000 UTC m=+189.165368081" watchObservedRunningTime="2026-04-17 07:54:45.379463445 +0000 UTC m=+189.167242047" Apr 17 07:54:46.315961 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:46.315127 2566 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-68954cc549-2czzq"] Apr 17 07:54:46.366467 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:46.366436 2566 generic.go:358] "Generic (PLEG): container finished" podID="97ea7ab8-8bd4-4f02-af41-c58d6ec791ea" containerID="673651768b4ba2f47af7b41ea48b31cc83f05be0304c7bbd88171d1a71d627b5" exitCode=0 Apr 17 07:54:46.366940 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:46.366524 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pg89w" event={"ID":"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea","Type":"ContainerDied","Data":"673651768b4ba2f47af7b41ea48b31cc83f05be0304c7bbd88171d1a71d627b5"} Apr 17 07:54:47.371164 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:47.371123 2566 generic.go:358] "Generic (PLEG): container finished" podID="33f771cd-9a20-42f3-b071-e8907d147087" containerID="4395cb05361057642fe3bcb41f358578cf945a5b37307ee51f8c48f81c583ce4" exitCode=0 Apr 17 07:54:47.371686 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:47.371182 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"33f771cd-9a20-42f3-b071-e8907d147087","Type":"ContainerDied","Data":"4395cb05361057642fe3bcb41f358578cf945a5b37307ee51f8c48f81c583ce4"} Apr 17 07:54:47.374018 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:47.373993 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pg89w" event={"ID":"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea","Type":"ContainerStarted","Data":"c277f99134d46c8c6894b704741f52e7710c234aca8a309974c11a4565551dac"} Apr 17 07:54:47.374120 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:47.374024 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pg89w" 
event={"ID":"97ea7ab8-8bd4-4f02-af41-c58d6ec791ea","Type":"ContainerStarted","Data":"bf1f4830e49c23a7631668e6afd5f7b58f2c5eca92f890e3cf855b3ed5f874c4"} Apr 17 07:54:47.426375 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:47.426318 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-pg89w" podStartSLOduration=3.17974871 podStartE2EDuration="4.426299165s" podCreationTimestamp="2026-04-17 07:54:43 +0000 UTC" firstStartedPulling="2026-04-17 07:54:44.387538048 +0000 UTC m=+188.175316629" lastFinishedPulling="2026-04-17 07:54:45.6340885 +0000 UTC m=+189.421867084" observedRunningTime="2026-04-17 07:54:47.424893406 +0000 UTC m=+191.212672023" watchObservedRunningTime="2026-04-17 07:54:47.426299165 +0000 UTC m=+191.214077773" Apr 17 07:54:48.503593 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.503557 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6c7b978f98-zt7cg"] Apr 17 07:54:48.526673 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.526619 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6c7b978f98-zt7cg"] Apr 17 07:54:48.526898 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.526810 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.531760 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.531397 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 17 07:54:48.531760 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.531532 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 17 07:54:48.531760 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.531751 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 17 07:54:48.532030 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.531397 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-wzl2g\"" Apr 17 07:54:48.532030 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.531941 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 17 07:54:48.532136 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.532053 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 17 07:54:48.536400 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.536329 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 17 07:54:48.685455 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.685411 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d45b2571-f6bf-4eba-8a2f-db3a12922e55-telemeter-client-tls\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: 
\"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.685455 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.685465 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbw6n\" (UniqueName: \"kubernetes.io/projected/d45b2571-f6bf-4eba-8a2f-db3a12922e55-kube-api-access-vbw6n\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.685911 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.685512 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d45b2571-f6bf-4eba-8a2f-db3a12922e55-federate-client-tls\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.685911 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.685541 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d45b2571-f6bf-4eba-8a2f-db3a12922e55-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.685911 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.685572 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d45b2571-f6bf-4eba-8a2f-db3a12922e55-secret-telemeter-client\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 
17 07:54:48.685911 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.685606 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d45b2571-f6bf-4eba-8a2f-db3a12922e55-serving-certs-ca-bundle\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.685911 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.685661 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d45b2571-f6bf-4eba-8a2f-db3a12922e55-metrics-client-ca\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.685911 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.685725 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d45b2571-f6bf-4eba-8a2f-db3a12922e55-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.787054 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.787006 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d45b2571-f6bf-4eba-8a2f-db3a12922e55-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.787268 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.787087 2566 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d45b2571-f6bf-4eba-8a2f-db3a12922e55-telemeter-client-tls\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.787268 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.787124 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbw6n\" (UniqueName: \"kubernetes.io/projected/d45b2571-f6bf-4eba-8a2f-db3a12922e55-kube-api-access-vbw6n\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.787268 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.787213 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d45b2571-f6bf-4eba-8a2f-db3a12922e55-federate-client-tls\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.787268 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.787245 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d45b2571-f6bf-4eba-8a2f-db3a12922e55-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.787498 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.787279 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d45b2571-f6bf-4eba-8a2f-db3a12922e55-secret-telemeter-client\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: 
\"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.787498 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.787314 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d45b2571-f6bf-4eba-8a2f-db3a12922e55-serving-certs-ca-bundle\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.787498 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.787367 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d45b2571-f6bf-4eba-8a2f-db3a12922e55-metrics-client-ca\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.788164 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.788135 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d45b2571-f6bf-4eba-8a2f-db3a12922e55-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.788715 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.788673 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d45b2571-f6bf-4eba-8a2f-db3a12922e55-serving-certs-ca-bundle\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.790791 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.790772 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d45b2571-f6bf-4eba-8a2f-db3a12922e55-secret-telemeter-client\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.790962 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.790939 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d45b2571-f6bf-4eba-8a2f-db3a12922e55-federate-client-tls\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.791129 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.791107 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d45b2571-f6bf-4eba-8a2f-db3a12922e55-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.791198 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.791180 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d45b2571-f6bf-4eba-8a2f-db3a12922e55-telemeter-client-tls\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.800770 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.800744 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d45b2571-f6bf-4eba-8a2f-db3a12922e55-metrics-client-ca\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: 
\"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.803216 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.803193 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbw6n\" (UniqueName: \"kubernetes.io/projected/d45b2571-f6bf-4eba-8a2f-db3a12922e55-kube-api-access-vbw6n\") pod \"telemeter-client-6c7b978f98-zt7cg\" (UID: \"d45b2571-f6bf-4eba-8a2f-db3a12922e55\") " pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:48.845501 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:48.845456 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" Apr 17 07:54:49.000844 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:49.000809 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6c7b978f98-zt7cg"] Apr 17 07:54:49.108165 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:54:49.108087 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd45b2571_f6bf_4eba_8a2f_db3a12922e55.slice/crio-78aef7ba7742491e1437ac0285665be3973745cc30cbaed4eef3c7412815d5bb WatchSource:0}: Error finding container 78aef7ba7742491e1437ac0285665be3973745cc30cbaed4eef3c7412815d5bb: Status 404 returned error can't find the container with id 78aef7ba7742491e1437ac0285665be3973745cc30cbaed4eef3c7412815d5bb Apr 17 07:54:49.383195 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:49.383160 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" event={"ID":"d45b2571-f6bf-4eba-8a2f-db3a12922e55","Type":"ContainerStarted","Data":"78aef7ba7742491e1437ac0285665be3973745cc30cbaed4eef3c7412815d5bb"} Apr 17 07:54:49.385431 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:49.385385 2566 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"33f771cd-9a20-42f3-b071-e8907d147087","Type":"ContainerStarted","Data":"786d4ffa8d61b8a500fec87434952f0ed689c7e48810e0beaac428ac4c7375ba"} Apr 17 07:54:49.818884 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:49.817979 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fc57765d4-87l9f"] Apr 17 07:54:50.393008 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:50.392968 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"33f771cd-9a20-42f3-b071-e8907d147087","Type":"ContainerStarted","Data":"d796575ef9ec2f81dccc2f582129ec06b154b4be57d620a49293da49dce261bc"} Apr 17 07:54:50.393008 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:50.393011 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"33f771cd-9a20-42f3-b071-e8907d147087","Type":"ContainerStarted","Data":"dd0acc002d6a7b8f58cd9b3a457e39be4fb32c2a9e532ae9201dc655a54403d0"} Apr 17 07:54:50.393315 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:50.393025 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"33f771cd-9a20-42f3-b071-e8907d147087","Type":"ContainerStarted","Data":"e01467d4f51e462e35a5fc43aca6620b1e9e59a64b0fca4e6ec7f226bb79a1ee"} Apr 17 07:54:50.393315 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:50.393038 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"33f771cd-9a20-42f3-b071-e8907d147087","Type":"ContainerStarted","Data":"7c9842677141441671576cea5182ecdedaee6ab4386e03730719f36b7b75fb59"} Apr 17 07:54:50.723970 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:50.723864 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5fc57765d4-87l9f" Apr 17 07:54:52.403458 ip-10-0-128-217 
kubenswrapper[2566]: I0417 07:54:52.403417 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"33f771cd-9a20-42f3-b071-e8907d147087","Type":"ContainerStarted","Data":"7c82ad103adb88d86a7e1ddcaaa5756cf4b6c659caeddb2a62057f37ebce7547"} Apr 17 07:54:52.405207 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:52.405172 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" event={"ID":"d45b2571-f6bf-4eba-8a2f-db3a12922e55","Type":"ContainerStarted","Data":"941e0a293c0acdb2bf4a5c49de339c71ebf6d8aac24c6dc4d9d66a90306d2ff5"} Apr 17 07:54:52.405207 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:52.405202 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" event={"ID":"d45b2571-f6bf-4eba-8a2f-db3a12922e55","Type":"ContainerStarted","Data":"a8bffe0d53a625bcb9b6f590c1f01b057526f56f0cc811b05fe2866ab5b2e01c"} Apr 17 07:54:52.405207 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:52.405211 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" event={"ID":"d45b2571-f6bf-4eba-8a2f-db3a12922e55","Type":"ContainerStarted","Data":"652788c8a84914886fbdfca14309c2f6f2caa31ecc0fb4a23a9f35c0425e1553"} Apr 17 07:54:52.444432 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:52.444380 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.724453774 podStartE2EDuration="8.444365532s" podCreationTimestamp="2026-04-17 07:54:44 +0000 UTC" firstStartedPulling="2026-04-17 07:54:44.789326741 +0000 UTC m=+188.577105328" lastFinishedPulling="2026-04-17 07:54:51.509238501 +0000 UTC m=+195.297017086" observedRunningTime="2026-04-17 07:54:52.442407084 +0000 UTC m=+196.230185688" watchObservedRunningTime="2026-04-17 07:54:52.444365532 +0000 UTC m=+196.232144136" Apr 17 
07:54:52.476349 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:54:52.476280 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6c7b978f98-zt7cg" podStartSLOduration=2.09404275 podStartE2EDuration="4.476262908s" podCreationTimestamp="2026-04-17 07:54:48 +0000 UTC" firstStartedPulling="2026-04-17 07:54:49.125342616 +0000 UTC m=+192.913121197" lastFinishedPulling="2026-04-17 07:54:51.50756277 +0000 UTC m=+195.295341355" observedRunningTime="2026-04-17 07:54:52.475004008 +0000 UTC m=+196.262782611" watchObservedRunningTime="2026-04-17 07:54:52.476262908 +0000 UTC m=+196.264041510" Apr 17 07:55:11.343089 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.343030 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-68954cc549-2czzq" podUID="78985eab-173e-4c9e-82d1-2bc5d78fa58f" containerName="registry" containerID="cri-o://1225a55414ae4c3398c130f9410a47e32a6477e6f0eb803b9a94162611826f9a" gracePeriod=30 Apr 17 07:55:11.460193 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.460162 2566 generic.go:358] "Generic (PLEG): container finished" podID="78985eab-173e-4c9e-82d1-2bc5d78fa58f" containerID="1225a55414ae4c3398c130f9410a47e32a6477e6f0eb803b9a94162611826f9a" exitCode=0 Apr 17 07:55:11.460329 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.460238 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68954cc549-2czzq" event={"ID":"78985eab-173e-4c9e-82d1-2bc5d78fa58f","Type":"ContainerDied","Data":"1225a55414ae4c3398c130f9410a47e32a6477e6f0eb803b9a94162611826f9a"} Apr 17 07:55:11.606394 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.606372 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:55:11.690626 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.690591 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/78985eab-173e-4c9e-82d1-2bc5d78fa58f-installation-pull-secrets\") pod \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " Apr 17 07:55:11.690821 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.690637 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-bound-sa-token\") pod \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " Apr 17 07:55:11.690821 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.690666 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls\") pod \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " Apr 17 07:55:11.690821 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.690718 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/78985eab-173e-4c9e-82d1-2bc5d78fa58f-image-registry-private-configuration\") pod \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " Apr 17 07:55:11.690982 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.690874 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/78985eab-173e-4c9e-82d1-2bc5d78fa58f-ca-trust-extracted\") pod \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\" (UID: 
\"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " Apr 17 07:55:11.690982 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.690948 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dlk5\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-kube-api-access-8dlk5\") pod \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " Apr 17 07:55:11.691072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.690994 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-certificates\") pod \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " Apr 17 07:55:11.691747 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.691690 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78985eab-173e-4c9e-82d1-2bc5d78fa58f-trusted-ca\") pod \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\" (UID: \"78985eab-173e-4c9e-82d1-2bc5d78fa58f\") " Apr 17 07:55:11.692429 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.692404 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78985eab-173e-4c9e-82d1-2bc5d78fa58f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "78985eab-173e-4c9e-82d1-2bc5d78fa58f" (UID: "78985eab-173e-4c9e-82d1-2bc5d78fa58f"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:55:11.693680 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.693648 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "78985eab-173e-4c9e-82d1-2bc5d78fa58f" (UID: "78985eab-173e-4c9e-82d1-2bc5d78fa58f"). 
InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:55:11.694038 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.693882 2566 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-bound-sa-token\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 07:55:11.694038 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.693905 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78985eab-173e-4c9e-82d1-2bc5d78fa58f-trusted-ca\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 07:55:11.694241 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.694212 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78985eab-173e-4c9e-82d1-2bc5d78fa58f-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "78985eab-173e-4c9e-82d1-2bc5d78fa58f" (UID: "78985eab-173e-4c9e-82d1-2bc5d78fa58f"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:55:11.694348 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.694328 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "78985eab-173e-4c9e-82d1-2bc5d78fa58f" (UID: "78985eab-173e-4c9e-82d1-2bc5d78fa58f"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:55:11.698883 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.694796 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78985eab-173e-4c9e-82d1-2bc5d78fa58f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "78985eab-173e-4c9e-82d1-2bc5d78fa58f" (UID: "78985eab-173e-4c9e-82d1-2bc5d78fa58f"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:55:11.698883 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.694792 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-kube-api-access-8dlk5" (OuterVolumeSpecName: "kube-api-access-8dlk5") pod "78985eab-173e-4c9e-82d1-2bc5d78fa58f" (UID: "78985eab-173e-4c9e-82d1-2bc5d78fa58f"). InnerVolumeSpecName "kube-api-access-8dlk5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:55:11.699059 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.699015 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "78985eab-173e-4c9e-82d1-2bc5d78fa58f" (UID: "78985eab-173e-4c9e-82d1-2bc5d78fa58f"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:55:11.703079 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.703056 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78985eab-173e-4c9e-82d1-2bc5d78fa58f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "78985eab-173e-4c9e-82d1-2bc5d78fa58f" (UID: "78985eab-173e-4c9e-82d1-2bc5d78fa58f"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 07:55:11.794676 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.794639 2566 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/78985eab-173e-4c9e-82d1-2bc5d78fa58f-installation-pull-secrets\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 07:55:11.794676 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.794671 2566 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-tls\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 07:55:11.794676 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.794683 2566 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/78985eab-173e-4c9e-82d1-2bc5d78fa58f-image-registry-private-configuration\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 07:55:11.794929 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.794715 2566 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/78985eab-173e-4c9e-82d1-2bc5d78fa58f-ca-trust-extracted\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 07:55:11.794929 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.794725 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8dlk5\" (UniqueName: \"kubernetes.io/projected/78985eab-173e-4c9e-82d1-2bc5d78fa58f-kube-api-access-8dlk5\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 07:55:11.794929 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:11.794734 2566 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/78985eab-173e-4c9e-82d1-2bc5d78fa58f-registry-certificates\") on node 
\"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 07:55:12.464308 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:12.464267 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68954cc549-2czzq" event={"ID":"78985eab-173e-4c9e-82d1-2bc5d78fa58f","Type":"ContainerDied","Data":"77834c86d59618d18f2a81755f023d75e4a203fae81c0b8979fe87a39478a3cf"} Apr 17 07:55:12.464775 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:12.464313 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68954cc549-2czzq" Apr 17 07:55:12.464775 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:12.464321 2566 scope.go:117] "RemoveContainer" containerID="1225a55414ae4c3398c130f9410a47e32a6477e6f0eb803b9a94162611826f9a" Apr 17 07:55:12.485405 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:12.485382 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-68954cc549-2czzq"] Apr 17 07:55:12.489402 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:12.489379 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-68954cc549-2czzq"] Apr 17 07:55:12.807109 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:12.807072 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78985eab-173e-4c9e-82d1-2bc5d78fa58f" path="/var/lib/kubelet/pods/78985eab-173e-4c9e-82d1-2bc5d78fa58f/volumes" Apr 17 07:55:14.842638 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:14.842597 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5fc57765d4-87l9f" podUID="bc274a7b-7833-4adb-94c6-734b2ada29a4" containerName="console" containerID="cri-o://bdab8e7ca70e720a24d85530e9af4efd21cebd62afd076071d2298f3d3db3dc4" gracePeriod=15 Apr 17 07:55:15.095890 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.095833 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-5fc57765d4-87l9f_bc274a7b-7833-4adb-94c6-734b2ada29a4/console/0.log" Apr 17 07:55:15.096006 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.095903 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fc57765d4-87l9f" Apr 17 07:55:15.226235 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.226200 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc274a7b-7833-4adb-94c6-734b2ada29a4-console-oauth-config\") pod \"bc274a7b-7833-4adb-94c6-734b2ada29a4\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " Apr 17 07:55:15.226433 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.226247 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-service-ca\") pod \"bc274a7b-7833-4adb-94c6-734b2ada29a4\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " Apr 17 07:55:15.226433 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.226269 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-trusted-ca-bundle\") pod \"bc274a7b-7833-4adb-94c6-734b2ada29a4\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " Apr 17 07:55:15.226433 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.226403 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-console-config\") pod \"bc274a7b-7833-4adb-94c6-734b2ada29a4\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " Apr 17 07:55:15.226592 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.226448 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-r2d94\" (UniqueName: \"kubernetes.io/projected/bc274a7b-7833-4adb-94c6-734b2ada29a4-kube-api-access-r2d94\") pod \"bc274a7b-7833-4adb-94c6-734b2ada29a4\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " Apr 17 07:55:15.226592 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.226485 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-oauth-serving-cert\") pod \"bc274a7b-7833-4adb-94c6-734b2ada29a4\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " Apr 17 07:55:15.226592 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.226511 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc274a7b-7833-4adb-94c6-734b2ada29a4-console-serving-cert\") pod \"bc274a7b-7833-4adb-94c6-734b2ada29a4\" (UID: \"bc274a7b-7833-4adb-94c6-734b2ada29a4\") " Apr 17 07:55:15.226783 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.226728 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bc274a7b-7833-4adb-94c6-734b2ada29a4" (UID: "bc274a7b-7833-4adb-94c6-734b2ada29a4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:55:15.226783 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.226754 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-service-ca" (OuterVolumeSpecName: "service-ca") pod "bc274a7b-7833-4adb-94c6-734b2ada29a4" (UID: "bc274a7b-7833-4adb-94c6-734b2ada29a4"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:55:15.226959 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.226929 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-console-config" (OuterVolumeSpecName: "console-config") pod "bc274a7b-7833-4adb-94c6-734b2ada29a4" (UID: "bc274a7b-7833-4adb-94c6-734b2ada29a4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:55:15.227077 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.226985 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bc274a7b-7833-4adb-94c6-734b2ada29a4" (UID: "bc274a7b-7833-4adb-94c6-734b2ada29a4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:55:15.228655 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.228635 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc274a7b-7833-4adb-94c6-734b2ada29a4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bc274a7b-7833-4adb-94c6-734b2ada29a4" (UID: "bc274a7b-7833-4adb-94c6-734b2ada29a4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:55:15.228755 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.228679 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc274a7b-7833-4adb-94c6-734b2ada29a4-kube-api-access-r2d94" (OuterVolumeSpecName: "kube-api-access-r2d94") pod "bc274a7b-7833-4adb-94c6-734b2ada29a4" (UID: "bc274a7b-7833-4adb-94c6-734b2ada29a4"). InnerVolumeSpecName "kube-api-access-r2d94". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:55:15.228755 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.228738 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc274a7b-7833-4adb-94c6-734b2ada29a4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bc274a7b-7833-4adb-94c6-734b2ada29a4" (UID: "bc274a7b-7833-4adb-94c6-734b2ada29a4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:55:15.328143 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.328107 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r2d94\" (UniqueName: \"kubernetes.io/projected/bc274a7b-7833-4adb-94c6-734b2ada29a4-kube-api-access-r2d94\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 07:55:15.328143 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.328137 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-oauth-serving-cert\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 07:55:15.328143 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.328148 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc274a7b-7833-4adb-94c6-734b2ada29a4-console-serving-cert\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 07:55:15.328374 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.328157 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc274a7b-7833-4adb-94c6-734b2ada29a4-console-oauth-config\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 07:55:15.328374 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.328167 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-service-ca\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 07:55:15.328374 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.328175 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-trusted-ca-bundle\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 07:55:15.328374 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.328185 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc274a7b-7833-4adb-94c6-734b2ada29a4-console-config\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 07:55:15.475722 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.475630 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fc57765d4-87l9f_bc274a7b-7833-4adb-94c6-734b2ada29a4/console/0.log" Apr 17 07:55:15.475722 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.475671 2566 generic.go:358] "Generic (PLEG): container finished" podID="bc274a7b-7833-4adb-94c6-734b2ada29a4" containerID="bdab8e7ca70e720a24d85530e9af4efd21cebd62afd076071d2298f3d3db3dc4" exitCode=2 Apr 17 07:55:15.475941 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.475716 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fc57765d4-87l9f" event={"ID":"bc274a7b-7833-4adb-94c6-734b2ada29a4","Type":"ContainerDied","Data":"bdab8e7ca70e720a24d85530e9af4efd21cebd62afd076071d2298f3d3db3dc4"} Apr 17 07:55:15.475941 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.475758 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fc57765d4-87l9f" event={"ID":"bc274a7b-7833-4adb-94c6-734b2ada29a4","Type":"ContainerDied","Data":"acd1136e82d9aed481a15958fe0d983fad1d0c72210c17c0e68225c9b964d7f1"} Apr 17 07:55:15.475941 ip-10-0-128-217 
kubenswrapper[2566]: I0417 07:55:15.475775 2566 scope.go:117] "RemoveContainer" containerID="bdab8e7ca70e720a24d85530e9af4efd21cebd62afd076071d2298f3d3db3dc4" Apr 17 07:55:15.475941 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.475774 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fc57765d4-87l9f" Apr 17 07:55:15.483672 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.483657 2566 scope.go:117] "RemoveContainer" containerID="bdab8e7ca70e720a24d85530e9af4efd21cebd62afd076071d2298f3d3db3dc4" Apr 17 07:55:15.483948 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:55:15.483924 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdab8e7ca70e720a24d85530e9af4efd21cebd62afd076071d2298f3d3db3dc4\": container with ID starting with bdab8e7ca70e720a24d85530e9af4efd21cebd62afd076071d2298f3d3db3dc4 not found: ID does not exist" containerID="bdab8e7ca70e720a24d85530e9af4efd21cebd62afd076071d2298f3d3db3dc4" Apr 17 07:55:15.484040 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.483955 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdab8e7ca70e720a24d85530e9af4efd21cebd62afd076071d2298f3d3db3dc4"} err="failed to get container status \"bdab8e7ca70e720a24d85530e9af4efd21cebd62afd076071d2298f3d3db3dc4\": rpc error: code = NotFound desc = could not find container \"bdab8e7ca70e720a24d85530e9af4efd21cebd62afd076071d2298f3d3db3dc4\": container with ID starting with bdab8e7ca70e720a24d85530e9af4efd21cebd62afd076071d2298f3d3db3dc4 not found: ID does not exist" Apr 17 07:55:15.495392 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.495371 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fc57765d4-87l9f"] Apr 17 07:55:15.499193 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:15.499172 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-5fc57765d4-87l9f"] Apr 17 07:55:16.810070 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:16.810024 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc274a7b-7833-4adb-94c6-734b2ada29a4" path="/var/lib/kubelet/pods/bc274a7b-7833-4adb-94c6-734b2ada29a4/volumes" Apr 17 07:55:27.516750 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:27.516710 2566 generic.go:358] "Generic (PLEG): container finished" podID="86c75543-1a40-464a-bce1-5fe690add66f" containerID="7e353802a38d47cb18c916fabc18934211962bf1e0ce0a94f551d9a0b181c2b8" exitCode=0 Apr 17 07:55:27.517332 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:27.516770 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-kqzwd" event={"ID":"86c75543-1a40-464a-bce1-5fe690add66f","Type":"ContainerDied","Data":"7e353802a38d47cb18c916fabc18934211962bf1e0ce0a94f551d9a0b181c2b8"} Apr 17 07:55:27.517332 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:27.517093 2566 scope.go:117] "RemoveContainer" containerID="7e353802a38d47cb18c916fabc18934211962bf1e0ce0a94f551d9a0b181c2b8" Apr 17 07:55:28.521100 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:28.521064 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-kqzwd" event={"ID":"86c75543-1a40-464a-bce1-5fe690add66f","Type":"ContainerStarted","Data":"1d05cf0251a0198740b189533a1c7a83db9c090fee2c429c73f1dd3f57ee27aa"} Apr 17 07:55:32.532593 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:32.532560 2566 generic.go:358] "Generic (PLEG): container finished" podID="f5648bed-82b5-4b80-8d08-2e781e7705fc" containerID="19cd6cacb21627411ce470cce820a1011a28e2c916fe55e239bdf878b282e0fe" exitCode=0 Apr 17 07:55:32.532984 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:32.532629 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxszq" event={"ID":"f5648bed-82b5-4b80-8d08-2e781e7705fc","Type":"ContainerDied","Data":"19cd6cacb21627411ce470cce820a1011a28e2c916fe55e239bdf878b282e0fe"} Apr 17 07:55:32.532984 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:32.532941 2566 scope.go:117] "RemoveContainer" containerID="19cd6cacb21627411ce470cce820a1011a28e2c916fe55e239bdf878b282e0fe" Apr 17 07:55:33.536771 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:33.536732 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dxszq" event={"ID":"f5648bed-82b5-4b80-8d08-2e781e7705fc","Type":"ContainerStarted","Data":"4183c6a6ad4bbf80f43763ff9976c46bf59bf560487be606a611ca46654fddc1"} Apr 17 07:55:48.612107 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:48.612015 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs\") pod \"network-metrics-daemon-jt4rj\" (UID: \"4fbea707-d5c2-4c45-82e5-089d272aa922\") " pod="openshift-multus/network-metrics-daemon-jt4rj" Apr 17 07:55:48.614381 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:48.614361 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4fbea707-d5c2-4c45-82e5-089d272aa922-metrics-certs\") pod \"network-metrics-daemon-jt4rj\" (UID: \"4fbea707-d5c2-4c45-82e5-089d272aa922\") " pod="openshift-multus/network-metrics-daemon-jt4rj" Apr 17 07:55:48.707255 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:48.707225 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ctqdx\"" Apr 17 07:55:48.715401 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:48.715381 2566 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jt4rj" Apr 17 07:55:48.835266 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:48.835233 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jt4rj"] Apr 17 07:55:48.838402 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:55:48.838359 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fbea707_d5c2_4c45_82e5_089d272aa922.slice/crio-8c382ec09fd1559c0bebd238a6a7b65e0f8b962efc91aa6225a63c3433c75285 WatchSource:0}: Error finding container 8c382ec09fd1559c0bebd238a6a7b65e0f8b962efc91aa6225a63c3433c75285: Status 404 returned error can't find the container with id 8c382ec09fd1559c0bebd238a6a7b65e0f8b962efc91aa6225a63c3433c75285 Apr 17 07:55:49.587210 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:49.587167 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jt4rj" event={"ID":"4fbea707-d5c2-4c45-82e5-089d272aa922","Type":"ContainerStarted","Data":"8c382ec09fd1559c0bebd238a6a7b65e0f8b962efc91aa6225a63c3433c75285"} Apr 17 07:55:50.598345 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:50.598308 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jt4rj" event={"ID":"4fbea707-d5c2-4c45-82e5-089d272aa922","Type":"ContainerStarted","Data":"8d7f536f66665e1cc7e01908a21ae2be1535f947e7f68229f369dd4c634d9b7a"} Apr 17 07:55:50.598345 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:50.598346 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jt4rj" event={"ID":"4fbea707-d5c2-4c45-82e5-089d272aa922","Type":"ContainerStarted","Data":"a8920f5db7d4b76eab1e07b84ce7755f3ce1ce74229941dc3038dc630388f079"} Apr 17 07:55:50.616528 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:55:50.616471 2566 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jt4rj" podStartSLOduration=253.744432356 podStartE2EDuration="4m14.616453586s" podCreationTimestamp="2026-04-17 07:51:36 +0000 UTC" firstStartedPulling="2026-04-17 07:55:48.840253856 +0000 UTC m=+252.628032441" lastFinishedPulling="2026-04-17 07:55:49.712275086 +0000 UTC m=+253.500053671" observedRunningTime="2026-04-17 07:55:50.615181787 +0000 UTC m=+254.402960393" watchObservedRunningTime="2026-04-17 07:55:50.616453586 +0000 UTC m=+254.404232190" Apr 17 07:56:11.187723 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.187650 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-f644f56d7-k6ckt"] Apr 17 07:56:11.188335 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.188107 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78985eab-173e-4c9e-82d1-2bc5d78fa58f" containerName="registry" Apr 17 07:56:11.188335 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.188127 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="78985eab-173e-4c9e-82d1-2bc5d78fa58f" containerName="registry" Apr 17 07:56:11.188335 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.188141 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc274a7b-7833-4adb-94c6-734b2ada29a4" containerName="console" Apr 17 07:56:11.188335 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.188149 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc274a7b-7833-4adb-94c6-734b2ada29a4" containerName="console" Apr 17 07:56:11.188335 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.188220 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="78985eab-173e-4c9e-82d1-2bc5d78fa58f" containerName="registry" Apr 17 07:56:11.188335 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.188234 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc274a7b-7833-4adb-94c6-734b2ada29a4" 
containerName="console" Apr 17 07:56:11.190890 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.190867 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.193376 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.193354 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 07:56:11.193677 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.193655 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 07:56:11.193828 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.193802 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jmjjr\"" Apr 17 07:56:11.193950 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.193672 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 07:56:11.193950 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.193677 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 07:56:11.194071 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.193677 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 07:56:11.199416 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.199266 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 07:56:11.203247 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.203223 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f644f56d7-k6ckt"] Apr 17 07:56:11.295544 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.295514 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-console-oauth-config\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.295743 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.295564 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-trusted-ca-bundle\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.295743 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.295598 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-console-serving-cert\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.295743 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.295654 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-console-config\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.295743 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.295689 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-service-ca\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " 
pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.295941 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.295746 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-oauth-serving-cert\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.295941 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.295768 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd7z2\" (UniqueName: \"kubernetes.io/projected/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-kube-api-access-sd7z2\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.396685 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.396649 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-console-config\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.396909 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.396721 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-service-ca\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.396909 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.396757 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-oauth-serving-cert\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.396909 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.396783 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sd7z2\" (UniqueName: \"kubernetes.io/projected/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-kube-api-access-sd7z2\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.396909 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.396815 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-console-oauth-config\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.397135 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.396993 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-trusted-ca-bundle\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.397135 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.397043 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-console-serving-cert\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.397500 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.397471 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-console-config\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.397599 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.397485 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-oauth-serving-cert\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.397599 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.397485 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-service-ca\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.397783 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.397765 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-trusted-ca-bundle\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.399353 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.399324 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-console-serving-cert\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.399451 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.399370 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-console-oauth-config\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.404580 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.404563 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd7z2\" (UniqueName: \"kubernetes.io/projected/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-kube-api-access-sd7z2\") pod \"console-f644f56d7-k6ckt\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.502852 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.502764 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:11.630400 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.630367 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f644f56d7-k6ckt"] Apr 17 07:56:11.633487 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:56:11.633453 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d1291c3_e5f4_4e41_80eb_87e90c8796d0.slice/crio-5af209709c73450a23d3690799b6011ab159627d52922af354e4f8f8df98eda9 WatchSource:0}: Error finding container 5af209709c73450a23d3690799b6011ab159627d52922af354e4f8f8df98eda9: Status 404 returned error can't find the container with id 5af209709c73450a23d3690799b6011ab159627d52922af354e4f8f8df98eda9 Apr 17 07:56:11.656392 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:11.656362 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f644f56d7-k6ckt" 
event={"ID":"9d1291c3-e5f4-4e41-80eb-87e90c8796d0","Type":"ContainerStarted","Data":"5af209709c73450a23d3690799b6011ab159627d52922af354e4f8f8df98eda9"} Apr 17 07:56:12.660229 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:12.660189 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f644f56d7-k6ckt" event={"ID":"9d1291c3-e5f4-4e41-80eb-87e90c8796d0","Type":"ContainerStarted","Data":"95b1654247b964227840c1057158f81ef53fa104f3cff8fc0ad06f87eab7d043"} Apr 17 07:56:21.502989 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:21.502947 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:21.503574 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:21.503001 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:21.507570 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:21.507547 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:21.525704 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:21.525660 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f644f56d7-k6ckt" podStartSLOduration=10.525647565 podStartE2EDuration="10.525647565s" podCreationTimestamp="2026-04-17 07:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:56:12.677595366 +0000 UTC m=+276.465373970" watchObservedRunningTime="2026-04-17 07:56:21.525647565 +0000 UTC m=+285.313426168" Apr 17 07:56:21.688880 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:21.688847 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 07:56:36.716286 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:36.716258 2566 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2blbh_69ef651a-e48e-4389-a551-48cc6423bcd0/console-operator/1.log" Apr 17 07:56:36.716286 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:36.716277 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2blbh_69ef651a-e48e-4389-a551-48cc6423bcd0/console-operator/1.log" Apr 17 07:56:36.722394 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:36.722361 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/ovn-acl-logging/0.log" Apr 17 07:56:36.722692 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:36.722674 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/ovn-acl-logging/0.log" Apr 17 07:56:36.727543 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:56:36.727526 2566 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 07:57:37.412825 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:37.412788 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v"] Apr 17 07:57:37.414938 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:37.414922 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v" Apr 17 07:57:37.417729 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:37.417704 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 17 07:57:37.417847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:37.417775 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 17 07:57:37.417847 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:37.417780 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-8gq57\"" Apr 17 07:57:37.417951 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:37.417890 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 17 07:57:37.425999 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:37.425971 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v"] Apr 17 07:57:37.511323 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:37.511286 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wscv\" (UniqueName: \"kubernetes.io/projected/fc649813-d8ef-4eda-9f8b-88418b11054f-kube-api-access-8wscv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v\" (UID: \"fc649813-d8ef-4eda-9f8b-88418b11054f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v" Apr 17 07:57:37.511500 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:37.511347 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/fc649813-d8ef-4eda-9f8b-88418b11054f-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v\" (UID: 
\"fc649813-d8ef-4eda-9f8b-88418b11054f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v" Apr 17 07:57:37.611795 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:37.611751 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/fc649813-d8ef-4eda-9f8b-88418b11054f-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v\" (UID: \"fc649813-d8ef-4eda-9f8b-88418b11054f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v" Apr 17 07:57:37.611973 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:37.611826 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wscv\" (UniqueName: \"kubernetes.io/projected/fc649813-d8ef-4eda-9f8b-88418b11054f-kube-api-access-8wscv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v\" (UID: \"fc649813-d8ef-4eda-9f8b-88418b11054f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v" Apr 17 07:57:37.614117 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:37.614095 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/fc649813-d8ef-4eda-9f8b-88418b11054f-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v\" (UID: \"fc649813-d8ef-4eda-9f8b-88418b11054f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v" Apr 17 07:57:37.620471 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:37.620443 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wscv\" (UniqueName: \"kubernetes.io/projected/fc649813-d8ef-4eda-9f8b-88418b11054f-kube-api-access-8wscv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v\" (UID: \"fc649813-d8ef-4eda-9f8b-88418b11054f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v" Apr 17 07:57:37.728113 ip-10-0-128-217 
kubenswrapper[2566]: I0417 07:57:37.728019 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v" Apr 17 07:57:37.847103 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:37.847066 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v"] Apr 17 07:57:37.851193 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:57:37.851167 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc649813_d8ef_4eda_9f8b_88418b11054f.slice/crio-ec57042f7d417ba557559d43f44fae9669ba81dbc8a9692f554c6219a7ee9b61 WatchSource:0}: Error finding container ec57042f7d417ba557559d43f44fae9669ba81dbc8a9692f554c6219a7ee9b61: Status 404 returned error can't find the container with id ec57042f7d417ba557559d43f44fae9669ba81dbc8a9692f554c6219a7ee9b61 Apr 17 07:57:37.852875 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:37.852858 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 07:57:37.899072 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:37.899032 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v" event={"ID":"fc649813-d8ef-4eda-9f8b-88418b11054f","Type":"ContainerStarted","Data":"ec57042f7d417ba557559d43f44fae9669ba81dbc8a9692f554c6219a7ee9b61"} Apr 17 07:57:41.735023 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.734989 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-62rjc"] Apr 17 07:57:41.738548 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.738531 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-62rjc" Apr 17 07:57:41.739683 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.739660 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/75d997a9-a9b5-4aba-b602-b8c60a8b3696-cabundle0\") pod \"keda-operator-ffbb595cb-62rjc\" (UID: \"75d997a9-a9b5-4aba-b602-b8c60a8b3696\") " pod="openshift-keda/keda-operator-ffbb595cb-62rjc" Apr 17 07:57:41.739828 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.739725 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kndq\" (UniqueName: \"kubernetes.io/projected/75d997a9-a9b5-4aba-b602-b8c60a8b3696-kube-api-access-7kndq\") pod \"keda-operator-ffbb595cb-62rjc\" (UID: \"75d997a9-a9b5-4aba-b602-b8c60a8b3696\") " pod="openshift-keda/keda-operator-ffbb595cb-62rjc" Apr 17 07:57:41.739828 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.739772 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/75d997a9-a9b5-4aba-b602-b8c60a8b3696-certificates\") pod \"keda-operator-ffbb595cb-62rjc\" (UID: \"75d997a9-a9b5-4aba-b602-b8c60a8b3696\") " pod="openshift-keda/keda-operator-ffbb595cb-62rjc" Apr 17 07:57:41.740961 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.740937 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 17 07:57:41.741074 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.740989 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 17 07:57:41.741074 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.741003 2566 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-vr6k4\"" Apr 17 07:57:41.747144 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.747122 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-62rjc"] Apr 17 07:57:41.840941 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.840906 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/75d997a9-a9b5-4aba-b602-b8c60a8b3696-cabundle0\") pod \"keda-operator-ffbb595cb-62rjc\" (UID: \"75d997a9-a9b5-4aba-b602-b8c60a8b3696\") " pod="openshift-keda/keda-operator-ffbb595cb-62rjc" Apr 17 07:57:41.841117 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.840953 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kndq\" (UniqueName: \"kubernetes.io/projected/75d997a9-a9b5-4aba-b602-b8c60a8b3696-kube-api-access-7kndq\") pod \"keda-operator-ffbb595cb-62rjc\" (UID: \"75d997a9-a9b5-4aba-b602-b8c60a8b3696\") " pod="openshift-keda/keda-operator-ffbb595cb-62rjc" Apr 17 07:57:41.841117 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.840986 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/75d997a9-a9b5-4aba-b602-b8c60a8b3696-certificates\") pod \"keda-operator-ffbb595cb-62rjc\" (UID: \"75d997a9-a9b5-4aba-b602-b8c60a8b3696\") " pod="openshift-keda/keda-operator-ffbb595cb-62rjc" Apr 17 07:57:41.841229 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:41.841120 2566 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 17 07:57:41.841229 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:41.841134 2566 secret.go:281] references non-existent secret key: ca.crt Apr 17 07:57:41.841229 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:41.841141 2566 projected.go:277] Couldn't get secret payload 
openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 07:57:41.841229 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:41.841153 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-62rjc: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 17 07:57:41.841229 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:41.841210 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75d997a9-a9b5-4aba-b602-b8c60a8b3696-certificates podName:75d997a9-a9b5-4aba-b602-b8c60a8b3696 nodeName:}" failed. No retries permitted until 2026-04-17 07:57:42.34119513 +0000 UTC m=+366.128973711 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/75d997a9-a9b5-4aba-b602-b8c60a8b3696-certificates") pod "keda-operator-ffbb595cb-62rjc" (UID: "75d997a9-a9b5-4aba-b602-b8c60a8b3696") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 17 07:57:41.841519 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.841499 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/75d997a9-a9b5-4aba-b602-b8c60a8b3696-cabundle0\") pod \"keda-operator-ffbb595cb-62rjc\" (UID: \"75d997a9-a9b5-4aba-b602-b8c60a8b3696\") " pod="openshift-keda/keda-operator-ffbb595cb-62rjc" Apr 17 07:57:41.849345 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.849312 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kndq\" (UniqueName: \"kubernetes.io/projected/75d997a9-a9b5-4aba-b602-b8c60a8b3696-kube-api-access-7kndq\") pod \"keda-operator-ffbb595cb-62rjc\" (UID: \"75d997a9-a9b5-4aba-b602-b8c60a8b3696\") " pod="openshift-keda/keda-operator-ffbb595cb-62rjc" Apr 17 07:57:41.914615 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.914580 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v" event={"ID":"fc649813-d8ef-4eda-9f8b-88418b11054f","Type":"ContainerStarted","Data":"3ad8b24b8acf8769cf5ade13323cb84047b3fc067cf40064b6f3da027883277d"} Apr 17 07:57:41.914805 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.914671 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v" Apr 17 07:57:41.933978 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.933924 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v" podStartSLOduration=1.582861232 podStartE2EDuration="4.933904879s" podCreationTimestamp="2026-04-17 07:57:37 +0000 UTC" firstStartedPulling="2026-04-17 07:57:37.852983526 +0000 UTC m=+361.640762108" lastFinishedPulling="2026-04-17 07:57:41.204027166 +0000 UTC m=+364.991805755" observedRunningTime="2026-04-17 07:57:41.932052988 +0000 UTC m=+365.719831611" watchObservedRunningTime="2026-04-17 07:57:41.933904879 +0000 UTC m=+365.721683486" Apr 17 07:57:41.977069 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.977039 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk"] Apr 17 07:57:41.980620 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.980604 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk" Apr 17 07:57:41.983062 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.983037 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 17 07:57:41.987088 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:41.987035 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk"] Apr 17 07:57:42.043567 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.043534 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8959b38c-a149-4695-91b9-43442fb518bf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8qqfk\" (UID: \"8959b38c-a149-4695-91b9-43442fb518bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk" Apr 17 07:57:42.043753 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.043652 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/8959b38c-a149-4695-91b9-43442fb518bf-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-8qqfk\" (UID: \"8959b38c-a149-4695-91b9-43442fb518bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk" Apr 17 07:57:42.043753 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.043714 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2mhh\" (UniqueName: \"kubernetes.io/projected/8959b38c-a149-4695-91b9-43442fb518bf-kube-api-access-n2mhh\") pod \"keda-metrics-apiserver-7c9f485588-8qqfk\" (UID: \"8959b38c-a149-4695-91b9-43442fb518bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk" Apr 17 07:57:42.144448 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.144406 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8959b38c-a149-4695-91b9-43442fb518bf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8qqfk\" (UID: \"8959b38c-a149-4695-91b9-43442fb518bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk" Apr 17 07:57:42.144647 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.144493 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/8959b38c-a149-4695-91b9-43442fb518bf-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-8qqfk\" (UID: \"8959b38c-a149-4695-91b9-43442fb518bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk" Apr 17 07:57:42.144647 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.144527 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2mhh\" (UniqueName: \"kubernetes.io/projected/8959b38c-a149-4695-91b9-43442fb518bf-kube-api-access-n2mhh\") pod \"keda-metrics-apiserver-7c9f485588-8qqfk\" (UID: \"8959b38c-a149-4695-91b9-43442fb518bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk" Apr 17 07:57:42.144647 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:42.144553 2566 secret.go:281] references non-existent secret key: tls.crt Apr 17 07:57:42.144647 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:42.144574 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 07:57:42.144647 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:42.144594 2566 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 17 07:57:42.144647 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:42.144615 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk: [references 
non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 17 07:57:42.144937 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:42.144679 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8959b38c-a149-4695-91b9-43442fb518bf-certificates podName:8959b38c-a149-4695-91b9-43442fb518bf nodeName:}" failed. No retries permitted until 2026-04-17 07:57:42.644660272 +0000 UTC m=+366.432438860 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8959b38c-a149-4695-91b9-43442fb518bf-certificates") pod "keda-metrics-apiserver-7c9f485588-8qqfk" (UID: "8959b38c-a149-4695-91b9-43442fb518bf") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 17 07:57:42.144937 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.144927 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/8959b38c-a149-4695-91b9-43442fb518bf-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-8qqfk\" (UID: \"8959b38c-a149-4695-91b9-43442fb518bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk" Apr 17 07:57:42.157872 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.157825 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2mhh\" (UniqueName: \"kubernetes.io/projected/8959b38c-a149-4695-91b9-43442fb518bf-kube-api-access-n2mhh\") pod \"keda-metrics-apiserver-7c9f485588-8qqfk\" (UID: \"8959b38c-a149-4695-91b9-43442fb518bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk" Apr 17 07:57:42.280273 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.280245 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-6xmzs"] Apr 17 07:57:42.283771 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.283755 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-6xmzs" Apr 17 07:57:42.286187 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.286162 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 17 07:57:42.292764 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.292740 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-6xmzs"] Apr 17 07:57:42.346807 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.346773 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7ngc\" (UniqueName: \"kubernetes.io/projected/c3be2097-1108-4024-9d1b-dc1dd9b321b4-kube-api-access-z7ngc\") pod \"keda-admission-cf49989db-6xmzs\" (UID: \"c3be2097-1108-4024-9d1b-dc1dd9b321b4\") " pod="openshift-keda/keda-admission-cf49989db-6xmzs" Apr 17 07:57:42.346971 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.346822 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/75d997a9-a9b5-4aba-b602-b8c60a8b3696-certificates\") pod \"keda-operator-ffbb595cb-62rjc\" (UID: \"75d997a9-a9b5-4aba-b602-b8c60a8b3696\") " pod="openshift-keda/keda-operator-ffbb595cb-62rjc" Apr 17 07:57:42.346971 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.346853 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c3be2097-1108-4024-9d1b-dc1dd9b321b4-certificates\") pod \"keda-admission-cf49989db-6xmzs\" (UID: \"c3be2097-1108-4024-9d1b-dc1dd9b321b4\") " pod="openshift-keda/keda-admission-cf49989db-6xmzs" Apr 17 07:57:42.346971 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:42.346945 2566 secret.go:281] references non-existent secret key: ca.crt Apr 17 07:57:42.346971 ip-10-0-128-217 kubenswrapper[2566]: E0417 
07:57:42.346959 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 07:57:42.346971 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:42.346968 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-62rjc: references non-existent secret key: ca.crt Apr 17 07:57:42.347157 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:42.347014 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75d997a9-a9b5-4aba-b602-b8c60a8b3696-certificates podName:75d997a9-a9b5-4aba-b602-b8c60a8b3696 nodeName:}" failed. No retries permitted until 2026-04-17 07:57:43.347000206 +0000 UTC m=+367.134778792 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/75d997a9-a9b5-4aba-b602-b8c60a8b3696-certificates") pod "keda-operator-ffbb595cb-62rjc" (UID: "75d997a9-a9b5-4aba-b602-b8c60a8b3696") : references non-existent secret key: ca.crt Apr 17 07:57:42.448166 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.448124 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7ngc\" (UniqueName: \"kubernetes.io/projected/c3be2097-1108-4024-9d1b-dc1dd9b321b4-kube-api-access-z7ngc\") pod \"keda-admission-cf49989db-6xmzs\" (UID: \"c3be2097-1108-4024-9d1b-dc1dd9b321b4\") " pod="openshift-keda/keda-admission-cf49989db-6xmzs" Apr 17 07:57:42.448473 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.448454 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c3be2097-1108-4024-9d1b-dc1dd9b321b4-certificates\") pod \"keda-admission-cf49989db-6xmzs\" (UID: \"c3be2097-1108-4024-9d1b-dc1dd9b321b4\") " pod="openshift-keda/keda-admission-cf49989db-6xmzs" Apr 17 07:57:42.451956 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.451922 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c3be2097-1108-4024-9d1b-dc1dd9b321b4-certificates\") pod \"keda-admission-cf49989db-6xmzs\" (UID: \"c3be2097-1108-4024-9d1b-dc1dd9b321b4\") " pod="openshift-keda/keda-admission-cf49989db-6xmzs" Apr 17 07:57:42.455965 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.455939 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7ngc\" (UniqueName: \"kubernetes.io/projected/c3be2097-1108-4024-9d1b-dc1dd9b321b4-kube-api-access-z7ngc\") pod \"keda-admission-cf49989db-6xmzs\" (UID: \"c3be2097-1108-4024-9d1b-dc1dd9b321b4\") " pod="openshift-keda/keda-admission-cf49989db-6xmzs" Apr 17 07:57:42.594639 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.594546 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-6xmzs" Apr 17 07:57:42.650561 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.649977 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8959b38c-a149-4695-91b9-43442fb518bf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8qqfk\" (UID: \"8959b38c-a149-4695-91b9-43442fb518bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk" Apr 17 07:57:42.650561 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:42.650181 2566 secret.go:281] references non-existent secret key: tls.crt Apr 17 07:57:42.650561 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:42.650200 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 07:57:42.650561 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:42.650224 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk: references non-existent secret key: 
tls.crt Apr 17 07:57:42.650561 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:42.650290 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8959b38c-a149-4695-91b9-43442fb518bf-certificates podName:8959b38c-a149-4695-91b9-43442fb518bf nodeName:}" failed. No retries permitted until 2026-04-17 07:57:43.650270263 +0000 UTC m=+367.438048844 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8959b38c-a149-4695-91b9-43442fb518bf-certificates") pod "keda-metrics-apiserver-7c9f485588-8qqfk" (UID: "8959b38c-a149-4695-91b9-43442fb518bf") : references non-existent secret key: tls.crt Apr 17 07:57:42.734517 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.734480 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-6xmzs"] Apr 17 07:57:42.738225 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:57:42.738169 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3be2097_1108_4024_9d1b_dc1dd9b321b4.slice/crio-049212e039c28491c408f8b1abbc6c728428c097ae32935a86f6d30c75b58b58 WatchSource:0}: Error finding container 049212e039c28491c408f8b1abbc6c728428c097ae32935a86f6d30c75b58b58: Status 404 returned error can't find the container with id 049212e039c28491c408f8b1abbc6c728428c097ae32935a86f6d30c75b58b58 Apr 17 07:57:42.918908 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:42.918818 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-6xmzs" event={"ID":"c3be2097-1108-4024-9d1b-dc1dd9b321b4","Type":"ContainerStarted","Data":"049212e039c28491c408f8b1abbc6c728428c097ae32935a86f6d30c75b58b58"} Apr 17 07:57:43.356638 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:43.356603 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/75d997a9-a9b5-4aba-b602-b8c60a8b3696-certificates\") pod \"keda-operator-ffbb595cb-62rjc\" (UID: \"75d997a9-a9b5-4aba-b602-b8c60a8b3696\") " pod="openshift-keda/keda-operator-ffbb595cb-62rjc" Apr 17 07:57:43.356857 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:43.356786 2566 secret.go:281] references non-existent secret key: ca.crt Apr 17 07:57:43.356857 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:43.356805 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 07:57:43.356857 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:43.356815 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-62rjc: references non-existent secret key: ca.crt Apr 17 07:57:43.356988 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:43.356871 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75d997a9-a9b5-4aba-b602-b8c60a8b3696-certificates podName:75d997a9-a9b5-4aba-b602-b8c60a8b3696 nodeName:}" failed. No retries permitted until 2026-04-17 07:57:45.356855035 +0000 UTC m=+369.144633625 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/75d997a9-a9b5-4aba-b602-b8c60a8b3696-certificates") pod "keda-operator-ffbb595cb-62rjc" (UID: "75d997a9-a9b5-4aba-b602-b8c60a8b3696") : references non-existent secret key: ca.crt Apr 17 07:57:43.660353 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:43.660270 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8959b38c-a149-4695-91b9-43442fb518bf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8qqfk\" (UID: \"8959b38c-a149-4695-91b9-43442fb518bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk" Apr 17 07:57:43.660498 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:43.660451 2566 secret.go:281] references non-existent secret key: tls.crt Apr 17 07:57:43.660498 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:43.660476 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 07:57:43.660580 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:43.660502 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk: references non-existent secret key: tls.crt Apr 17 07:57:43.660580 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:43.660570 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8959b38c-a149-4695-91b9-43442fb518bf-certificates podName:8959b38c-a149-4695-91b9-43442fb518bf nodeName:}" failed. No retries permitted until 2026-04-17 07:57:45.66054934 +0000 UTC m=+369.448327926 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8959b38c-a149-4695-91b9-43442fb518bf-certificates") pod "keda-metrics-apiserver-7c9f485588-8qqfk" (UID: "8959b38c-a149-4695-91b9-43442fb518bf") : references non-existent secret key: tls.crt Apr 17 07:57:44.926315 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:44.926279 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-6xmzs" event={"ID":"c3be2097-1108-4024-9d1b-dc1dd9b321b4","Type":"ContainerStarted","Data":"b40a963a78eaeb45aae048ee92fcbd9acbf39517c47d16169d0d3d7049f45632"} Apr 17 07:57:44.926776 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:44.926427 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-6xmzs" Apr 17 07:57:44.945345 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:44.945290 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-6xmzs" podStartSLOduration=1.721867461 podStartE2EDuration="2.945272428s" podCreationTimestamp="2026-04-17 07:57:42 +0000 UTC" firstStartedPulling="2026-04-17 07:57:42.74043586 +0000 UTC m=+366.528214453" lastFinishedPulling="2026-04-17 07:57:43.963840821 +0000 UTC m=+367.751619420" observedRunningTime="2026-04-17 07:57:44.945150433 +0000 UTC m=+368.732929037" watchObservedRunningTime="2026-04-17 07:57:44.945272428 +0000 UTC m=+368.733051032" Apr 17 07:57:45.375155 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:45.375115 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/75d997a9-a9b5-4aba-b602-b8c60a8b3696-certificates\") pod \"keda-operator-ffbb595cb-62rjc\" (UID: \"75d997a9-a9b5-4aba-b602-b8c60a8b3696\") " pod="openshift-keda/keda-operator-ffbb595cb-62rjc" Apr 17 07:57:45.375338 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:45.375256 2566 secret.go:281] 
references non-existent secret key: ca.crt Apr 17 07:57:45.375338 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:45.375278 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 07:57:45.375338 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:45.375288 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-62rjc: references non-existent secret key: ca.crt Apr 17 07:57:45.375338 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:45.375339 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75d997a9-a9b5-4aba-b602-b8c60a8b3696-certificates podName:75d997a9-a9b5-4aba-b602-b8c60a8b3696 nodeName:}" failed. No retries permitted until 2026-04-17 07:57:49.375324311 +0000 UTC m=+373.163102892 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/75d997a9-a9b5-4aba-b602-b8c60a8b3696-certificates") pod "keda-operator-ffbb595cb-62rjc" (UID: "75d997a9-a9b5-4aba-b602-b8c60a8b3696") : references non-existent secret key: ca.crt Apr 17 07:57:45.677879 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:45.677782 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8959b38c-a149-4695-91b9-43442fb518bf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8qqfk\" (UID: \"8959b38c-a149-4695-91b9-43442fb518bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk" Apr 17 07:57:45.678046 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:45.677907 2566 secret.go:281] references non-existent secret key: tls.crt Apr 17 07:57:45.678046 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:45.677923 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 07:57:45.678046 ip-10-0-128-217 
kubenswrapper[2566]: E0417 07:57:45.677940 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk: references non-existent secret key: tls.crt Apr 17 07:57:45.678046 ip-10-0-128-217 kubenswrapper[2566]: E0417 07:57:45.677990 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8959b38c-a149-4695-91b9-43442fb518bf-certificates podName:8959b38c-a149-4695-91b9-43442fb518bf nodeName:}" failed. No retries permitted until 2026-04-17 07:57:49.677975507 +0000 UTC m=+373.465754088 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8959b38c-a149-4695-91b9-43442fb518bf-certificates") pod "keda-metrics-apiserver-7c9f485588-8qqfk" (UID: "8959b38c-a149-4695-91b9-43442fb518bf") : references non-existent secret key: tls.crt Apr 17 07:57:49.410638 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:49.410600 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/75d997a9-a9b5-4aba-b602-b8c60a8b3696-certificates\") pod \"keda-operator-ffbb595cb-62rjc\" (UID: \"75d997a9-a9b5-4aba-b602-b8c60a8b3696\") " pod="openshift-keda/keda-operator-ffbb595cb-62rjc" Apr 17 07:57:49.412991 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:49.412969 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/75d997a9-a9b5-4aba-b602-b8c60a8b3696-certificates\") pod \"keda-operator-ffbb595cb-62rjc\" (UID: \"75d997a9-a9b5-4aba-b602-b8c60a8b3696\") " pod="openshift-keda/keda-operator-ffbb595cb-62rjc" Apr 17 07:57:49.549553 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:49.549515 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-62rjc" Apr 17 07:57:49.669115 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:49.669092 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-62rjc"] Apr 17 07:57:49.671135 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:57:49.671108 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75d997a9_a9b5_4aba_b602_b8c60a8b3696.slice/crio-af1615d4f12d24d2b4d60f3f5dc66e8209699a05d7c0d05d8173fa7d83582a27 WatchSource:0}: Error finding container af1615d4f12d24d2b4d60f3f5dc66e8209699a05d7c0d05d8173fa7d83582a27: Status 404 returned error can't find the container with id af1615d4f12d24d2b4d60f3f5dc66e8209699a05d7c0d05d8173fa7d83582a27 Apr 17 07:57:49.714383 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:49.714354 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8959b38c-a149-4695-91b9-43442fb518bf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8qqfk\" (UID: \"8959b38c-a149-4695-91b9-43442fb518bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk" Apr 17 07:57:49.716852 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:49.716831 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8959b38c-a149-4695-91b9-43442fb518bf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8qqfk\" (UID: \"8959b38c-a149-4695-91b9-43442fb518bf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk" Apr 17 07:57:49.792595 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:49.792556 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk" Apr 17 07:57:49.910825 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:49.910796 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk"] Apr 17 07:57:49.913222 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:57:49.913195 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8959b38c_a149_4695_91b9_43442fb518bf.slice/crio-79f8b9da0426999590517ac7ea4b5ae65dcf9c5fc84eb252807796d0b20ea856 WatchSource:0}: Error finding container 79f8b9da0426999590517ac7ea4b5ae65dcf9c5fc84eb252807796d0b20ea856: Status 404 returned error can't find the container with id 79f8b9da0426999590517ac7ea4b5ae65dcf9c5fc84eb252807796d0b20ea856 Apr 17 07:57:49.950330 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:49.950239 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-62rjc" event={"ID":"75d997a9-a9b5-4aba-b602-b8c60a8b3696","Type":"ContainerStarted","Data":"af1615d4f12d24d2b4d60f3f5dc66e8209699a05d7c0d05d8173fa7d83582a27"} Apr 17 07:57:49.951192 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:49.951165 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk" event={"ID":"8959b38c-a149-4695-91b9-43442fb518bf","Type":"ContainerStarted","Data":"79f8b9da0426999590517ac7ea4b5ae65dcf9c5fc84eb252807796d0b20ea856"} Apr 17 07:57:53.966941 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:53.966838 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-62rjc" event={"ID":"75d997a9-a9b5-4aba-b602-b8c60a8b3696","Type":"ContainerStarted","Data":"84d5da12e01e452e26c34b463e065abfa770e0a3cff2a1e0d504b75273ecb28d"} Apr 17 07:57:53.967379 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:53.966981 2566 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-62rjc" Apr 17 07:57:53.968152 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:53.968127 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk" event={"ID":"8959b38c-a149-4695-91b9-43442fb518bf","Type":"ContainerStarted","Data":"6275decd9bbb0d360f8fee98dbcf517afff74a56aa9c50d0a9e22e002cdb55f7"} Apr 17 07:57:53.968299 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:53.968242 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk" Apr 17 07:57:53.982971 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:53.982929 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-62rjc" podStartSLOduration=9.05497737 podStartE2EDuration="12.982916792s" podCreationTimestamp="2026-04-17 07:57:41 +0000 UTC" firstStartedPulling="2026-04-17 07:57:49.672339307 +0000 UTC m=+373.460117891" lastFinishedPulling="2026-04-17 07:57:53.600278733 +0000 UTC m=+377.388057313" observedRunningTime="2026-04-17 07:57:53.981613623 +0000 UTC m=+377.769392225" watchObservedRunningTime="2026-04-17 07:57:53.982916792 +0000 UTC m=+377.770695394" Apr 17 07:57:53.997632 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:57:53.997579 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk" podStartSLOduration=9.311091524 podStartE2EDuration="12.997565768s" podCreationTimestamp="2026-04-17 07:57:41 +0000 UTC" firstStartedPulling="2026-04-17 07:57:49.914546053 +0000 UTC m=+373.702324634" lastFinishedPulling="2026-04-17 07:57:53.601020283 +0000 UTC m=+377.388798878" observedRunningTime="2026-04-17 07:57:53.995758136 +0000 UTC m=+377.783536736" watchObservedRunningTime="2026-04-17 07:57:53.997565768 +0000 UTC m=+377.785344376" Apr 17 07:58:02.921388 
ip-10-0-128-217 kubenswrapper[2566]: I0417 07:58:02.921352 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-q7q5v" Apr 17 07:58:04.976020 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:58:04.975986 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8qqfk" Apr 17 07:58:05.931506 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:58:05.931474 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-6xmzs" Apr 17 07:58:14.973932 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:58:14.973897 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-62rjc" Apr 17 07:59:05.557964 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:05.557864 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-bq9gz"] Apr 17 07:59:05.562516 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:05.562486 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bq9gz" Apr 17 07:59:05.566453 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:05.566427 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 07:59:05.566589 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:05.566553 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-ghqj4\"" Apr 17 07:59:05.566742 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:05.566723 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 07:59:05.567475 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:05.567452 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-bq9gz"] Apr 17 07:59:05.652909 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:05.652870 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgb6v\" (UniqueName: \"kubernetes.io/projected/84f87d67-a0cd-4783-ae58-8c3cf22700be-kube-api-access-qgb6v\") pod \"openshift-lws-operator-bfc7f696d-bq9gz\" (UID: \"84f87d67-a0cd-4783-ae58-8c3cf22700be\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bq9gz" Apr 17 07:59:05.653079 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:05.652918 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84f87d67-a0cd-4783-ae58-8c3cf22700be-tmp\") pod \"openshift-lws-operator-bfc7f696d-bq9gz\" (UID: \"84f87d67-a0cd-4783-ae58-8c3cf22700be\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bq9gz" Apr 17 07:59:05.753943 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:05.753904 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84f87d67-a0cd-4783-ae58-8c3cf22700be-tmp\") pod \"openshift-lws-operator-bfc7f696d-bq9gz\" (UID: \"84f87d67-a0cd-4783-ae58-8c3cf22700be\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bq9gz" Apr 17 07:59:05.754107 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:05.754017 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgb6v\" (UniqueName: \"kubernetes.io/projected/84f87d67-a0cd-4783-ae58-8c3cf22700be-kube-api-access-qgb6v\") pod \"openshift-lws-operator-bfc7f696d-bq9gz\" (UID: \"84f87d67-a0cd-4783-ae58-8c3cf22700be\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bq9gz" Apr 17 07:59:05.754379 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:05.754357 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84f87d67-a0cd-4783-ae58-8c3cf22700be-tmp\") pod \"openshift-lws-operator-bfc7f696d-bq9gz\" (UID: \"84f87d67-a0cd-4783-ae58-8c3cf22700be\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bq9gz" Apr 17 07:59:05.762378 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:05.762345 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgb6v\" (UniqueName: \"kubernetes.io/projected/84f87d67-a0cd-4783-ae58-8c3cf22700be-kube-api-access-qgb6v\") pod \"openshift-lws-operator-bfc7f696d-bq9gz\" (UID: \"84f87d67-a0cd-4783-ae58-8c3cf22700be\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bq9gz" Apr 17 07:59:05.887845 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:05.887759 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bq9gz" Apr 17 07:59:06.005547 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:06.005466 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-bq9gz"] Apr 17 07:59:06.008110 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:59:06.008081 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84f87d67_a0cd_4783_ae58_8c3cf22700be.slice/crio-4818c05159813a0f2a53a39381c4b5646fd4227002aa800a853bb28dd9bce834 WatchSource:0}: Error finding container 4818c05159813a0f2a53a39381c4b5646fd4227002aa800a853bb28dd9bce834: Status 404 returned error can't find the container with id 4818c05159813a0f2a53a39381c4b5646fd4227002aa800a853bb28dd9bce834 Apr 17 07:59:06.196389 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:06.196301 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bq9gz" event={"ID":"84f87d67-a0cd-4783-ae58-8c3cf22700be","Type":"ContainerStarted","Data":"4818c05159813a0f2a53a39381c4b5646fd4227002aa800a853bb28dd9bce834"} Apr 17 07:59:09.207616 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:09.207580 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bq9gz" event={"ID":"84f87d67-a0cd-4783-ae58-8c3cf22700be","Type":"ContainerStarted","Data":"7ede0757049ed06babc827c3e4d058f5c497858491726e341a1af5162a277bdb"} Apr 17 07:59:09.223841 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:09.223793 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bq9gz" podStartSLOduration=2.003029644 podStartE2EDuration="4.223777847s" podCreationTimestamp="2026-04-17 07:59:05 +0000 UTC" firstStartedPulling="2026-04-17 07:59:06.009931131 +0000 UTC m=+449.797709711" 
lastFinishedPulling="2026-04-17 07:59:08.230679333 +0000 UTC m=+452.018457914" observedRunningTime="2026-04-17 07:59:09.222639892 +0000 UTC m=+453.010418496" watchObservedRunningTime="2026-04-17 07:59:09.223777847 +0000 UTC m=+453.011556447" Apr 17 07:59:35.702241 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:35.702202 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-6n8qk"] Apr 17 07:59:35.705715 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:35.705678 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-6n8qk" Apr 17 07:59:35.708985 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:35.708961 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-kjjgn\"" Apr 17 07:59:35.709154 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:35.709134 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 17 07:59:35.709218 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:35.709181 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 17 07:59:35.709421 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:35.709396 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/789254bc-da9e-4afb-b2d3-1e7461348275-operator-config\") pod \"servicemesh-operator3-55f49c5f94-6n8qk\" (UID: \"789254bc-da9e-4afb-b2d3-1e7461348275\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-6n8qk" Apr 17 07:59:35.709483 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:35.709441 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcxnb\" 
(UniqueName: \"kubernetes.io/projected/789254bc-da9e-4afb-b2d3-1e7461348275-kube-api-access-gcxnb\") pod \"servicemesh-operator3-55f49c5f94-6n8qk\" (UID: \"789254bc-da9e-4afb-b2d3-1e7461348275\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-6n8qk" Apr 17 07:59:35.721539 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:35.721517 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-6n8qk"] Apr 17 07:59:35.809983 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:35.809953 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/789254bc-da9e-4afb-b2d3-1e7461348275-operator-config\") pod \"servicemesh-operator3-55f49c5f94-6n8qk\" (UID: \"789254bc-da9e-4afb-b2d3-1e7461348275\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-6n8qk" Apr 17 07:59:35.810160 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:35.809991 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gcxnb\" (UniqueName: \"kubernetes.io/projected/789254bc-da9e-4afb-b2d3-1e7461348275-kube-api-access-gcxnb\") pod \"servicemesh-operator3-55f49c5f94-6n8qk\" (UID: \"789254bc-da9e-4afb-b2d3-1e7461348275\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-6n8qk" Apr 17 07:59:35.812476 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:35.812446 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/789254bc-da9e-4afb-b2d3-1e7461348275-operator-config\") pod \"servicemesh-operator3-55f49c5f94-6n8qk\" (UID: \"789254bc-da9e-4afb-b2d3-1e7461348275\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-6n8qk" Apr 17 07:59:35.818810 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:35.818788 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcxnb\" (UniqueName: 
\"kubernetes.io/projected/789254bc-da9e-4afb-b2d3-1e7461348275-kube-api-access-gcxnb\") pod \"servicemesh-operator3-55f49c5f94-6n8qk\" (UID: \"789254bc-da9e-4afb-b2d3-1e7461348275\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-6n8qk" Apr 17 07:59:36.016044 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:36.015955 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-6n8qk" Apr 17 07:59:36.147518 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:36.147496 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-6n8qk"] Apr 17 07:59:36.149760 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:59:36.149727 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod789254bc_da9e_4afb_b2d3_1e7461348275.slice/crio-f49d5d96e88447b622e1f47cad8f3b11812ecb2f0938f856b89135ad7150bc6b WatchSource:0}: Error finding container f49d5d96e88447b622e1f47cad8f3b11812ecb2f0938f856b89135ad7150bc6b: Status 404 returned error can't find the container with id f49d5d96e88447b622e1f47cad8f3b11812ecb2f0938f856b89135ad7150bc6b Apr 17 07:59:36.297253 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:36.297218 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-6n8qk" event={"ID":"789254bc-da9e-4afb-b2d3-1e7461348275","Type":"ContainerStarted","Data":"f49d5d96e88447b622e1f47cad8f3b11812ecb2f0938f856b89135ad7150bc6b"} Apr 17 07:59:39.309586 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:39.309545 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-6n8qk" event={"ID":"789254bc-da9e-4afb-b2d3-1e7461348275","Type":"ContainerStarted","Data":"ccf44a3378177c214ec9d03b0c07185ccd10c608ce4220406cc63520d5ec0669"} Apr 17 07:59:39.309972 ip-10-0-128-217 
kubenswrapper[2566]: I0417 07:59:39.309678 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-6n8qk" Apr 17 07:59:39.333615 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:39.333562 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-6n8qk" podStartSLOduration=1.705546424 podStartE2EDuration="4.333543743s" podCreationTimestamp="2026-04-17 07:59:35 +0000 UTC" firstStartedPulling="2026-04-17 07:59:36.152102919 +0000 UTC m=+479.939881500" lastFinishedPulling="2026-04-17 07:59:38.780100234 +0000 UTC m=+482.567878819" observedRunningTime="2026-04-17 07:59:39.331543886 +0000 UTC m=+483.119322489" watchObservedRunningTime="2026-04-17 07:59:39.333543743 +0000 UTC m=+483.121322346" Apr 17 07:59:50.315360 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:50.315327 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-6n8qk" Apr 17 07:59:51.054383 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.054351 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7"] Apr 17 07:59:51.057517 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.057497 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.060167 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.060143 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 17 07:59:51.064719 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.061193 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 07:59:51.064719 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.061544 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-jgrf9\"" Apr 17 07:59:51.064719 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.061735 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 17 07:59:51.064719 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.062238 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 17 07:59:51.074003 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.073980 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7"] Apr 17 07:59:51.128638 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.128602 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrcz9\" (UniqueName: \"kubernetes.io/projected/e99ba96d-63d9-442e-9663-99434065a88b-kube-api-access-qrcz9\") pod \"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.128814 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.128653 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e99ba96d-63d9-442e-9663-99434065a88b-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.128814 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.128718 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e99ba96d-63d9-442e-9663-99434065a88b-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.128814 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.128750 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e99ba96d-63d9-442e-9663-99434065a88b-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.128814 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.128797 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e99ba96d-63d9-442e-9663-99434065a88b-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.128948 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.128852 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e99ba96d-63d9-442e-9663-99434065a88b-istio-csr-dns-cert\") pod 
\"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.128948 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.128898 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e99ba96d-63d9-442e-9663-99434065a88b-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.230085 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.230046 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e99ba96d-63d9-442e-9663-99434065a88b-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.230085 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.230092 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e99ba96d-63d9-442e-9663-99434065a88b-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.230341 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.230214 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrcz9\" (UniqueName: \"kubernetes.io/projected/e99ba96d-63d9-442e-9663-99434065a88b-kube-api-access-qrcz9\") pod \"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.230341 ip-10-0-128-217 kubenswrapper[2566]: I0417 
07:59:51.230275 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e99ba96d-63d9-442e-9663-99434065a88b-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.230504 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.230472 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e99ba96d-63d9-442e-9663-99434065a88b-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.230681 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.230530 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e99ba96d-63d9-442e-9663-99434065a88b-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.230681 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.230581 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e99ba96d-63d9-442e-9663-99434065a88b-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.231206 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.231180 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e99ba96d-63d9-442e-9663-99434065a88b-istio-csr-ca-configmap\") pod 
\"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.232565 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.232544 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e99ba96d-63d9-442e-9663-99434065a88b-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.232778 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.232757 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e99ba96d-63d9-442e-9663-99434065a88b-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.232992 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.232973 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e99ba96d-63d9-442e-9663-99434065a88b-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.233054 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.232986 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e99ba96d-63d9-442e-9663-99434065a88b-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.239736 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.239688 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e99ba96d-63d9-442e-9663-99434065a88b-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.240120 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.240100 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrcz9\" (UniqueName: \"kubernetes.io/projected/e99ba96d-63d9-442e-9663-99434065a88b-kube-api-access-qrcz9\") pod \"istiod-openshift-gateway-7cd77c7ffd-2wsk7\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.369196 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.369098 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:51.515500 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:51.515458 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7"] Apr 17 07:59:52.353796 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:52.353732 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" event={"ID":"e99ba96d-63d9-442e-9663-99434065a88b","Type":"ContainerStarted","Data":"6bb9efd033d79fc54a35468776a7d4d12d7096a2246d9f8679b7777e230d3e4f"} Apr 17 07:59:53.982931 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:53.982893 2566 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 07:59:53.983207 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:53.982960 2566 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 07:59:54.362617 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:54.362582 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" event={"ID":"e99ba96d-63d9-442e-9663-99434065a88b","Type":"ContainerStarted","Data":"ec653af0036f3c0dd6be593860a8abbc037c23e4a17cbf77bd33d6dca24c83b2"} Apr 17 07:59:54.362794 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:54.362692 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:54.383142 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:54.383082 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" podStartSLOduration=0.923012666 podStartE2EDuration="3.383064161s" podCreationTimestamp="2026-04-17 07:59:51 +0000 UTC" firstStartedPulling="2026-04-17 07:59:51.522587735 +0000 UTC m=+495.310366315" lastFinishedPulling="2026-04-17 07:59:53.98263923 +0000 UTC m=+497.770417810" observedRunningTime="2026-04-17 07:59:54.381733987 +0000 UTC m=+498.169512592" watchObservedRunningTime="2026-04-17 07:59:54.383064161 +0000 UTC m=+498.170842766" Apr 17 07:59:55.368060 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:55.368028 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 07:59:58.090851 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.090808 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t"] Apr 17 07:59:58.094525 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.094503 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.098918 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.098891 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-zbtgn\"" Apr 17 07:59:58.141344 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.141305 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t"] Apr 17 07:59:58.197942 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.197897 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b23ca74c-0031-4851-93e2-a6d737874ca7-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.197942 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.197945 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/b23ca74c-0031-4851-93e2-a6d737874ca7-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.198190 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.197965 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/b23ca74c-0031-4851-93e2-a6d737874ca7-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.198190 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.198031 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/b23ca74c-0031-4851-93e2-a6d737874ca7-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.198190 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.198070 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/b23ca74c-0031-4851-93e2-a6d737874ca7-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.198190 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.198092 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/b23ca74c-0031-4851-93e2-a6d737874ca7-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.198190 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.198111 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b23ca74c-0031-4851-93e2-a6d737874ca7-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.198190 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.198132 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/b23ca74c-0031-4851-93e2-a6d737874ca7-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.198190 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.198153 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkvdp\" (UniqueName: \"kubernetes.io/projected/b23ca74c-0031-4851-93e2-a6d737874ca7-kube-api-access-lkvdp\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.298762 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.298728 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/b23ca74c-0031-4851-93e2-a6d737874ca7-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.298762 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.298768 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/b23ca74c-0031-4851-93e2-a6d737874ca7-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.299013 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.298793 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b23ca74c-0031-4851-93e2-a6d737874ca7-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.299013 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.298823 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/b23ca74c-0031-4851-93e2-a6d737874ca7-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.299013 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.298842 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lkvdp\" (UniqueName: \"kubernetes.io/projected/b23ca74c-0031-4851-93e2-a6d737874ca7-kube-api-access-lkvdp\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.299013 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.298896 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b23ca74c-0031-4851-93e2-a6d737874ca7-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 
07:59:58.299013 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.298928 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/b23ca74c-0031-4851-93e2-a6d737874ca7-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.299013 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.298956 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/b23ca74c-0031-4851-93e2-a6d737874ca7-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.299316 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.299038 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/b23ca74c-0031-4851-93e2-a6d737874ca7-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.299412 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.299387 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/b23ca74c-0031-4851-93e2-a6d737874ca7-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.300025 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.299763 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/b23ca74c-0031-4851-93e2-a6d737874ca7-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.300025 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.299841 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/b23ca74c-0031-4851-93e2-a6d737874ca7-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.300025 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.299495 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/b23ca74c-0031-4851-93e2-a6d737874ca7-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.300341 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.300318 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/b23ca74c-0031-4851-93e2-a6d737874ca7-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.302889 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.302842 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/b23ca74c-0031-4851-93e2-a6d737874ca7-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.303304 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.303281 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b23ca74c-0031-4851-93e2-a6d737874ca7-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.308007 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.307984 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b23ca74c-0031-4851-93e2-a6d737874ca7-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.308501 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.308477 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkvdp\" (UniqueName: \"kubernetes.io/projected/b23ca74c-0031-4851-93e2-a6d737874ca7-kube-api-access-lkvdp\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-22t5t\" (UID: \"b23ca74c-0031-4851-93e2-a6d737874ca7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.407093 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.406997 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 07:59:58.532365 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:58.532338 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t"] Apr 17 07:59:58.534993 ip-10-0-128-217 kubenswrapper[2566]: W0417 07:59:58.534955 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb23ca74c_0031_4851_93e2_a6d737874ca7.slice/crio-981d870381a95295e273e6413edf437db27dc3556a020cd1cc5c3a5085577fce WatchSource:0}: Error finding container 981d870381a95295e273e6413edf437db27dc3556a020cd1cc5c3a5085577fce: Status 404 returned error can't find the container with id 981d870381a95295e273e6413edf437db27dc3556a020cd1cc5c3a5085577fce Apr 17 07:59:59.384320 ip-10-0-128-217 kubenswrapper[2566]: I0417 07:59:59.384284 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" event={"ID":"b23ca74c-0031-4851-93e2-a6d737874ca7","Type":"ContainerStarted","Data":"981d870381a95295e273e6413edf437db27dc3556a020cd1cc5c3a5085577fce"} Apr 17 08:00:01.269142 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:01.269106 2566 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 08:00:01.269410 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:01.269180 2566 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 08:00:01.269410 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:01.269231 2566 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 08:00:01.404541 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:01.404502 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" event={"ID":"b23ca74c-0031-4851-93e2-a6d737874ca7","Type":"ContainerStarted","Data":"c50769e77ce308c65d45c914a2d31af7a76ce5676b6671b5f367b47b7cd01e17"} Apr 17 08:00:01.407857 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:01.407822 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 08:00:01.426198 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:01.426147 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" podStartSLOduration=0.694687014 podStartE2EDuration="3.426130569s" podCreationTimestamp="2026-04-17 07:59:58 +0000 UTC" firstStartedPulling="2026-04-17 07:59:58.537416518 +0000 UTC m=+502.325195098" lastFinishedPulling="2026-04-17 08:00:01.268860072 +0000 UTC m=+505.056638653" observedRunningTime="2026-04-17 08:00:01.423675296 +0000 UTC m=+505.211453904" watchObservedRunningTime="2026-04-17 08:00:01.426130569 +0000 UTC m=+505.213909172" Apr 17 08:00:02.411778 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:02.411752 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 08:00:03.411392 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:03.411358 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 08:00:03.412307 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:03.412286 2566 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-22t5t" Apr 17 08:00:06.694670 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.694635 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-859974d654-wrtjn"] Apr 17 08:00:06.698393 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.698359 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.709856 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.709829 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-859974d654-wrtjn"] Apr 17 08:00:06.776636 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.776598 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70d0b567-6353-4d11-b0e2-d07d0a107d59-trusted-ca-bundle\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.776827 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.776649 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/70d0b567-6353-4d11-b0e2-d07d0a107d59-service-ca\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.776827 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.776714 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/70d0b567-6353-4d11-b0e2-d07d0a107d59-console-oauth-config\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " 
pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.776827 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.776798 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/70d0b567-6353-4d11-b0e2-d07d0a107d59-oauth-serving-cert\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.776985 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.776837 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/70d0b567-6353-4d11-b0e2-d07d0a107d59-console-config\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.776985 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.776861 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8brmc\" (UniqueName: \"kubernetes.io/projected/70d0b567-6353-4d11-b0e2-d07d0a107d59-kube-api-access-8brmc\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.776985 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.776905 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/70d0b567-6353-4d11-b0e2-d07d0a107d59-console-serving-cert\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.877638 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.877601 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/70d0b567-6353-4d11-b0e2-d07d0a107d59-console-oauth-config\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.877852 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.877653 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/70d0b567-6353-4d11-b0e2-d07d0a107d59-oauth-serving-cert\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.877852 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.877681 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/70d0b567-6353-4d11-b0e2-d07d0a107d59-console-config\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.877852 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.877730 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8brmc\" (UniqueName: \"kubernetes.io/projected/70d0b567-6353-4d11-b0e2-d07d0a107d59-kube-api-access-8brmc\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.877852 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.877746 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/70d0b567-6353-4d11-b0e2-d07d0a107d59-console-serving-cert\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.877852 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.877797 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70d0b567-6353-4d11-b0e2-d07d0a107d59-trusted-ca-bundle\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.878209 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.878182 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/70d0b567-6353-4d11-b0e2-d07d0a107d59-service-ca\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.878507 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.878485 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/70d0b567-6353-4d11-b0e2-d07d0a107d59-oauth-serving-cert\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.878583 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.878516 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/70d0b567-6353-4d11-b0e2-d07d0a107d59-console-config\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.878630 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.878597 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70d0b567-6353-4d11-b0e2-d07d0a107d59-trusted-ca-bundle\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.878918 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.878894 
2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/70d0b567-6353-4d11-b0e2-d07d0a107d59-service-ca\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.880303 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.880272 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/70d0b567-6353-4d11-b0e2-d07d0a107d59-console-serving-cert\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.880392 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.880317 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/70d0b567-6353-4d11-b0e2-d07d0a107d59-console-oauth-config\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:06.885972 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:06.885952 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8brmc\" (UniqueName: \"kubernetes.io/projected/70d0b567-6353-4d11-b0e2-d07d0a107d59-kube-api-access-8brmc\") pod \"console-859974d654-wrtjn\" (UID: \"70d0b567-6353-4d11-b0e2-d07d0a107d59\") " pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:07.009403 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:07.009296 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:07.133053 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:07.133007 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-859974d654-wrtjn"] Apr 17 08:00:07.135425 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:00:07.135396 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70d0b567_6353_4d11_b0e2_d07d0a107d59.slice/crio-dbf78e7a3e351c31491dd886e4bbde6481731da4353d2ffb296bcda78c9ef129 WatchSource:0}: Error finding container dbf78e7a3e351c31491dd886e4bbde6481731da4353d2ffb296bcda78c9ef129: Status 404 returned error can't find the container with id dbf78e7a3e351c31491dd886e4bbde6481731da4353d2ffb296bcda78c9ef129 Apr 17 08:00:07.426865 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:07.426830 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-859974d654-wrtjn" event={"ID":"70d0b567-6353-4d11-b0e2-d07d0a107d59","Type":"ContainerStarted","Data":"441f06fa5a04f4b2abd9504fa541d0ac0a663562043c148c930eeacb1689e8c7"} Apr 17 08:00:07.426865 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:07.426865 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-859974d654-wrtjn" event={"ID":"70d0b567-6353-4d11-b0e2-d07d0a107d59","Type":"ContainerStarted","Data":"dbf78e7a3e351c31491dd886e4bbde6481731da4353d2ffb296bcda78c9ef129"} Apr 17 08:00:07.446067 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:07.446015 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-859974d654-wrtjn" podStartSLOduration=1.445998815 podStartE2EDuration="1.445998815s" podCreationTimestamp="2026-04-17 08:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:00:07.444500208 +0000 UTC 
m=+511.232278812" watchObservedRunningTime="2026-04-17 08:00:07.445998815 +0000 UTC m=+511.233777418" Apr 17 08:00:17.010411 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:17.010377 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:17.010834 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:17.010537 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:17.015438 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:17.015413 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:17.465182 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:17.465152 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-859974d654-wrtjn" Apr 17 08:00:17.508468 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:17.508424 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f644f56d7-k6ckt"] Apr 17 08:00:26.614849 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:26.614813 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-shxl7"] Apr 17 08:00:26.623771 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:26.623750 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-shxl7" Apr 17 08:00:26.626675 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:26.626649 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 08:00:26.627772 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:26.627612 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-n76h6\"" Apr 17 08:00:26.627772 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:26.627635 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 08:00:26.628778 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:26.628755 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-shxl7"] Apr 17 08:00:26.754235 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:26.754181 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d625\" (UniqueName: \"kubernetes.io/projected/5924881c-7eaf-4d8e-8d87-5c56cda01132-kube-api-access-2d625\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-shxl7\" (UID: \"5924881c-7eaf-4d8e-8d87-5c56cda01132\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-shxl7" Apr 17 08:00:26.754423 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:26.754264 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5924881c-7eaf-4d8e-8d87-5c56cda01132-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-shxl7\" (UID: \"5924881c-7eaf-4d8e-8d87-5c56cda01132\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-shxl7" Apr 17 08:00:26.854765 
ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:26.854730 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5924881c-7eaf-4d8e-8d87-5c56cda01132-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-shxl7\" (UID: \"5924881c-7eaf-4d8e-8d87-5c56cda01132\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-shxl7" Apr 17 08:00:26.854950 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:26.854814 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2d625\" (UniqueName: \"kubernetes.io/projected/5924881c-7eaf-4d8e-8d87-5c56cda01132-kube-api-access-2d625\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-shxl7\" (UID: \"5924881c-7eaf-4d8e-8d87-5c56cda01132\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-shxl7" Apr 17 08:00:26.855171 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:26.855147 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5924881c-7eaf-4d8e-8d87-5c56cda01132-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-shxl7\" (UID: \"5924881c-7eaf-4d8e-8d87-5c56cda01132\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-shxl7" Apr 17 08:00:26.863467 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:26.863441 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d625\" (UniqueName: \"kubernetes.io/projected/5924881c-7eaf-4d8e-8d87-5c56cda01132-kube-api-access-2d625\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-shxl7\" (UID: \"5924881c-7eaf-4d8e-8d87-5c56cda01132\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-shxl7" Apr 17 08:00:26.935983 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:26.935895 2566 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-shxl7" Apr 17 08:00:27.098396 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:27.098354 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-shxl7"] Apr 17 08:00:27.101084 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:00:27.101053 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5924881c_7eaf_4d8e_8d87_5c56cda01132.slice/crio-a26cb6e732dfe83398f6c3cfdd2554fc9e343a46c760cc33d88b3280ee7d44c0 WatchSource:0}: Error finding container a26cb6e732dfe83398f6c3cfdd2554fc9e343a46c760cc33d88b3280ee7d44c0: Status 404 returned error can't find the container with id a26cb6e732dfe83398f6c3cfdd2554fc9e343a46c760cc33d88b3280ee7d44c0 Apr 17 08:00:27.499674 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:27.499637 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-shxl7" event={"ID":"5924881c-7eaf-4d8e-8d87-5c56cda01132","Type":"ContainerStarted","Data":"a26cb6e732dfe83398f6c3cfdd2554fc9e343a46c760cc33d88b3280ee7d44c0"} Apr 17 08:00:33.525328 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:33.525278 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-shxl7" event={"ID":"5924881c-7eaf-4d8e-8d87-5c56cda01132","Type":"ContainerStarted","Data":"e3c8367694455f981d82390d5dc43eac6621f0eab7417d5c313919d2d9bf5202"} Apr 17 08:00:33.525328 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:33.525328 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-shxl7" Apr 17 08:00:33.551925 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:33.551872 2566 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-shxl7" podStartSLOduration=2.035810317 podStartE2EDuration="7.551842222s" podCreationTimestamp="2026-04-17 08:00:26 +0000 UTC" firstStartedPulling="2026-04-17 08:00:27.103410521 +0000 UTC m=+530.891189102" lastFinishedPulling="2026-04-17 08:00:32.619442426 +0000 UTC m=+536.407221007" observedRunningTime="2026-04-17 08:00:33.551319373 +0000 UTC m=+537.339097966" watchObservedRunningTime="2026-04-17 08:00:33.551842222 +0000 UTC m=+537.339620826" Apr 17 08:00:35.169494 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:35.169453 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-dcs2n"] Apr 17 08:00:35.173224 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:35.173194 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-dcs2n" Apr 17 08:00:35.175618 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:35.175590 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 17 08:00:35.175618 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:35.175593 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 17 08:00:35.175932 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:35.175917 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-jxfvx\"" Apr 17 08:00:35.181918 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:35.181885 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-dcs2n"] Apr 17 08:00:35.337056 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:35.337020 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/fd2ce257-5495-4514-a17d-3c85c0dcb68f-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-dcs2n\" (UID: \"fd2ce257-5495-4514-a17d-3c85c0dcb68f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-dcs2n" Apr 17 08:00:35.337214 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:35.337095 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fd2ce257-5495-4514-a17d-3c85c0dcb68f-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-dcs2n\" (UID: \"fd2ce257-5495-4514-a17d-3c85c0dcb68f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-dcs2n" Apr 17 08:00:35.337214 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:35.337136 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq9gr\" (UniqueName: \"kubernetes.io/projected/fd2ce257-5495-4514-a17d-3c85c0dcb68f-kube-api-access-gq9gr\") pod \"kuadrant-console-plugin-6c886788f8-dcs2n\" (UID: \"fd2ce257-5495-4514-a17d-3c85c0dcb68f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-dcs2n" Apr 17 08:00:35.438380 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:35.438289 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fd2ce257-5495-4514-a17d-3c85c0dcb68f-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-dcs2n\" (UID: \"fd2ce257-5495-4514-a17d-3c85c0dcb68f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-dcs2n" Apr 17 08:00:35.438380 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:35.438363 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gq9gr\" (UniqueName: \"kubernetes.io/projected/fd2ce257-5495-4514-a17d-3c85c0dcb68f-kube-api-access-gq9gr\") pod \"kuadrant-console-plugin-6c886788f8-dcs2n\" (UID: \"fd2ce257-5495-4514-a17d-3c85c0dcb68f\") " 
pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-dcs2n" Apr 17 08:00:35.438605 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:35.438443 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd2ce257-5495-4514-a17d-3c85c0dcb68f-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-dcs2n\" (UID: \"fd2ce257-5495-4514-a17d-3c85c0dcb68f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-dcs2n" Apr 17 08:00:35.439083 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:35.439055 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fd2ce257-5495-4514-a17d-3c85c0dcb68f-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-dcs2n\" (UID: \"fd2ce257-5495-4514-a17d-3c85c0dcb68f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-dcs2n" Apr 17 08:00:35.440879 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:35.440859 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd2ce257-5495-4514-a17d-3c85c0dcb68f-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-dcs2n\" (UID: \"fd2ce257-5495-4514-a17d-3c85c0dcb68f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-dcs2n" Apr 17 08:00:35.447311 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:35.447284 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq9gr\" (UniqueName: \"kubernetes.io/projected/fd2ce257-5495-4514-a17d-3c85c0dcb68f-kube-api-access-gq9gr\") pod \"kuadrant-console-plugin-6c886788f8-dcs2n\" (UID: \"fd2ce257-5495-4514-a17d-3c85c0dcb68f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-dcs2n" Apr 17 08:00:35.481977 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:35.481938 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-dcs2n" Apr 17 08:00:35.611082 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:35.611055 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-dcs2n"] Apr 17 08:00:35.613807 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:00:35.613779 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd2ce257_5495_4514_a17d_3c85c0dcb68f.slice/crio-f8e0e63a16d492e2b653f98deabe415bb1fa9d45e15ead5abaa001de7f190c00 WatchSource:0}: Error finding container f8e0e63a16d492e2b653f98deabe415bb1fa9d45e15ead5abaa001de7f190c00: Status 404 returned error can't find the container with id f8e0e63a16d492e2b653f98deabe415bb1fa9d45e15ead5abaa001de7f190c00 Apr 17 08:00:36.537044 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:36.537004 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-dcs2n" event={"ID":"fd2ce257-5495-4514-a17d-3c85c0dcb68f","Type":"ContainerStarted","Data":"f8e0e63a16d492e2b653f98deabe415bb1fa9d45e15ead5abaa001de7f190c00"} Apr 17 08:00:40.558665 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:40.558615 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-dcs2n" event={"ID":"fd2ce257-5495-4514-a17d-3c85c0dcb68f","Type":"ContainerStarted","Data":"6018bbd7dee0b5c34ca00b55dad56acb90872342125583cae902c8b604b070ed"} Apr 17 08:00:40.576471 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:40.576418 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-dcs2n" podStartSLOduration=0.8604326 podStartE2EDuration="5.576402904s" podCreationTimestamp="2026-04-17 08:00:35 +0000 UTC" firstStartedPulling="2026-04-17 08:00:35.615153002 +0000 UTC m=+539.402931587" lastFinishedPulling="2026-04-17 
08:00:40.331123304 +0000 UTC m=+544.118901891" observedRunningTime="2026-04-17 08:00:40.574482693 +0000 UTC m=+544.362261321" watchObservedRunningTime="2026-04-17 08:00:40.576402904 +0000 UTC m=+544.364181507" Apr 17 08:00:42.530785 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.530719 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-f644f56d7-k6ckt" podUID="9d1291c3-e5f4-4e41-80eb-87e90c8796d0" containerName="console" containerID="cri-o://95b1654247b964227840c1057158f81ef53fa104f3cff8fc0ad06f87eab7d043" gracePeriod=15 Apr 17 08:00:42.776162 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.776137 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f644f56d7-k6ckt_9d1291c3-e5f4-4e41-80eb-87e90c8796d0/console/0.log" Apr 17 08:00:42.776296 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.776199 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 08:00:42.810420 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.810335 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-console-serving-cert\") pod \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " Apr 17 08:00:42.810420 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.810405 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-trusted-ca-bundle\") pod \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " Apr 17 08:00:42.810626 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.810552 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-oauth-serving-cert\") pod \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " Apr 17 08:00:42.810691 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.810623 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-console-config\") pod \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " Apr 17 08:00:42.810691 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.810659 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-service-ca\") pod \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " Apr 17 08:00:42.810811 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.810723 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-console-oauth-config\") pod \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " Apr 17 08:00:42.810811 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.810761 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd7z2\" (UniqueName: \"kubernetes.io/projected/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-kube-api-access-sd7z2\") pod \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\" (UID: \"9d1291c3-e5f4-4e41-80eb-87e90c8796d0\") " Apr 17 08:00:42.811064 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.810922 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod 
"9d1291c3-e5f4-4e41-80eb-87e90c8796d0" (UID: "9d1291c3-e5f4-4e41-80eb-87e90c8796d0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:00:42.811064 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.810998 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-console-config" (OuterVolumeSpecName: "console-config") pod "9d1291c3-e5f4-4e41-80eb-87e90c8796d0" (UID: "9d1291c3-e5f4-4e41-80eb-87e90c8796d0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:00:42.811064 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.811029 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-service-ca" (OuterVolumeSpecName: "service-ca") pod "9d1291c3-e5f4-4e41-80eb-87e90c8796d0" (UID: "9d1291c3-e5f4-4e41-80eb-87e90c8796d0"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:00:42.811262 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.811108 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-oauth-serving-cert\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:00:42.811262 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.811125 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-console-config\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:00:42.811262 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.811168 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9d1291c3-e5f4-4e41-80eb-87e90c8796d0" (UID: "9d1291c3-e5f4-4e41-80eb-87e90c8796d0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:00:42.812690 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.812668 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9d1291c3-e5f4-4e41-80eb-87e90c8796d0" (UID: "9d1291c3-e5f4-4e41-80eb-87e90c8796d0"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:00:42.813369 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.813346 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9d1291c3-e5f4-4e41-80eb-87e90c8796d0" (UID: "9d1291c3-e5f4-4e41-80eb-87e90c8796d0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:00:42.813445 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.813350 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-kube-api-access-sd7z2" (OuterVolumeSpecName: "kube-api-access-sd7z2") pod "9d1291c3-e5f4-4e41-80eb-87e90c8796d0" (UID: "9d1291c3-e5f4-4e41-80eb-87e90c8796d0"). InnerVolumeSpecName "kube-api-access-sd7z2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:00:42.912019 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.911985 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-console-oauth-config\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:00:42.912019 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.912013 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sd7z2\" (UniqueName: \"kubernetes.io/projected/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-kube-api-access-sd7z2\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:00:42.912019 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.912023 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-console-serving-cert\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:00:42.912239 
ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.912032 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-trusted-ca-bundle\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:00:42.912239 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:42.912042 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d1291c3-e5f4-4e41-80eb-87e90c8796d0-service-ca\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:00:43.570497 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:43.570468 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f644f56d7-k6ckt_9d1291c3-e5f4-4e41-80eb-87e90c8796d0/console/0.log" Apr 17 08:00:43.570942 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:43.570509 2566 generic.go:358] "Generic (PLEG): container finished" podID="9d1291c3-e5f4-4e41-80eb-87e90c8796d0" containerID="95b1654247b964227840c1057158f81ef53fa104f3cff8fc0ad06f87eab7d043" exitCode=2 Apr 17 08:00:43.570942 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:43.570564 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f644f56d7-k6ckt" event={"ID":"9d1291c3-e5f4-4e41-80eb-87e90c8796d0","Type":"ContainerDied","Data":"95b1654247b964227840c1057158f81ef53fa104f3cff8fc0ad06f87eab7d043"} Apr 17 08:00:43.570942 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:43.570591 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f644f56d7-k6ckt" Apr 17 08:00:43.570942 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:43.570605 2566 scope.go:117] "RemoveContainer" containerID="95b1654247b964227840c1057158f81ef53fa104f3cff8fc0ad06f87eab7d043" Apr 17 08:00:43.570942 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:43.570593 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f644f56d7-k6ckt" event={"ID":"9d1291c3-e5f4-4e41-80eb-87e90c8796d0","Type":"ContainerDied","Data":"5af209709c73450a23d3690799b6011ab159627d52922af354e4f8f8df98eda9"} Apr 17 08:00:43.579210 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:43.579191 2566 scope.go:117] "RemoveContainer" containerID="95b1654247b964227840c1057158f81ef53fa104f3cff8fc0ad06f87eab7d043" Apr 17 08:00:43.579445 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:00:43.579425 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95b1654247b964227840c1057158f81ef53fa104f3cff8fc0ad06f87eab7d043\": container with ID starting with 95b1654247b964227840c1057158f81ef53fa104f3cff8fc0ad06f87eab7d043 not found: ID does not exist" containerID="95b1654247b964227840c1057158f81ef53fa104f3cff8fc0ad06f87eab7d043" Apr 17 08:00:43.579488 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:43.579454 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b1654247b964227840c1057158f81ef53fa104f3cff8fc0ad06f87eab7d043"} err="failed to get container status \"95b1654247b964227840c1057158f81ef53fa104f3cff8fc0ad06f87eab7d043\": rpc error: code = NotFound desc = could not find container \"95b1654247b964227840c1057158f81ef53fa104f3cff8fc0ad06f87eab7d043\": container with ID starting with 95b1654247b964227840c1057158f81ef53fa104f3cff8fc0ad06f87eab7d043 not found: ID does not exist" Apr 17 08:00:43.595608 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:43.595578 2566 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f644f56d7-k6ckt"] Apr 17 08:00:43.597669 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:43.597649 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f644f56d7-k6ckt"] Apr 17 08:00:44.531066 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:44.531035 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-shxl7" Apr 17 08:00:44.808066 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:00:44.807970 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d1291c3-e5f4-4e41-80eb-87e90c8796d0" path="/var/lib/kubelet/pods/9d1291c3-e5f4-4e41-80eb-87e90c8796d0/volumes" Apr 17 08:01:16.048096 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:16.048059 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fwh77"] Apr 17 08:01:16.048571 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:16.048429 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d1291c3-e5f4-4e41-80eb-87e90c8796d0" containerName="console" Apr 17 08:01:16.048571 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:16.048440 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d1291c3-e5f4-4e41-80eb-87e90c8796d0" containerName="console" Apr 17 08:01:16.048571 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:16.048492 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d1291c3-e5f4-4e41-80eb-87e90c8796d0" containerName="console" Apr 17 08:01:16.050514 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:16.050496 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-fwh77" Apr 17 08:01:16.053235 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:16.053214 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 08:01:16.058271 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:16.058251 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fwh77"] Apr 17 08:01:16.146240 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:16.146208 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fwh77"] Apr 17 08:01:16.207119 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:16.207085 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5fea4445-6e62-4d55-849a-632b44ba58e3-config-file\") pod \"limitador-limitador-64c8f475fb-fwh77\" (UID: \"5fea4445-6e62-4d55-849a-632b44ba58e3\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-fwh77" Apr 17 08:01:16.207277 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:16.207151 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x4nh\" (UniqueName: \"kubernetes.io/projected/5fea4445-6e62-4d55-849a-632b44ba58e3-kube-api-access-2x4nh\") pod \"limitador-limitador-64c8f475fb-fwh77\" (UID: \"5fea4445-6e62-4d55-849a-632b44ba58e3\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-fwh77" Apr 17 08:01:16.307611 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:16.307516 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5fea4445-6e62-4d55-849a-632b44ba58e3-config-file\") pod \"limitador-limitador-64c8f475fb-fwh77\" (UID: \"5fea4445-6e62-4d55-849a-632b44ba58e3\") " 
pod="kuadrant-system/limitador-limitador-64c8f475fb-fwh77" Apr 17 08:01:16.307611 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:16.307604 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2x4nh\" (UniqueName: \"kubernetes.io/projected/5fea4445-6e62-4d55-849a-632b44ba58e3-kube-api-access-2x4nh\") pod \"limitador-limitador-64c8f475fb-fwh77\" (UID: \"5fea4445-6e62-4d55-849a-632b44ba58e3\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-fwh77" Apr 17 08:01:16.308142 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:16.308123 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5fea4445-6e62-4d55-849a-632b44ba58e3-config-file\") pod \"limitador-limitador-64c8f475fb-fwh77\" (UID: \"5fea4445-6e62-4d55-849a-632b44ba58e3\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-fwh77" Apr 17 08:01:16.323309 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:16.323278 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x4nh\" (UniqueName: \"kubernetes.io/projected/5fea4445-6e62-4d55-849a-632b44ba58e3-kube-api-access-2x4nh\") pod \"limitador-limitador-64c8f475fb-fwh77\" (UID: \"5fea4445-6e62-4d55-849a-632b44ba58e3\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-fwh77" Apr 17 08:01:16.361579 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:16.361548 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-fwh77" Apr 17 08:01:16.489106 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:16.489079 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fwh77"] Apr 17 08:01:16.491285 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:01:16.491253 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fea4445_6e62_4d55_849a_632b44ba58e3.slice/crio-c6845b26a2e1dfa9c69413dedd5dd401b0fb3cf7746a6305158cd641c3f88eb4 WatchSource:0}: Error finding container c6845b26a2e1dfa9c69413dedd5dd401b0fb3cf7746a6305158cd641c3f88eb4: Status 404 returned error can't find the container with id c6845b26a2e1dfa9c69413dedd5dd401b0fb3cf7746a6305158cd641c3f88eb4 Apr 17 08:01:16.688808 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:16.688729 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-fwh77" event={"ID":"5fea4445-6e62-4d55-849a-632b44ba58e3","Type":"ContainerStarted","Data":"c6845b26a2e1dfa9c69413dedd5dd401b0fb3cf7746a6305158cd641c3f88eb4"} Apr 17 08:01:17.693587 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:17.693548 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-fwh77" event={"ID":"5fea4445-6e62-4d55-849a-632b44ba58e3","Type":"ContainerStarted","Data":"6b4081b8102aa6afe28a58e3f4d9d498079ec0fd7844df3da9fec6bea2ec469b"} Apr 17 08:01:17.694008 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:17.693634 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-fwh77" Apr 17 08:01:17.715167 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:17.715117 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-fwh77" podStartSLOduration=0.661738946 
podStartE2EDuration="1.715101894s" podCreationTimestamp="2026-04-17 08:01:16 +0000 UTC" firstStartedPulling="2026-04-17 08:01:16.493192263 +0000 UTC m=+580.280970844" lastFinishedPulling="2026-04-17 08:01:17.546555211 +0000 UTC m=+581.334333792" observedRunningTime="2026-04-17 08:01:17.713046859 +0000 UTC m=+581.500825463" watchObservedRunningTime="2026-04-17 08:01:17.715101894 +0000 UTC m=+581.502880496" Apr 17 08:01:28.698410 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:28.698375 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-fwh77" Apr 17 08:01:32.555549 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:32.555510 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fwh77"] Apr 17 08:01:32.556016 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:32.555837 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-fwh77" podUID="5fea4445-6e62-4d55-849a-632b44ba58e3" containerName="limitador" containerID="cri-o://6b4081b8102aa6afe28a58e3f4d9d498079ec0fd7844df3da9fec6bea2ec469b" gracePeriod=30 Apr 17 08:01:33.097576 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:33.097553 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-fwh77" Apr 17 08:01:33.156441 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:33.156353 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5fea4445-6e62-4d55-849a-632b44ba58e3-config-file\") pod \"5fea4445-6e62-4d55-849a-632b44ba58e3\" (UID: \"5fea4445-6e62-4d55-849a-632b44ba58e3\") " Apr 17 08:01:33.156441 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:33.156441 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x4nh\" (UniqueName: \"kubernetes.io/projected/5fea4445-6e62-4d55-849a-632b44ba58e3-kube-api-access-2x4nh\") pod \"5fea4445-6e62-4d55-849a-632b44ba58e3\" (UID: \"5fea4445-6e62-4d55-849a-632b44ba58e3\") " Apr 17 08:01:33.156736 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:33.156686 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fea4445-6e62-4d55-849a-632b44ba58e3-config-file" (OuterVolumeSpecName: "config-file") pod "5fea4445-6e62-4d55-849a-632b44ba58e3" (UID: "5fea4445-6e62-4d55-849a-632b44ba58e3"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:01:33.158668 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:33.158638 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fea4445-6e62-4d55-849a-632b44ba58e3-kube-api-access-2x4nh" (OuterVolumeSpecName: "kube-api-access-2x4nh") pod "5fea4445-6e62-4d55-849a-632b44ba58e3" (UID: "5fea4445-6e62-4d55-849a-632b44ba58e3"). InnerVolumeSpecName "kube-api-access-2x4nh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:01:33.257545 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:33.257507 2566 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5fea4445-6e62-4d55-849a-632b44ba58e3-config-file\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:01:33.257545 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:33.257543 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2x4nh\" (UniqueName: \"kubernetes.io/projected/5fea4445-6e62-4d55-849a-632b44ba58e3-kube-api-access-2x4nh\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:01:33.752623 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:33.752586 2566 generic.go:358] "Generic (PLEG): container finished" podID="5fea4445-6e62-4d55-849a-632b44ba58e3" containerID="6b4081b8102aa6afe28a58e3f4d9d498079ec0fd7844df3da9fec6bea2ec469b" exitCode=0 Apr 17 08:01:33.753082 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:33.752657 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-fwh77" Apr 17 08:01:33.753082 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:33.752663 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-fwh77" event={"ID":"5fea4445-6e62-4d55-849a-632b44ba58e3","Type":"ContainerDied","Data":"6b4081b8102aa6afe28a58e3f4d9d498079ec0fd7844df3da9fec6bea2ec469b"} Apr 17 08:01:33.753082 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:33.752723 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-fwh77" event={"ID":"5fea4445-6e62-4d55-849a-632b44ba58e3","Type":"ContainerDied","Data":"c6845b26a2e1dfa9c69413dedd5dd401b0fb3cf7746a6305158cd641c3f88eb4"} Apr 17 08:01:33.753082 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:33.752739 2566 scope.go:117] "RemoveContainer" containerID="6b4081b8102aa6afe28a58e3f4d9d498079ec0fd7844df3da9fec6bea2ec469b" Apr 17 08:01:33.761162 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:33.761138 2566 scope.go:117] "RemoveContainer" containerID="6b4081b8102aa6afe28a58e3f4d9d498079ec0fd7844df3da9fec6bea2ec469b" Apr 17 08:01:33.761398 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:01:33.761380 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b4081b8102aa6afe28a58e3f4d9d498079ec0fd7844df3da9fec6bea2ec469b\": container with ID starting with 6b4081b8102aa6afe28a58e3f4d9d498079ec0fd7844df3da9fec6bea2ec469b not found: ID does not exist" containerID="6b4081b8102aa6afe28a58e3f4d9d498079ec0fd7844df3da9fec6bea2ec469b" Apr 17 08:01:33.761484 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:33.761405 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b4081b8102aa6afe28a58e3f4d9d498079ec0fd7844df3da9fec6bea2ec469b"} err="failed to get container status 
\"6b4081b8102aa6afe28a58e3f4d9d498079ec0fd7844df3da9fec6bea2ec469b\": rpc error: code = NotFound desc = could not find container \"6b4081b8102aa6afe28a58e3f4d9d498079ec0fd7844df3da9fec6bea2ec469b\": container with ID starting with 6b4081b8102aa6afe28a58e3f4d9d498079ec0fd7844df3da9fec6bea2ec469b not found: ID does not exist" Apr 17 08:01:33.774749 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:33.774719 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fwh77"] Apr 17 08:01:33.794411 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:33.794387 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fwh77"] Apr 17 08:01:34.808395 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:34.808361 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fea4445-6e62-4d55-849a-632b44ba58e3" path="/var/lib/kubelet/pods/5fea4445-6e62-4d55-849a-632b44ba58e3/volumes" Apr 17 08:01:36.742780 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:36.742748 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2blbh_69ef651a-e48e-4389-a551-48cc6423bcd0/console-operator/1.log" Apr 17 08:01:36.743222 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:36.742945 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2blbh_69ef651a-e48e-4389-a551-48cc6423bcd0/console-operator/1.log" Apr 17 08:01:36.747980 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:36.747958 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/ovn-acl-logging/0.log" Apr 17 08:01:36.748094 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:36.747957 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/ovn-acl-logging/0.log" Apr 17 08:01:51.701029 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.700939 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc"] Apr 17 08:01:51.701568 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.701297 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5fea4445-6e62-4d55-849a-632b44ba58e3" containerName="limitador" Apr 17 08:01:51.701568 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.701307 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fea4445-6e62-4d55-849a-632b44ba58e3" containerName="limitador" Apr 17 08:01:51.701568 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.701363 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="5fea4445-6e62-4d55-849a-632b44ba58e3" containerName="limitador" Apr 17 08:01:51.703452 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.703434 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.716674 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.716649 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc"] Apr 17 08:01:51.824685 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.824646 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a35ffed7-a2c6-4c47-8b54-34a98ed44373-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.824878 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.824693 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a35ffed7-a2c6-4c47-8b54-34a98ed44373-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.824878 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.824726 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a35ffed7-a2c6-4c47-8b54-34a98ed44373-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.824878 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.824759 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmd5t\" (UniqueName: \"kubernetes.io/projected/a35ffed7-a2c6-4c47-8b54-34a98ed44373-kube-api-access-bmd5t\") pod 
\"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.824995 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.824900 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a35ffed7-a2c6-4c47-8b54-34a98ed44373-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.824995 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.824957 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a35ffed7-a2c6-4c47-8b54-34a98ed44373-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.824995 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.824984 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a35ffed7-a2c6-4c47-8b54-34a98ed44373-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.925959 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.925915 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a35ffed7-a2c6-4c47-8b54-34a98ed44373-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.926159 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.926012 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a35ffed7-a2c6-4c47-8b54-34a98ed44373-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.926159 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.926060 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a35ffed7-a2c6-4c47-8b54-34a98ed44373-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.926159 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.926115 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a35ffed7-a2c6-4c47-8b54-34a98ed44373-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.926159 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.926148 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a35ffed7-a2c6-4c47-8b54-34a98ed44373-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.926383 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.926175 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"istio-token\" (UniqueName: \"kubernetes.io/projected/a35ffed7-a2c6-4c47-8b54-34a98ed44373-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.926383 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.926201 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmd5t\" (UniqueName: \"kubernetes.io/projected/a35ffed7-a2c6-4c47-8b54-34a98ed44373-kube-api-access-bmd5t\") pod \"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.927012 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.926988 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a35ffed7-a2c6-4c47-8b54-34a98ed44373-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.929037 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.928956 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a35ffed7-a2c6-4c47-8b54-34a98ed44373-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.929158 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.929045 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a35ffed7-a2c6-4c47-8b54-34a98ed44373-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.929158 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.929108 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a35ffed7-a2c6-4c47-8b54-34a98ed44373-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.929772 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.929240 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a35ffed7-a2c6-4c47-8b54-34a98ed44373-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.935369 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.935344 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmd5t\" (UniqueName: \"kubernetes.io/projected/a35ffed7-a2c6-4c47-8b54-34a98ed44373-kube-api-access-bmd5t\") pod \"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:51.935458 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:51.935443 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a35ffed7-a2c6-4c47-8b54-34a98ed44373-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-hwfxc\" (UID: \"a35ffed7-a2c6-4c47-8b54-34a98ed44373\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:52.013019 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:52.012932 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:52.160197 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:52.160167 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc"] Apr 17 08:01:52.161088 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:01:52.161060 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda35ffed7_a2c6_4c47_8b54_34a98ed44373.slice/crio-257aed1e9bfa5839824c1177fc641d408e5087a8401c4b7d4ebe6a527db8b58a WatchSource:0}: Error finding container 257aed1e9bfa5839824c1177fc641d408e5087a8401c4b7d4ebe6a527db8b58a: Status 404 returned error can't find the container with id 257aed1e9bfa5839824c1177fc641d408e5087a8401c4b7d4ebe6a527db8b58a Apr 17 08:01:52.163768 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:52.163730 2566 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 08:01:52.163854 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:52.163809 2566 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 08:01:52.819564 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:52.819531 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" event={"ID":"a35ffed7-a2c6-4c47-8b54-34a98ed44373","Type":"ContainerStarted","Data":"c9b14644c60fd8d275e58b45ee0f5b6bde01567e336bbb06e4adf4c6dac192a5"} Apr 17 08:01:52.819564 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:52.819565 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" 
event={"ID":"a35ffed7-a2c6-4c47-8b54-34a98ed44373","Type":"ContainerStarted","Data":"257aed1e9bfa5839824c1177fc641d408e5087a8401c4b7d4ebe6a527db8b58a"} Apr 17 08:01:52.819984 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:52.819591 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:52.879891 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:52.879842 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" podStartSLOduration=1.879827096 podStartE2EDuration="1.879827096s" podCreationTimestamp="2026-04-17 08:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:01:52.875980687 +0000 UTC m=+616.663759290" watchObservedRunningTime="2026-04-17 08:01:52.879827096 +0000 UTC m=+616.667605698" Apr 17 08:01:53.825335 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:53.825304 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-hwfxc" Apr 17 08:01:53.901308 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:53.901274 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7"] Apr 17 08:01:53.901542 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:53.901518 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" podUID="e99ba96d-63d9-442e-9663-99434065a88b" containerName="discovery" containerID="cri-o://ec653af0036f3c0dd6be593860a8abbc037c23e4a17cbf77bd33d6dca24c83b2" gracePeriod=30 Apr 17 08:01:54.148084 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.148062 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 08:01:54.247909 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.247880 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e99ba96d-63d9-442e-9663-99434065a88b-istio-kubeconfig\") pod \"e99ba96d-63d9-442e-9663-99434065a88b\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " Apr 17 08:01:54.248098 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.247952 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e99ba96d-63d9-442e-9663-99434065a88b-local-certs\") pod \"e99ba96d-63d9-442e-9663-99434065a88b\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " Apr 17 08:01:54.248098 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.247978 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e99ba96d-63d9-442e-9663-99434065a88b-istio-csr-ca-configmap\") pod \"e99ba96d-63d9-442e-9663-99434065a88b\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " Apr 17 08:01:54.248098 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.248019 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e99ba96d-63d9-442e-9663-99434065a88b-istio-csr-dns-cert\") pod \"e99ba96d-63d9-442e-9663-99434065a88b\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " Apr 17 08:01:54.248274 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.248131 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e99ba96d-63d9-442e-9663-99434065a88b-istio-token\") pod \"e99ba96d-63d9-442e-9663-99434065a88b\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " Apr 17 08:01:54.248274 
ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.248172 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e99ba96d-63d9-442e-9663-99434065a88b-cacerts\") pod \"e99ba96d-63d9-442e-9663-99434065a88b\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " Apr 17 08:01:54.248274 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.248220 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrcz9\" (UniqueName: \"kubernetes.io/projected/e99ba96d-63d9-442e-9663-99434065a88b-kube-api-access-qrcz9\") pod \"e99ba96d-63d9-442e-9663-99434065a88b\" (UID: \"e99ba96d-63d9-442e-9663-99434065a88b\") " Apr 17 08:01:54.248443 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.248426 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e99ba96d-63d9-442e-9663-99434065a88b-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "e99ba96d-63d9-442e-9663-99434065a88b" (UID: "e99ba96d-63d9-442e-9663-99434065a88b"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:01:54.248666 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.248634 2566 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e99ba96d-63d9-442e-9663-99434065a88b-istio-csr-ca-configmap\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:01:54.250459 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.250431 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e99ba96d-63d9-442e-9663-99434065a88b-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "e99ba96d-63d9-442e-9663-99434065a88b" (UID: "e99ba96d-63d9-442e-9663-99434065a88b"). InnerVolumeSpecName "istio-csr-dns-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:01:54.250688 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.250660 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e99ba96d-63d9-442e-9663-99434065a88b-istio-token" (OuterVolumeSpecName: "istio-token") pod "e99ba96d-63d9-442e-9663-99434065a88b" (UID: "e99ba96d-63d9-442e-9663-99434065a88b"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:01:54.250688 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.250676 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e99ba96d-63d9-442e-9663-99434065a88b-local-certs" (OuterVolumeSpecName: "local-certs") pod "e99ba96d-63d9-442e-9663-99434065a88b" (UID: "e99ba96d-63d9-442e-9663-99434065a88b"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:01:54.250688 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.250673 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e99ba96d-63d9-442e-9663-99434065a88b-kube-api-access-qrcz9" (OuterVolumeSpecName: "kube-api-access-qrcz9") pod "e99ba96d-63d9-442e-9663-99434065a88b" (UID: "e99ba96d-63d9-442e-9663-99434065a88b"). InnerVolumeSpecName "kube-api-access-qrcz9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:01:54.250909 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.250793 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e99ba96d-63d9-442e-9663-99434065a88b-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "e99ba96d-63d9-442e-9663-99434065a88b" (UID: "e99ba96d-63d9-442e-9663-99434065a88b"). InnerVolumeSpecName "istio-kubeconfig". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:01:54.250909 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.250858 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e99ba96d-63d9-442e-9663-99434065a88b-cacerts" (OuterVolumeSpecName: "cacerts") pod "e99ba96d-63d9-442e-9663-99434065a88b" (UID: "e99ba96d-63d9-442e-9663-99434065a88b"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:01:54.349155 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.349121 2566 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e99ba96d-63d9-442e-9663-99434065a88b-istio-csr-dns-cert\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:01:54.349155 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.349157 2566 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e99ba96d-63d9-442e-9663-99434065a88b-istio-token\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:01:54.349388 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.349171 2566 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e99ba96d-63d9-442e-9663-99434065a88b-cacerts\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:01:54.349388 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.349185 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qrcz9\" (UniqueName: \"kubernetes.io/projected/e99ba96d-63d9-442e-9663-99434065a88b-kube-api-access-qrcz9\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:01:54.349388 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.349199 2566 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e99ba96d-63d9-442e-9663-99434065a88b-istio-kubeconfig\") on node 
\"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:01:54.349388 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.349212 2566 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e99ba96d-63d9-442e-9663-99434065a88b-local-certs\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:01:54.827391 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.827360 2566 generic.go:358] "Generic (PLEG): container finished" podID="e99ba96d-63d9-442e-9663-99434065a88b" containerID="ec653af0036f3c0dd6be593860a8abbc037c23e4a17cbf77bd33d6dca24c83b2" exitCode=0 Apr 17 08:01:54.827766 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.827419 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" Apr 17 08:01:54.827766 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.827452 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" event={"ID":"e99ba96d-63d9-442e-9663-99434065a88b","Type":"ContainerDied","Data":"ec653af0036f3c0dd6be593860a8abbc037c23e4a17cbf77bd33d6dca24c83b2"} Apr 17 08:01:54.827766 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.827492 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7" event={"ID":"e99ba96d-63d9-442e-9663-99434065a88b","Type":"ContainerDied","Data":"6bb9efd033d79fc54a35468776a7d4d12d7096a2246d9f8679b7777e230d3e4f"} Apr 17 08:01:54.827766 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.827517 2566 scope.go:117] "RemoveContainer" containerID="ec653af0036f3c0dd6be593860a8abbc037c23e4a17cbf77bd33d6dca24c83b2" Apr 17 08:01:54.838189 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.838165 2566 scope.go:117] "RemoveContainer" containerID="ec653af0036f3c0dd6be593860a8abbc037c23e4a17cbf77bd33d6dca24c83b2" Apr 17 08:01:54.838433 ip-10-0-128-217 
kubenswrapper[2566]: E0417 08:01:54.838417 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec653af0036f3c0dd6be593860a8abbc037c23e4a17cbf77bd33d6dca24c83b2\": container with ID starting with ec653af0036f3c0dd6be593860a8abbc037c23e4a17cbf77bd33d6dca24c83b2 not found: ID does not exist" containerID="ec653af0036f3c0dd6be593860a8abbc037c23e4a17cbf77bd33d6dca24c83b2" Apr 17 08:01:54.838501 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.838441 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec653af0036f3c0dd6be593860a8abbc037c23e4a17cbf77bd33d6dca24c83b2"} err="failed to get container status \"ec653af0036f3c0dd6be593860a8abbc037c23e4a17cbf77bd33d6dca24c83b2\": rpc error: code = NotFound desc = could not find container \"ec653af0036f3c0dd6be593860a8abbc037c23e4a17cbf77bd33d6dca24c83b2\": container with ID starting with ec653af0036f3c0dd6be593860a8abbc037c23e4a17cbf77bd33d6dca24c83b2 not found: ID does not exist" Apr 17 08:01:54.863727 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.863685 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7"] Apr 17 08:01:54.894759 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:54.894728 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2wsk7"] Apr 17 08:01:56.812872 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:01:56.812836 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e99ba96d-63d9-442e-9663-99434065a88b" path="/var/lib/kubelet/pods/e99ba96d-63d9-442e-9663-99434065a88b/volumes" Apr 17 08:02:00.360066 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.360034 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-558564fd68-wxgtc"] Apr 17 08:02:00.360461 ip-10-0-128-217 kubenswrapper[2566]: I0417 
08:02:00.360434 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e99ba96d-63d9-442e-9663-99434065a88b" containerName="discovery" Apr 17 08:02:00.360461 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.360458 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="e99ba96d-63d9-442e-9663-99434065a88b" containerName="discovery" Apr 17 08:02:00.360588 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.360575 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="e99ba96d-63d9-442e-9663-99434065a88b" containerName="discovery" Apr 17 08:02:00.363673 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.363654 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-wxgtc" Apr 17 08:02:00.366181 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.366153 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 08:02:00.367366 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.367346 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-4xj5s\"" Apr 17 08:02:00.367492 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.367348 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 08:02:00.367492 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.367350 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 17 08:02:00.371993 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.371975 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-wxgtc"] Apr 17 08:02:00.375189 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.375156 2566 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve/llmisvc-controller-manager-847974b58-t4k6h"] Apr 17 08:02:00.377407 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.377393 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-847974b58-t4k6h" Apr 17 08:02:00.379783 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.379764 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 17 08:02:00.379863 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.379794 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-tz5w9\"" Apr 17 08:02:00.388862 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.388826 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-847974b58-t4k6h"] Apr 17 08:02:00.508394 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.508354 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prq6j\" (UniqueName: \"kubernetes.io/projected/0f96c13c-f21f-4805-8661-ab8ee2be0169-kube-api-access-prq6j\") pod \"llmisvc-controller-manager-847974b58-t4k6h\" (UID: \"0f96c13c-f21f-4805-8661-ab8ee2be0169\") " pod="kserve/llmisvc-controller-manager-847974b58-t4k6h" Apr 17 08:02:00.508563 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.508416 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f96c13c-f21f-4805-8661-ab8ee2be0169-cert\") pod \"llmisvc-controller-manager-847974b58-t4k6h\" (UID: \"0f96c13c-f21f-4805-8661-ab8ee2be0169\") " pod="kserve/llmisvc-controller-manager-847974b58-t4k6h" Apr 17 08:02:00.508563 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.508459 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/250fa35b-9f76-458d-8bbc-63c2c43329ea-cert\") pod \"kserve-controller-manager-558564fd68-wxgtc\" (UID: \"250fa35b-9f76-458d-8bbc-63c2c43329ea\") " pod="kserve/kserve-controller-manager-558564fd68-wxgtc" Apr 17 08:02:00.508563 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.508522 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4w76\" (UniqueName: \"kubernetes.io/projected/250fa35b-9f76-458d-8bbc-63c2c43329ea-kube-api-access-w4w76\") pod \"kserve-controller-manager-558564fd68-wxgtc\" (UID: \"250fa35b-9f76-458d-8bbc-63c2c43329ea\") " pod="kserve/kserve-controller-manager-558564fd68-wxgtc" Apr 17 08:02:00.609309 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.609273 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f96c13c-f21f-4805-8661-ab8ee2be0169-cert\") pod \"llmisvc-controller-manager-847974b58-t4k6h\" (UID: \"0f96c13c-f21f-4805-8661-ab8ee2be0169\") " pod="kserve/llmisvc-controller-manager-847974b58-t4k6h" Apr 17 08:02:00.609309 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.609307 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/250fa35b-9f76-458d-8bbc-63c2c43329ea-cert\") pod \"kserve-controller-manager-558564fd68-wxgtc\" (UID: \"250fa35b-9f76-458d-8bbc-63c2c43329ea\") " pod="kserve/kserve-controller-manager-558564fd68-wxgtc" Apr 17 08:02:00.609539 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.609509 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4w76\" (UniqueName: \"kubernetes.io/projected/250fa35b-9f76-458d-8bbc-63c2c43329ea-kube-api-access-w4w76\") pod \"kserve-controller-manager-558564fd68-wxgtc\" (UID: \"250fa35b-9f76-458d-8bbc-63c2c43329ea\") " pod="kserve/kserve-controller-manager-558564fd68-wxgtc" Apr 17 08:02:00.609685 ip-10-0-128-217 
kubenswrapper[2566]: I0417 08:02:00.609635 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prq6j\" (UniqueName: \"kubernetes.io/projected/0f96c13c-f21f-4805-8661-ab8ee2be0169-kube-api-access-prq6j\") pod \"llmisvc-controller-manager-847974b58-t4k6h\" (UID: \"0f96c13c-f21f-4805-8661-ab8ee2be0169\") " pod="kserve/llmisvc-controller-manager-847974b58-t4k6h" Apr 17 08:02:00.611967 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.611901 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f96c13c-f21f-4805-8661-ab8ee2be0169-cert\") pod \"llmisvc-controller-manager-847974b58-t4k6h\" (UID: \"0f96c13c-f21f-4805-8661-ab8ee2be0169\") " pod="kserve/llmisvc-controller-manager-847974b58-t4k6h" Apr 17 08:02:00.612110 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.612092 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/250fa35b-9f76-458d-8bbc-63c2c43329ea-cert\") pod \"kserve-controller-manager-558564fd68-wxgtc\" (UID: \"250fa35b-9f76-458d-8bbc-63c2c43329ea\") " pod="kserve/kserve-controller-manager-558564fd68-wxgtc" Apr 17 08:02:00.620591 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.620570 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4w76\" (UniqueName: \"kubernetes.io/projected/250fa35b-9f76-458d-8bbc-63c2c43329ea-kube-api-access-w4w76\") pod \"kserve-controller-manager-558564fd68-wxgtc\" (UID: \"250fa35b-9f76-458d-8bbc-63c2c43329ea\") " pod="kserve/kserve-controller-manager-558564fd68-wxgtc" Apr 17 08:02:00.620777 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.620756 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prq6j\" (UniqueName: \"kubernetes.io/projected/0f96c13c-f21f-4805-8661-ab8ee2be0169-kube-api-access-prq6j\") pod \"llmisvc-controller-manager-847974b58-t4k6h\" (UID: 
\"0f96c13c-f21f-4805-8661-ab8ee2be0169\") " pod="kserve/llmisvc-controller-manager-847974b58-t4k6h" Apr 17 08:02:00.675564 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.675527 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-wxgtc" Apr 17 08:02:00.688314 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.688287 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-847974b58-t4k6h" Apr 17 08:02:00.810436 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.810329 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-wxgtc"] Apr 17 08:02:00.832972 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.832950 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-847974b58-t4k6h"] Apr 17 08:02:00.835007 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:02:00.834981 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0f96c13c_f21f_4805_8661_ab8ee2be0169.slice/crio-b11036dabc6dc16743fbf0866da25d8df92b788dfddaa7b62ebb821289a8b933 WatchSource:0}: Error finding container b11036dabc6dc16743fbf0866da25d8df92b788dfddaa7b62ebb821289a8b933: Status 404 returned error can't find the container with id b11036dabc6dc16743fbf0866da25d8df92b788dfddaa7b62ebb821289a8b933 Apr 17 08:02:00.847982 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.847946 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-847974b58-t4k6h" event={"ID":"0f96c13c-f21f-4805-8661-ab8ee2be0169","Type":"ContainerStarted","Data":"b11036dabc6dc16743fbf0866da25d8df92b788dfddaa7b62ebb821289a8b933"} Apr 17 08:02:00.848920 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:00.848899 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-wxgtc" 
event={"ID":"250fa35b-9f76-458d-8bbc-63c2c43329ea","Type":"ContainerStarted","Data":"828ce6e7f09b9d33c7104c319b77f3900f9d42bd60a6e9065b6deff791c1b9b3"} Apr 17 08:02:03.862793 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:03.862675 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-wxgtc" event={"ID":"250fa35b-9f76-458d-8bbc-63c2c43329ea","Type":"ContainerStarted","Data":"27e675344d512b5e819ce1f5fd2cb380a080d61707492add142d8f42d8413fb9"} Apr 17 08:02:03.863238 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:03.862929 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-558564fd68-wxgtc" Apr 17 08:02:03.879079 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:03.879028 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-558564fd68-wxgtc" podStartSLOduration=1.082560691 podStartE2EDuration="3.879014214s" podCreationTimestamp="2026-04-17 08:02:00 +0000 UTC" firstStartedPulling="2026-04-17 08:02:00.811736904 +0000 UTC m=+624.599515485" lastFinishedPulling="2026-04-17 08:02:03.608190418 +0000 UTC m=+627.395969008" observedRunningTime="2026-04-17 08:02:03.877606644 +0000 UTC m=+627.665385249" watchObservedRunningTime="2026-04-17 08:02:03.879014214 +0000 UTC m=+627.666792817" Apr 17 08:02:04.868653 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:04.868615 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-847974b58-t4k6h" event={"ID":"0f96c13c-f21f-4805-8661-ab8ee2be0169","Type":"ContainerStarted","Data":"700e355edb7d09abf78e7853a6309cd88d97ae6bc2dac125a1843e5e0c942de4"} Apr 17 08:02:04.869053 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:04.868933 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-847974b58-t4k6h" Apr 17 08:02:04.885998 ip-10-0-128-217 kubenswrapper[2566]: I0417 
08:02:04.885952 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-847974b58-t4k6h" podStartSLOduration=1.473792611 podStartE2EDuration="4.885937797s" podCreationTimestamp="2026-04-17 08:02:00 +0000 UTC" firstStartedPulling="2026-04-17 08:02:00.836343335 +0000 UTC m=+624.624121917" lastFinishedPulling="2026-04-17 08:02:04.248488517 +0000 UTC m=+628.036267103" observedRunningTime="2026-04-17 08:02:04.884405521 +0000 UTC m=+628.672184124" watchObservedRunningTime="2026-04-17 08:02:04.885937797 +0000 UTC m=+628.673716400" Apr 17 08:02:34.874832 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:34.874797 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-558564fd68-wxgtc" Apr 17 08:02:35.876792 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:35.876755 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-847974b58-t4k6h" Apr 17 08:02:37.134189 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.134151 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-wxgtc"] Apr 17 08:02:37.134579 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.134384 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-558564fd68-wxgtc" podUID="250fa35b-9f76-458d-8bbc-63c2c43329ea" containerName="manager" containerID="cri-o://27e675344d512b5e819ce1f5fd2cb380a080d61707492add142d8f42d8413fb9" gracePeriod=10 Apr 17 08:02:37.156798 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.156772 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-558564fd68-6vcd4"] Apr 17 08:02:37.159289 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.159273 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-6vcd4" Apr 17 08:02:37.170722 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.170682 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-6vcd4"] Apr 17 08:02:37.237405 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.237366 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb92f7ad-35c9-40ee-8463-f11c96eeb371-cert\") pod \"kserve-controller-manager-558564fd68-6vcd4\" (UID: \"eb92f7ad-35c9-40ee-8463-f11c96eeb371\") " pod="kserve/kserve-controller-manager-558564fd68-6vcd4" Apr 17 08:02:37.237405 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.237409 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86q95\" (UniqueName: \"kubernetes.io/projected/eb92f7ad-35c9-40ee-8463-f11c96eeb371-kube-api-access-86q95\") pod \"kserve-controller-manager-558564fd68-6vcd4\" (UID: \"eb92f7ad-35c9-40ee-8463-f11c96eeb371\") " pod="kserve/kserve-controller-manager-558564fd68-6vcd4" Apr 17 08:02:37.338312 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.338276 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86q95\" (UniqueName: \"kubernetes.io/projected/eb92f7ad-35c9-40ee-8463-f11c96eeb371-kube-api-access-86q95\") pod \"kserve-controller-manager-558564fd68-6vcd4\" (UID: \"eb92f7ad-35c9-40ee-8463-f11c96eeb371\") " pod="kserve/kserve-controller-manager-558564fd68-6vcd4" Apr 17 08:02:37.338482 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.338408 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb92f7ad-35c9-40ee-8463-f11c96eeb371-cert\") pod \"kserve-controller-manager-558564fd68-6vcd4\" (UID: \"eb92f7ad-35c9-40ee-8463-f11c96eeb371\") " 
pod="kserve/kserve-controller-manager-558564fd68-6vcd4" Apr 17 08:02:37.341156 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.341133 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb92f7ad-35c9-40ee-8463-f11c96eeb371-cert\") pod \"kserve-controller-manager-558564fd68-6vcd4\" (UID: \"eb92f7ad-35c9-40ee-8463-f11c96eeb371\") " pod="kserve/kserve-controller-manager-558564fd68-6vcd4" Apr 17 08:02:37.346499 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.346467 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86q95\" (UniqueName: \"kubernetes.io/projected/eb92f7ad-35c9-40ee-8463-f11c96eeb371-kube-api-access-86q95\") pod \"kserve-controller-manager-558564fd68-6vcd4\" (UID: \"eb92f7ad-35c9-40ee-8463-f11c96eeb371\") " pod="kserve/kserve-controller-manager-558564fd68-6vcd4" Apr 17 08:02:37.375071 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.375045 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-wxgtc" Apr 17 08:02:37.513924 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.513816 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-6vcd4" Apr 17 08:02:37.540913 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.540873 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/250fa35b-9f76-458d-8bbc-63c2c43329ea-cert\") pod \"250fa35b-9f76-458d-8bbc-63c2c43329ea\" (UID: \"250fa35b-9f76-458d-8bbc-63c2c43329ea\") " Apr 17 08:02:37.541078 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.541003 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4w76\" (UniqueName: \"kubernetes.io/projected/250fa35b-9f76-458d-8bbc-63c2c43329ea-kube-api-access-w4w76\") pod \"250fa35b-9f76-458d-8bbc-63c2c43329ea\" (UID: \"250fa35b-9f76-458d-8bbc-63c2c43329ea\") " Apr 17 08:02:37.543143 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.543112 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/250fa35b-9f76-458d-8bbc-63c2c43329ea-cert" (OuterVolumeSpecName: "cert") pod "250fa35b-9f76-458d-8bbc-63c2c43329ea" (UID: "250fa35b-9f76-458d-8bbc-63c2c43329ea"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:02:37.543143 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.543135 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/250fa35b-9f76-458d-8bbc-63c2c43329ea-kube-api-access-w4w76" (OuterVolumeSpecName: "kube-api-access-w4w76") pod "250fa35b-9f76-458d-8bbc-63c2c43329ea" (UID: "250fa35b-9f76-458d-8bbc-63c2c43329ea"). InnerVolumeSpecName "kube-api-access-w4w76". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:02:37.641831 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.641622 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-6vcd4"] Apr 17 08:02:37.642070 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.642048 2566 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/250fa35b-9f76-458d-8bbc-63c2c43329ea-cert\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:02:37.642112 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.642079 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w4w76\" (UniqueName: \"kubernetes.io/projected/250fa35b-9f76-458d-8bbc-63c2c43329ea-kube-api-access-w4w76\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:02:37.644568 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:02:37.644547 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb92f7ad_35c9_40ee_8463_f11c96eeb371.slice/crio-f942daf592b1c2f5bb45c2578165accd7df3aceeb73fd19786330f9a291bf593 WatchSource:0}: Error finding container f942daf592b1c2f5bb45c2578165accd7df3aceeb73fd19786330f9a291bf593: Status 404 returned error can't find the container with id f942daf592b1c2f5bb45c2578165accd7df3aceeb73fd19786330f9a291bf593 Apr 17 08:02:37.997257 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.997223 2566 generic.go:358] "Generic (PLEG): container finished" podID="250fa35b-9f76-458d-8bbc-63c2c43329ea" containerID="27e675344d512b5e819ce1f5fd2cb380a080d61707492add142d8f42d8413fb9" exitCode=0 Apr 17 08:02:37.997430 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.997283 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-wxgtc" Apr 17 08:02:37.997430 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.997303 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-wxgtc" event={"ID":"250fa35b-9f76-458d-8bbc-63c2c43329ea","Type":"ContainerDied","Data":"27e675344d512b5e819ce1f5fd2cb380a080d61707492add142d8f42d8413fb9"} Apr 17 08:02:37.997430 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.997352 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-wxgtc" event={"ID":"250fa35b-9f76-458d-8bbc-63c2c43329ea","Type":"ContainerDied","Data":"828ce6e7f09b9d33c7104c319b77f3900f9d42bd60a6e9065b6deff791c1b9b3"} Apr 17 08:02:37.997430 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.997375 2566 scope.go:117] "RemoveContainer" containerID="27e675344d512b5e819ce1f5fd2cb380a080d61707492add142d8f42d8413fb9" Apr 17 08:02:37.998816 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.998788 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-6vcd4" event={"ID":"eb92f7ad-35c9-40ee-8463-f11c96eeb371","Type":"ContainerStarted","Data":"67be1ce016e9f3d911ac61cfca081f44668edc718a7b40d33f947f4cc7c1fb7c"} Apr 17 08:02:37.998955 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.998823 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-6vcd4" event={"ID":"eb92f7ad-35c9-40ee-8463-f11c96eeb371","Type":"ContainerStarted","Data":"f942daf592b1c2f5bb45c2578165accd7df3aceeb73fd19786330f9a291bf593"} Apr 17 08:02:37.998955 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:37.998901 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-558564fd68-6vcd4" Apr 17 08:02:38.009007 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:38.008981 2566 scope.go:117] "RemoveContainer" 
containerID="27e675344d512b5e819ce1f5fd2cb380a080d61707492add142d8f42d8413fb9" Apr 17 08:02:38.009352 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:02:38.009332 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27e675344d512b5e819ce1f5fd2cb380a080d61707492add142d8f42d8413fb9\": container with ID starting with 27e675344d512b5e819ce1f5fd2cb380a080d61707492add142d8f42d8413fb9 not found: ID does not exist" containerID="27e675344d512b5e819ce1f5fd2cb380a080d61707492add142d8f42d8413fb9" Apr 17 08:02:38.009418 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:38.009360 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e675344d512b5e819ce1f5fd2cb380a080d61707492add142d8f42d8413fb9"} err="failed to get container status \"27e675344d512b5e819ce1f5fd2cb380a080d61707492add142d8f42d8413fb9\": rpc error: code = NotFound desc = could not find container \"27e675344d512b5e819ce1f5fd2cb380a080d61707492add142d8f42d8413fb9\": container with ID starting with 27e675344d512b5e819ce1f5fd2cb380a080d61707492add142d8f42d8413fb9 not found: ID does not exist" Apr 17 08:02:38.026140 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:38.026063 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-558564fd68-6vcd4" podStartSLOduration=0.731288536 podStartE2EDuration="1.026050415s" podCreationTimestamp="2026-04-17 08:02:37 +0000 UTC" firstStartedPulling="2026-04-17 08:02:37.645952565 +0000 UTC m=+661.433731145" lastFinishedPulling="2026-04-17 08:02:37.940714438 +0000 UTC m=+661.728493024" observedRunningTime="2026-04-17 08:02:38.025307628 +0000 UTC m=+661.813086233" watchObservedRunningTime="2026-04-17 08:02:38.026050415 +0000 UTC m=+661.813829063" Apr 17 08:02:38.042692 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:38.042661 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve/kserve-controller-manager-558564fd68-wxgtc"] Apr 17 08:02:38.047088 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:38.047062 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-wxgtc"] Apr 17 08:02:38.808603 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:02:38.808569 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="250fa35b-9f76-458d-8bbc-63c2c43329ea" path="/var/lib/kubelet/pods/250fa35b-9f76-458d-8bbc-63c2c43329ea/volumes" Apr 17 08:03:09.007757 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:09.007724 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-558564fd68-6vcd4" Apr 17 08:03:46.983027 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:46.982992 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k"] Apr 17 08:03:46.983650 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:46.983576 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="250fa35b-9f76-458d-8bbc-63c2c43329ea" containerName="manager" Apr 17 08:03:46.983650 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:46.983596 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="250fa35b-9f76-458d-8bbc-63c2c43329ea" containerName="manager" Apr 17 08:03:46.983849 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:46.983731 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="250fa35b-9f76-458d-8bbc-63c2c43329ea" containerName="manager" Apr 17 08:03:46.986232 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:46.986198 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:46.990044 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:46.988906 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 17 08:03:46.990044 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:46.988958 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-82jx8\"" Apr 17 08:03:46.990044 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:46.989158 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 08:03:46.990394 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:46.990372 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 08:03:46.996966 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:46.996923 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k"] Apr 17 08:03:47.056836 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.056799 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ae3a165d-26cd-42e0-801f-5520ea324e30-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.057003 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.056842 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/ae3a165d-26cd-42e0-801f-5520ea324e30-istiod-ca-cert\") pod 
\"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.057003 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.056867 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ae3a165d-26cd-42e0-801f-5520ea324e30-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.057003 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.056916 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ae3a165d-26cd-42e0-801f-5520ea324e30-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.057003 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.056938 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/ae3a165d-26cd-42e0-801f-5520ea324e30-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.057003 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.056978 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/ae3a165d-26cd-42e0-801f-5520ea324e30-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: 
\"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.057185 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.057007 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs64v\" (UniqueName: \"kubernetes.io/projected/ae3a165d-26cd-42e0-801f-5520ea324e30-kube-api-access-hs64v\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.057185 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.057077 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/ae3a165d-26cd-42e0-801f-5520ea324e30-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.057185 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.057096 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/ae3a165d-26cd-42e0-801f-5520ea324e30-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.158174 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.158126 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/ae3a165d-26cd-42e0-801f-5520ea324e30-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.158360 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.158187 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs64v\" (UniqueName: \"kubernetes.io/projected/ae3a165d-26cd-42e0-801f-5520ea324e30-kube-api-access-hs64v\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.158360 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.158266 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/ae3a165d-26cd-42e0-801f-5520ea324e30-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.158360 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.158297 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/ae3a165d-26cd-42e0-801f-5520ea324e30-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.158533 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.158395 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ae3a165d-26cd-42e0-801f-5520ea324e30-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.158533 ip-10-0-128-217 
kubenswrapper[2566]: I0417 08:03:47.158439 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/ae3a165d-26cd-42e0-801f-5520ea324e30-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.158533 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.158470 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ae3a165d-26cd-42e0-801f-5520ea324e30-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.159020 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.158533 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ae3a165d-26cd-42e0-801f-5520ea324e30-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.159020 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.158573 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/ae3a165d-26cd-42e0-801f-5520ea324e30-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.159020 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.158738 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/ae3a165d-26cd-42e0-801f-5520ea324e30-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.159020 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.158788 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/ae3a165d-26cd-42e0-801f-5520ea324e30-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.159020 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.158969 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/ae3a165d-26cd-42e0-801f-5520ea324e30-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.159020 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.158999 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ae3a165d-26cd-42e0-801f-5520ea324e30-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.159766 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.159739 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/ae3a165d-26cd-42e0-801f-5520ea324e30-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: 
\"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.160640 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.160613 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/ae3a165d-26cd-42e0-801f-5520ea324e30-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.161368 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.161340 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ae3a165d-26cd-42e0-801f-5520ea324e30-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.167769 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.167671 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ae3a165d-26cd-42e0-801f-5520ea324e30-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.167854 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.167766 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs64v\" (UniqueName: \"kubernetes.io/projected/ae3a165d-26cd-42e0-801f-5520ea324e30-kube-api-access-hs64v\") pod \"router-gateway-1-openshift-default-6c59fbf55c-j7t8k\" (UID: \"ae3a165d-26cd-42e0-801f-5520ea324e30\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.300891 ip-10-0-128-217 
kubenswrapper[2566]: I0417 08:03:47.300853 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:47.449011 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.448969 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k"] Apr 17 08:03:47.452411 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:03:47.452378 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae3a165d_26cd_42e0_801f_5520ea324e30.slice/crio-95f929df658273ef6044555eb96780db993e089b8ec03ccbe4a8160057284fff WatchSource:0}: Error finding container 95f929df658273ef6044555eb96780db993e089b8ec03ccbe4a8160057284fff: Status 404 returned error can't find the container with id 95f929df658273ef6044555eb96780db993e089b8ec03ccbe4a8160057284fff Apr 17 08:03:47.454213 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.454195 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:03:47.454532 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.454497 2566 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 08:03:47.454614 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.454563 2566 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 08:03:47.454614 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:47.454596 2566 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 08:03:48.242937 ip-10-0-128-217 
kubenswrapper[2566]: I0417 08:03:48.242899 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" event={"ID":"ae3a165d-26cd-42e0-801f-5520ea324e30","Type":"ContainerStarted","Data":"0a677e99d3eb50fdac9a6d0c4b1c1fa722017f70cbd440c70de084d370b9a182"} Apr 17 08:03:48.242937 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:48.242938 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" event={"ID":"ae3a165d-26cd-42e0-801f-5520ea324e30","Type":"ContainerStarted","Data":"95f929df658273ef6044555eb96780db993e089b8ec03ccbe4a8160057284fff"} Apr 17 08:03:48.265171 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:48.265108 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" podStartSLOduration=2.265089272 podStartE2EDuration="2.265089272s" podCreationTimestamp="2026-04-17 08:03:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:03:48.263518791 +0000 UTC m=+732.051297395" watchObservedRunningTime="2026-04-17 08:03:48.265089272 +0000 UTC m=+732.052867876" Apr 17 08:03:48.301185 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:48.301142 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:48.306159 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:48.306130 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:49.246544 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:49.246510 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:03:49.247765 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:03:49.247743 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-j7t8k" Apr 17 08:04:15.382474 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.382430 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp"] Apr 17 08:04:15.386049 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.386021 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:15.388724 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.388681 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-qkjjf\"" Apr 17 08:04:15.388919 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.388723 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-jj9s6\"" Apr 17 08:04:15.389858 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.389833 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 17 08:04:15.398568 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.398542 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp"] Apr 17 08:04:15.523089 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.523038 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-tokenizer-uds\") pod 
\"scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:15.523276 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.523110 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc9xc\" (UniqueName: \"kubernetes.io/projected/4dd7b636-c091-48e3-93bf-d9e659406211-kube-api-access-mc9xc\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:15.523276 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.523182 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:15.523276 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.523212 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:15.523276 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.523248 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:15.523446 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.523343 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4dd7b636-c091-48e3-93bf-d9e659406211-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:15.624296 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.624241 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:15.624493 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.624309 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mc9xc\" (UniqueName: \"kubernetes.io/projected/4dd7b636-c091-48e3-93bf-d9e659406211-kube-api-access-mc9xc\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:15.624493 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.624353 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:15.624493 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.624376 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:15.624493 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.624410 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:15.624746 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.624500 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4dd7b636-c091-48e3-93bf-d9e659406211-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:15.624746 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.624658 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:15.624928 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.624766 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:15.624928 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.624864 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:15.624928 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.624868 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:15.627069 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.627043 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4dd7b636-c091-48e3-93bf-d9e659406211-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:15.632240 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.632216 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc9xc\" (UniqueName: \"kubernetes.io/projected/4dd7b636-c091-48e3-93bf-d9e659406211-kube-api-access-mc9xc\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:15.698538 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.698434 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:15.842626 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:15.842588 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp"] Apr 17 08:04:15.846562 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:04:15.846535 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dd7b636_c091_48e3_93bf_d9e659406211.slice/crio-67aea26b9cb740ac33ebe1faa73f7b556a2afd5fa91d5e22b6840b4f5f03ea0e WatchSource:0}: Error finding container 67aea26b9cb740ac33ebe1faa73f7b556a2afd5fa91d5e22b6840b4f5f03ea0e: Status 404 returned error can't find the container with id 67aea26b9cb740ac33ebe1faa73f7b556a2afd5fa91d5e22b6840b4f5f03ea0e Apr 17 08:04:16.361071 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:16.361032 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" event={"ID":"4dd7b636-c091-48e3-93bf-d9e659406211","Type":"ContainerStarted","Data":"67aea26b9cb740ac33ebe1faa73f7b556a2afd5fa91d5e22b6840b4f5f03ea0e"} Apr 17 08:04:19.377206 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:19.377158 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" event={"ID":"4dd7b636-c091-48e3-93bf-d9e659406211","Type":"ContainerStarted","Data":"4f1f9629febd9db930fdc97dd7926d1442668bcd72b673924e4e518e0d110718"} Apr 17 08:04:20.382836 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:20.382798 2566 generic.go:358] "Generic (PLEG): container finished" podID="4dd7b636-c091-48e3-93bf-d9e659406211" containerID="4f1f9629febd9db930fdc97dd7926d1442668bcd72b673924e4e518e0d110718" exitCode=0 Apr 17 08:04:20.383272 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:20.382893 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" event={"ID":"4dd7b636-c091-48e3-93bf-d9e659406211","Type":"ContainerDied","Data":"4f1f9629febd9db930fdc97dd7926d1442668bcd72b673924e4e518e0d110718"} Apr 17 08:04:22.394736 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:22.394666 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" event={"ID":"4dd7b636-c091-48e3-93bf-d9e659406211","Type":"ContainerStarted","Data":"2ef0d31b52b5454c6b0d9cf33a30faf55a969ccb52523fefeb91a37427c44db1"} Apr 17 08:04:53.524165 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:53.524124 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" 
event={"ID":"4dd7b636-c091-48e3-93bf-d9e659406211","Type":"ContainerStarted","Data":"9ca3a2a4b2117a3ad77e1b793e7be0fdd1fde039f5a21f81a5f2df7d70a794af"} Apr 17 08:04:53.524628 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:53.524444 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:53.527008 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:53.526988 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:53.546125 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:53.546068 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" podStartSLOduration=1.572053059 podStartE2EDuration="38.546053116s" podCreationTimestamp="2026-04-17 08:04:15 +0000 UTC" firstStartedPulling="2026-04-17 08:04:15.848977127 +0000 UTC m=+759.636755708" lastFinishedPulling="2026-04-17 08:04:52.822977181 +0000 UTC m=+796.610755765" observedRunningTime="2026-04-17 08:04:53.544955819 +0000 UTC m=+797.332734423" watchObservedRunningTime="2026-04-17 08:04:53.546053116 +0000 UTC m=+797.333831720" Apr 17 08:04:55.699494 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:55.699457 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:04:55.699494 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:04:55.699501 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:05:05.701050 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:05.701018 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:05:05.702249 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:05.702227 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:05:27.465857 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:27.465817 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp"] Apr 17 08:05:27.466451 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:27.466256 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" podUID="4dd7b636-c091-48e3-93bf-d9e659406211" containerName="main" containerID="cri-o://2ef0d31b52b5454c6b0d9cf33a30faf55a969ccb52523fefeb91a37427c44db1" gracePeriod=30 Apr 17 08:05:27.466517 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:27.466458 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" podUID="4dd7b636-c091-48e3-93bf-d9e659406211" containerName="tokenizer" containerID="cri-o://9ca3a2a4b2117a3ad77e1b793e7be0fdd1fde039f5a21f81a5f2df7d70a794af" gracePeriod=30 Apr 17 08:05:27.653661 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:27.653629 2566 generic.go:358] "Generic (PLEG): container finished" podID="4dd7b636-c091-48e3-93bf-d9e659406211" containerID="2ef0d31b52b5454c6b0d9cf33a30faf55a969ccb52523fefeb91a37427c44db1" exitCode=0 Apr 17 08:05:27.653865 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:27.653726 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" 
event={"ID":"4dd7b636-c091-48e3-93bf-d9e659406211","Type":"ContainerDied","Data":"2ef0d31b52b5454c6b0d9cf33a30faf55a969ccb52523fefeb91a37427c44db1"} Apr 17 08:05:32.608386 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.608360 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:05:32.673383 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.673303 2566 generic.go:358] "Generic (PLEG): container finished" podID="4dd7b636-c091-48e3-93bf-d9e659406211" containerID="9ca3a2a4b2117a3ad77e1b793e7be0fdd1fde039f5a21f81a5f2df7d70a794af" exitCode=0 Apr 17 08:05:32.673383 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.673371 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" event={"ID":"4dd7b636-c091-48e3-93bf-d9e659406211","Type":"ContainerDied","Data":"9ca3a2a4b2117a3ad77e1b793e7be0fdd1fde039f5a21f81a5f2df7d70a794af"} Apr 17 08:05:32.673561 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.673400 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" Apr 17 08:05:32.673561 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.673405 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp" event={"ID":"4dd7b636-c091-48e3-93bf-d9e659406211","Type":"ContainerDied","Data":"67aea26b9cb740ac33ebe1faa73f7b556a2afd5fa91d5e22b6840b4f5f03ea0e"} Apr 17 08:05:32.673561 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.673425 2566 scope.go:117] "RemoveContainer" containerID="9ca3a2a4b2117a3ad77e1b793e7be0fdd1fde039f5a21f81a5f2df7d70a794af" Apr 17 08:05:32.681955 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.681936 2566 scope.go:117] "RemoveContainer" containerID="2ef0d31b52b5454c6b0d9cf33a30faf55a969ccb52523fefeb91a37427c44db1" Apr 17 08:05:32.689403 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.689377 2566 scope.go:117] "RemoveContainer" containerID="4f1f9629febd9db930fdc97dd7926d1442668bcd72b673924e4e518e0d110718" Apr 17 08:05:32.696614 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.696593 2566 scope.go:117] "RemoveContainer" containerID="9ca3a2a4b2117a3ad77e1b793e7be0fdd1fde039f5a21f81a5f2df7d70a794af" Apr 17 08:05:32.696891 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:05:32.696871 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ca3a2a4b2117a3ad77e1b793e7be0fdd1fde039f5a21f81a5f2df7d70a794af\": container with ID starting with 9ca3a2a4b2117a3ad77e1b793e7be0fdd1fde039f5a21f81a5f2df7d70a794af not found: ID does not exist" containerID="9ca3a2a4b2117a3ad77e1b793e7be0fdd1fde039f5a21f81a5f2df7d70a794af" Apr 17 08:05:32.696943 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.696904 2566 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9ca3a2a4b2117a3ad77e1b793e7be0fdd1fde039f5a21f81a5f2df7d70a794af"} err="failed to get container status \"9ca3a2a4b2117a3ad77e1b793e7be0fdd1fde039f5a21f81a5f2df7d70a794af\": rpc error: code = NotFound desc = could not find container \"9ca3a2a4b2117a3ad77e1b793e7be0fdd1fde039f5a21f81a5f2df7d70a794af\": container with ID starting with 9ca3a2a4b2117a3ad77e1b793e7be0fdd1fde039f5a21f81a5f2df7d70a794af not found: ID does not exist" Apr 17 08:05:32.696943 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.696925 2566 scope.go:117] "RemoveContainer" containerID="2ef0d31b52b5454c6b0d9cf33a30faf55a969ccb52523fefeb91a37427c44db1" Apr 17 08:05:32.697166 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:05:32.697147 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ef0d31b52b5454c6b0d9cf33a30faf55a969ccb52523fefeb91a37427c44db1\": container with ID starting with 2ef0d31b52b5454c6b0d9cf33a30faf55a969ccb52523fefeb91a37427c44db1 not found: ID does not exist" containerID="2ef0d31b52b5454c6b0d9cf33a30faf55a969ccb52523fefeb91a37427c44db1" Apr 17 08:05:32.697230 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.697176 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ef0d31b52b5454c6b0d9cf33a30faf55a969ccb52523fefeb91a37427c44db1"} err="failed to get container status \"2ef0d31b52b5454c6b0d9cf33a30faf55a969ccb52523fefeb91a37427c44db1\": rpc error: code = NotFound desc = could not find container \"2ef0d31b52b5454c6b0d9cf33a30faf55a969ccb52523fefeb91a37427c44db1\": container with ID starting with 2ef0d31b52b5454c6b0d9cf33a30faf55a969ccb52523fefeb91a37427c44db1 not found: ID does not exist" Apr 17 08:05:32.697230 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.697200 2566 scope.go:117] "RemoveContainer" containerID="4f1f9629febd9db930fdc97dd7926d1442668bcd72b673924e4e518e0d110718" Apr 17 08:05:32.697420 ip-10-0-128-217 
kubenswrapper[2566]: E0417 08:05:32.697405 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f1f9629febd9db930fdc97dd7926d1442668bcd72b673924e4e518e0d110718\": container with ID starting with 4f1f9629febd9db930fdc97dd7926d1442668bcd72b673924e4e518e0d110718 not found: ID does not exist" containerID="4f1f9629febd9db930fdc97dd7926d1442668bcd72b673924e4e518e0d110718" Apr 17 08:05:32.697458 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.697424 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f1f9629febd9db930fdc97dd7926d1442668bcd72b673924e4e518e0d110718"} err="failed to get container status \"4f1f9629febd9db930fdc97dd7926d1442668bcd72b673924e4e518e0d110718\": rpc error: code = NotFound desc = could not find container \"4f1f9629febd9db930fdc97dd7926d1442668bcd72b673924e4e518e0d110718\": container with ID starting with 4f1f9629febd9db930fdc97dd7926d1442668bcd72b673924e4e518e0d110718 not found: ID does not exist" Apr 17 08:05:32.703784 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.703764 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4dd7b636-c091-48e3-93bf-d9e659406211-tls-certs\") pod \"4dd7b636-c091-48e3-93bf-d9e659406211\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " Apr 17 08:05:32.703865 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.703851 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-tokenizer-uds\") pod \"4dd7b636-c091-48e3-93bf-d9e659406211\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " Apr 17 08:05:32.703922 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.703885 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc9xc\" (UniqueName: 
\"kubernetes.io/projected/4dd7b636-c091-48e3-93bf-d9e659406211-kube-api-access-mc9xc\") pod \"4dd7b636-c091-48e3-93bf-d9e659406211\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " Apr 17 08:05:32.703974 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.703931 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-kserve-provision-location\") pod \"4dd7b636-c091-48e3-93bf-d9e659406211\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " Apr 17 08:05:32.703974 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.703958 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-tokenizer-tmp\") pod \"4dd7b636-c091-48e3-93bf-d9e659406211\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " Apr 17 08:05:32.704138 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.704114 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-tokenizer-cache\") pod \"4dd7b636-c091-48e3-93bf-d9e659406211\" (UID: \"4dd7b636-c091-48e3-93bf-d9e659406211\") " Apr 17 08:05:32.704225 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.704143 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "4dd7b636-c091-48e3-93bf-d9e659406211" (UID: "4dd7b636-c091-48e3-93bf-d9e659406211"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:05:32.704308 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.704280 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "4dd7b636-c091-48e3-93bf-d9e659406211" (UID: "4dd7b636-c091-48e3-93bf-d9e659406211"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:05:32.704402 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.704357 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "4dd7b636-c091-48e3-93bf-d9e659406211" (UID: "4dd7b636-c091-48e3-93bf-d9e659406211"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:05:32.704547 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.704527 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-tokenizer-tmp\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:05:32.704591 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.704553 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-tokenizer-cache\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:05:32.704591 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.704568 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-tokenizer-uds\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:05:32.704717 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.704684 2566 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4dd7b636-c091-48e3-93bf-d9e659406211" (UID: "4dd7b636-c091-48e3-93bf-d9e659406211"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:05:32.706019 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.705996 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dd7b636-c091-48e3-93bf-d9e659406211-kube-api-access-mc9xc" (OuterVolumeSpecName: "kube-api-access-mc9xc") pod "4dd7b636-c091-48e3-93bf-d9e659406211" (UID: "4dd7b636-c091-48e3-93bf-d9e659406211"). InnerVolumeSpecName "kube-api-access-mc9xc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:05:32.706125 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.706106 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd7b636-c091-48e3-93bf-d9e659406211-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4dd7b636-c091-48e3-93bf-d9e659406211" (UID: "4dd7b636-c091-48e3-93bf-d9e659406211"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:05:32.805451 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.805418 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4dd7b636-c091-48e3-93bf-d9e659406211-tls-certs\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:05:32.805451 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.805446 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mc9xc\" (UniqueName: \"kubernetes.io/projected/4dd7b636-c091-48e3-93bf-d9e659406211-kube-api-access-mc9xc\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:05:32.805652 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.805470 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4dd7b636-c091-48e3-93bf-d9e659406211-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:05:32.990387 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.990313 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp"] Apr 17 08:05:32.993760 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:32.993733 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-59cd4tx7dp"] Apr 17 08:05:34.808451 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:34.808416 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dd7b636-c091-48e3-93bf-d9e659406211" path="/var/lib/kubelet/pods/4dd7b636-c091-48e3-93bf-d9e659406211/volumes" Apr 17 08:05:35.696387 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.696351 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4"] Apr 17 08:05:35.696747 ip-10-0-128-217 
kubenswrapper[2566]: I0417 08:05:35.696735 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4dd7b636-c091-48e3-93bf-d9e659406211" containerName="storage-initializer" Apr 17 08:05:35.696807 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.696749 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd7b636-c091-48e3-93bf-d9e659406211" containerName="storage-initializer" Apr 17 08:05:35.696807 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.696762 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4dd7b636-c091-48e3-93bf-d9e659406211" containerName="main" Apr 17 08:05:35.696807 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.696768 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd7b636-c091-48e3-93bf-d9e659406211" containerName="main" Apr 17 08:05:35.696807 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.696788 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4dd7b636-c091-48e3-93bf-d9e659406211" containerName="tokenizer" Apr 17 08:05:35.696807 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.696794 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd7b636-c091-48e3-93bf-d9e659406211" containerName="tokenizer" Apr 17 08:05:35.696957 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.696852 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="4dd7b636-c091-48e3-93bf-d9e659406211" containerName="main" Apr 17 08:05:35.696957 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.696866 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="4dd7b636-c091-48e3-93bf-d9e659406211" containerName="tokenizer" Apr 17 08:05:35.700278 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.700260 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:35.703998 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.703845 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 17 08:05:35.704142 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.704105 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-jj9s6\"" Apr 17 08:05:35.710668 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.710641 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4"] Apr 17 08:05:35.830224 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.830194 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:35.830660 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.830262 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:35.830660 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.830290 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlg8d\" (UniqueName: 
\"kubernetes.io/projected/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-kube-api-access-mlg8d\") pod \"scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:35.830660 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.830315 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-home\") pod \"scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:35.830660 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.830444 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-dshm\") pod \"scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:35.830660 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.830513 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-model-cache\") pod \"scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:35.931804 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.931770 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-dshm\") pod \"scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4\" (UID: 
\"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:35.931980 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.931817 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-model-cache\") pod \"scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:35.931980 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.931856 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:35.931980 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.931924 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:35.931980 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.931954 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlg8d\" (UniqueName: \"kubernetes.io/projected/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-kube-api-access-mlg8d\") pod \"scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 
08:05:35.932153 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.931996 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-home\") pod \"scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:35.932310 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.932274 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-model-cache\") pod \"scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:35.932388 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.932323 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:35.932388 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.932334 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-home\") pod \"scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:35.934230 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.934201 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-dshm\") pod \"scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:35.934357 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.934327 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:35.940488 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:35.940463 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlg8d\" (UniqueName: \"kubernetes.io/projected/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-kube-api-access-mlg8d\") pod \"scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:36.013310 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.013219 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:36.014797 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.014762 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n"] Apr 17 08:05:36.018467 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.018446 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:36.021557 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.021360 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-ddw9n\"" Apr 17 08:05:36.030889 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.030866 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n"] Apr 17 08:05:36.134173 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.134120 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:36.134173 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.134173 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:36.134410 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.134226 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n\" (UID: 
\"5e7adea4-3886-49a0-a706-af1339aea236\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:36.134410 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.134273 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e7adea4-3886-49a0-a706-af1339aea236-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:36.134410 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.134332 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:36.134410 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.134392 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94vwf\" (UniqueName: \"kubernetes.io/projected/5e7adea4-3886-49a0-a706-af1339aea236-kube-api-access-94vwf\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:36.159792 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.159712 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4"] Apr 17 08:05:36.162110 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:05:36.162085 2566 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ade2c3a_4e30_4cf0_a773_8106a6bf2ea8.slice/crio-2f2ae1eb03e8ee7943bdab977fb581d112209e995a1d63e4918a30b4390bad3f WatchSource:0}: Error finding container 2f2ae1eb03e8ee7943bdab977fb581d112209e995a1d63e4918a30b4390bad3f: Status 404 returned error can't find the container with id 2f2ae1eb03e8ee7943bdab977fb581d112209e995a1d63e4918a30b4390bad3f Apr 17 08:05:36.235313 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.235271 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94vwf\" (UniqueName: \"kubernetes.io/projected/5e7adea4-3886-49a0-a706-af1339aea236-kube-api-access-94vwf\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:36.235471 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.235335 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:36.235471 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.235364 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:36.235471 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.235418 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:36.235471 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.235470 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e7adea4-3886-49a0-a706-af1339aea236-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:36.235673 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.235540 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:36.235886 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.235794 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:36.235989 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.235919 2566 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:36.235989 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.235975 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:36.236098 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.236007 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:36.238101 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.238082 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e7adea4-3886-49a0-a706-af1339aea236-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:36.243882 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.243854 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94vwf\" (UniqueName: 
\"kubernetes.io/projected/5e7adea4-3886-49a0-a706-af1339aea236-kube-api-access-94vwf\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:36.350947 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.350897 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:36.499313 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.499284 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n"] Apr 17 08:05:36.500814 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:05:36.500786 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e7adea4_3886_49a0_a706_af1339aea236.slice/crio-d4d7a06ab53a7c0fefd660c3204bafdf0d0fe8a91c1ede03975e26e669ad4548 WatchSource:0}: Error finding container d4d7a06ab53a7c0fefd660c3204bafdf0d0fe8a91c1ede03975e26e669ad4548: Status 404 returned error can't find the container with id d4d7a06ab53a7c0fefd660c3204bafdf0d0fe8a91c1ede03975e26e669ad4548 Apr 17 08:05:36.690256 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.690155 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" event={"ID":"5e7adea4-3886-49a0-a706-af1339aea236","Type":"ContainerStarted","Data":"4645521fb198d1739b53ce59f7eacc0234235876482b277a912715d1ac2c5d71"} Apr 17 08:05:36.690256 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.690209 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" 
event={"ID":"5e7adea4-3886-49a0-a706-af1339aea236","Type":"ContainerStarted","Data":"d4d7a06ab53a7c0fefd660c3204bafdf0d0fe8a91c1ede03975e26e669ad4548"} Apr 17 08:05:36.691670 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.691641 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" event={"ID":"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8","Type":"ContainerStarted","Data":"136d47913025c34628246b3492d8d2aaeba5a084b7f8b3523c16387d2d872aa2"} Apr 17 08:05:36.691814 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:36.691674 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" event={"ID":"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8","Type":"ContainerStarted","Data":"2f2ae1eb03e8ee7943bdab977fb581d112209e995a1d63e4918a30b4390bad3f"} Apr 17 08:05:37.696887 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:37.696840 2566 generic.go:358] "Generic (PLEG): container finished" podID="5e7adea4-3886-49a0-a706-af1339aea236" containerID="4645521fb198d1739b53ce59f7eacc0234235876482b277a912715d1ac2c5d71" exitCode=0 Apr 17 08:05:37.697272 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:37.696927 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" event={"ID":"5e7adea4-3886-49a0-a706-af1339aea236","Type":"ContainerDied","Data":"4645521fb198d1739b53ce59f7eacc0234235876482b277a912715d1ac2c5d71"} Apr 17 08:05:38.704479 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:38.704439 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" event={"ID":"5e7adea4-3886-49a0-a706-af1339aea236","Type":"ContainerStarted","Data":"bf947d9cef1446ab4342814718721bee4c29359dec44cc1df5f4aa097b308c59"} Apr 17 08:05:38.704891 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:38.704487 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" event={"ID":"5e7adea4-3886-49a0-a706-af1339aea236","Type":"ContainerStarted","Data":"65eb2654987224e1f9321acbaf61800b07a361e3687223b324bb7898f0e8596e"} Apr 17 08:05:38.704891 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:38.704577 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:38.727631 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:38.727568 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" podStartSLOduration=3.7275485 podStartE2EDuration="3.7275485s" podCreationTimestamp="2026-04-17 08:05:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:05:38.725100166 +0000 UTC m=+842.512878773" watchObservedRunningTime="2026-04-17 08:05:38.7275485 +0000 UTC m=+842.515327104" Apr 17 08:05:40.714890 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:40.714853 2566 generic.go:358] "Generic (PLEG): container finished" podID="8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8" containerID="136d47913025c34628246b3492d8d2aaeba5a084b7f8b3523c16387d2d872aa2" exitCode=0 Apr 17 08:05:40.715247 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:40.714924 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" event={"ID":"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8","Type":"ContainerDied","Data":"136d47913025c34628246b3492d8d2aaeba5a084b7f8b3523c16387d2d872aa2"} Apr 17 08:05:42.728794 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:42.728750 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" event={"ID":"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8","Type":"ContainerStarted","Data":"61a715eb70f6033ddf0e0e5817f5dfbcbd7bb555cc917730395c4015beb21541"} Apr 17 08:05:42.748859 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:42.748767 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" podStartSLOduration=6.643013525 podStartE2EDuration="7.748748068s" podCreationTimestamp="2026-04-17 08:05:35 +0000 UTC" firstStartedPulling="2026-04-17 08:05:40.716041368 +0000 UTC m=+844.503819954" lastFinishedPulling="2026-04-17 08:05:41.821775917 +0000 UTC m=+845.609554497" observedRunningTime="2026-04-17 08:05:42.747834641 +0000 UTC m=+846.535613265" watchObservedRunningTime="2026-04-17 08:05:42.748748068 +0000 UTC m=+846.536526672" Apr 17 08:05:46.013674 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:46.013624 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:46.014231 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:46.013714 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:46.026254 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:46.026223 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:05:46.351564 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:46.351512 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:46.351564 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:46.351574 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:46.354492 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:46.354465 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:46.746386 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:46.746297 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:05:46.756648 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:05:46.756623 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:06:07.750936 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:07.750901 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:06:08.509784 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.509746 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n"] Apr 17 08:06:08.510195 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.510160 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" podUID="5e7adea4-3886-49a0-a706-af1339aea236" containerName="main" containerID="cri-o://65eb2654987224e1f9321acbaf61800b07a361e3687223b324bb7898f0e8596e" gracePeriod=30 Apr 17 08:06:08.510330 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.510219 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" 
podUID="5e7adea4-3886-49a0-a706-af1339aea236" containerName="tokenizer" containerID="cri-o://bf947d9cef1446ab4342814718721bee4c29359dec44cc1df5f4aa097b308c59" gracePeriod=30 Apr 17 08:06:08.519223 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.519193 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4"] Apr 17 08:06:08.519537 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.519510 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" podUID="8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8" containerName="main" containerID="cri-o://61a715eb70f6033ddf0e0e5817f5dfbcbd7bb555cc917730395c4015beb21541" gracePeriod=30 Apr 17 08:06:08.790851 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.790827 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:06:08.841588 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.841547 2566 generic.go:358] "Generic (PLEG): container finished" podID="5e7adea4-3886-49a0-a706-af1339aea236" containerID="65eb2654987224e1f9321acbaf61800b07a361e3687223b324bb7898f0e8596e" exitCode=0 Apr 17 08:06:08.841831 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.841631 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" event={"ID":"5e7adea4-3886-49a0-a706-af1339aea236","Type":"ContainerDied","Data":"65eb2654987224e1f9321acbaf61800b07a361e3687223b324bb7898f0e8596e"} Apr 17 08:06:08.843158 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.843129 2566 generic.go:358] "Generic (PLEG): container finished" podID="8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8" containerID="61a715eb70f6033ddf0e0e5817f5dfbcbd7bb555cc917730395c4015beb21541" exitCode=0 Apr 17 08:06:08.843276 ip-10-0-128-217 
kubenswrapper[2566]: I0417 08:06:08.843214 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" Apr 17 08:06:08.843276 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.843213 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" event={"ID":"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8","Type":"ContainerDied","Data":"61a715eb70f6033ddf0e0e5817f5dfbcbd7bb555cc917730395c4015beb21541"} Apr 17 08:06:08.843346 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.843295 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4" event={"ID":"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8","Type":"ContainerDied","Data":"2f2ae1eb03e8ee7943bdab977fb581d112209e995a1d63e4918a30b4390bad3f"} Apr 17 08:06:08.843346 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.843314 2566 scope.go:117] "RemoveContainer" containerID="61a715eb70f6033ddf0e0e5817f5dfbcbd7bb555cc917730395c4015beb21541" Apr 17 08:06:08.852250 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.852232 2566 scope.go:117] "RemoveContainer" containerID="136d47913025c34628246b3492d8d2aaeba5a084b7f8b3523c16387d2d872aa2" Apr 17 08:06:08.915823 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.915796 2566 scope.go:117] "RemoveContainer" containerID="61a715eb70f6033ddf0e0e5817f5dfbcbd7bb555cc917730395c4015beb21541" Apr 17 08:06:08.916167 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:06:08.916146 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61a715eb70f6033ddf0e0e5817f5dfbcbd7bb555cc917730395c4015beb21541\": container with ID starting with 61a715eb70f6033ddf0e0e5817f5dfbcbd7bb555cc917730395c4015beb21541 not found: ID does not exist" 
containerID="61a715eb70f6033ddf0e0e5817f5dfbcbd7bb555cc917730395c4015beb21541" Apr 17 08:06:08.916225 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.916180 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61a715eb70f6033ddf0e0e5817f5dfbcbd7bb555cc917730395c4015beb21541"} err="failed to get container status \"61a715eb70f6033ddf0e0e5817f5dfbcbd7bb555cc917730395c4015beb21541\": rpc error: code = NotFound desc = could not find container \"61a715eb70f6033ddf0e0e5817f5dfbcbd7bb555cc917730395c4015beb21541\": container with ID starting with 61a715eb70f6033ddf0e0e5817f5dfbcbd7bb555cc917730395c4015beb21541 not found: ID does not exist" Apr 17 08:06:08.916225 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.916199 2566 scope.go:117] "RemoveContainer" containerID="136d47913025c34628246b3492d8d2aaeba5a084b7f8b3523c16387d2d872aa2" Apr 17 08:06:08.916493 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:06:08.916479 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"136d47913025c34628246b3492d8d2aaeba5a084b7f8b3523c16387d2d872aa2\": container with ID starting with 136d47913025c34628246b3492d8d2aaeba5a084b7f8b3523c16387d2d872aa2 not found: ID does not exist" containerID="136d47913025c34628246b3492d8d2aaeba5a084b7f8b3523c16387d2d872aa2" Apr 17 08:06:08.916565 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.916497 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136d47913025c34628246b3492d8d2aaeba5a084b7f8b3523c16387d2d872aa2"} err="failed to get container status \"136d47913025c34628246b3492d8d2aaeba5a084b7f8b3523c16387d2d872aa2\": rpc error: code = NotFound desc = could not find container \"136d47913025c34628246b3492d8d2aaeba5a084b7f8b3523c16387d2d872aa2\": container with ID starting with 136d47913025c34628246b3492d8d2aaeba5a084b7f8b3523c16387d2d872aa2 not found: ID does not exist" Apr 17 
08:06:08.948935 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.948892 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlg8d\" (UniqueName: \"kubernetes.io/projected/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-kube-api-access-mlg8d\") pod \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " Apr 17 08:06:08.949100 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.948964 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-model-cache\") pod \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " Apr 17 08:06:08.949100 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.948990 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-kserve-provision-location\") pod \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " Apr 17 08:06:08.949100 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.949026 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-tls-certs\") pod \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " Apr 17 08:06:08.949100 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.949070 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-dshm\") pod \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " Apr 17 08:06:08.949100 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.949099 2566 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-home\") pod \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\" (UID: \"8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8\") " Apr 17 08:06:08.949358 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.949268 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-model-cache" (OuterVolumeSpecName: "model-cache") pod "8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8" (UID: "8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:06:08.949420 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.949400 2566 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-model-cache\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:06:08.949504 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.949482 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-home" (OuterVolumeSpecName: "home") pod "8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8" (UID: "8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:06:08.951282 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.951257 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8" (UID: "8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:06:08.951440 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.951418 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-dshm" (OuterVolumeSpecName: "dshm") pod "8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8" (UID: "8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:06:08.951555 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:08.951541 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-kube-api-access-mlg8d" (OuterVolumeSpecName: "kube-api-access-mlg8d") pod "8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8" (UID: "8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8"). InnerVolumeSpecName "kube-api-access-mlg8d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:06:09.004473 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.004435 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8" (UID: "8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:06:09.049964 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.049924 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:06:09.049964 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.049958 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-tls-certs\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:06:09.049964 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.049968 2566 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-dshm\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:06:09.049964 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.049976 2566 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-home\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:06:09.050242 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.049985 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mlg8d\" (UniqueName: \"kubernetes.io/projected/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8-kube-api-access-mlg8d\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:06:09.164837 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.164805 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4"] Apr 17 08:06:09.168303 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.168266 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-64dc69f7f7-lzjz4"] Apr 17 08:06:09.849712 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.849611 2566 generic.go:358] "Generic (PLEG): container finished" podID="5e7adea4-3886-49a0-a706-af1339aea236" containerID="bf947d9cef1446ab4342814718721bee4c29359dec44cc1df5f4aa097b308c59" exitCode=0 Apr 17 08:06:09.850134 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.849687 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" event={"ID":"5e7adea4-3886-49a0-a706-af1339aea236","Type":"ContainerDied","Data":"bf947d9cef1446ab4342814718721bee4c29359dec44cc1df5f4aa097b308c59"} Apr 17 08:06:09.878114 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.878090 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:06:09.959878 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.959837 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94vwf\" (UniqueName: \"kubernetes.io/projected/5e7adea4-3886-49a0-a706-af1339aea236-kube-api-access-94vwf\") pod \"5e7adea4-3886-49a0-a706-af1339aea236\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " Apr 17 08:06:09.959878 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.959878 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e7adea4-3886-49a0-a706-af1339aea236-tls-certs\") pod \"5e7adea4-3886-49a0-a706-af1339aea236\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " Apr 17 08:06:09.960126 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.959968 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-tokenizer-cache\") pod \"5e7adea4-3886-49a0-a706-af1339aea236\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " Apr 17 08:06:09.960126 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.960008 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-tokenizer-uds\") pod \"5e7adea4-3886-49a0-a706-af1339aea236\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " Apr 17 08:06:09.960126 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.960027 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-tokenizer-tmp\") pod \"5e7adea4-3886-49a0-a706-af1339aea236\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " Apr 17 08:06:09.960126 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.960046 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-kserve-provision-location\") pod \"5e7adea4-3886-49a0-a706-af1339aea236\" (UID: \"5e7adea4-3886-49a0-a706-af1339aea236\") " Apr 17 08:06:09.960356 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.960319 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "5e7adea4-3886-49a0-a706-af1339aea236" (UID: "5e7adea4-3886-49a0-a706-af1339aea236"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:06:09.960421 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.960349 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "5e7adea4-3886-49a0-a706-af1339aea236" (UID: "5e7adea4-3886-49a0-a706-af1339aea236"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:06:09.960421 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.960337 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "5e7adea4-3886-49a0-a706-af1339aea236" (UID: "5e7adea4-3886-49a0-a706-af1339aea236"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:06:09.960839 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.960818 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5e7adea4-3886-49a0-a706-af1339aea236" (UID: "5e7adea4-3886-49a0-a706-af1339aea236"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:06:09.962091 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.962070 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e7adea4-3886-49a0-a706-af1339aea236-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5e7adea4-3886-49a0-a706-af1339aea236" (UID: "5e7adea4-3886-49a0-a706-af1339aea236"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:06:09.962173 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:09.962094 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e7adea4-3886-49a0-a706-af1339aea236-kube-api-access-94vwf" (OuterVolumeSpecName: "kube-api-access-94vwf") pod "5e7adea4-3886-49a0-a706-af1339aea236" (UID: "5e7adea4-3886-49a0-a706-af1339aea236"). InnerVolumeSpecName "kube-api-access-94vwf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:06:10.061551 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:10.061494 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-tokenizer-cache\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:06:10.061551 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:10.061544 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-tokenizer-uds\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:06:10.061551 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:10.061555 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-tokenizer-tmp\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:06:10.061551 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:10.061565 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e7adea4-3886-49a0-a706-af1339aea236-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:06:10.061864 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:10.061575 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-94vwf\" (UniqueName: 
\"kubernetes.io/projected/5e7adea4-3886-49a0-a706-af1339aea236-kube-api-access-94vwf\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:06:10.061864 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:10.061585 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5e7adea4-3886-49a0-a706-af1339aea236-tls-certs\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:06:10.808509 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:10.808462 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8" path="/var/lib/kubelet/pods/8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8/volumes" Apr 17 08:06:10.856223 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:10.856190 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" event={"ID":"5e7adea4-3886-49a0-a706-af1339aea236","Type":"ContainerDied","Data":"d4d7a06ab53a7c0fefd660c3204bafdf0d0fe8a91c1ede03975e26e669ad4548"} Apr 17 08:06:10.856223 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:10.856219 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n" Apr 17 08:06:10.856650 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:10.856242 2566 scope.go:117] "RemoveContainer" containerID="bf947d9cef1446ab4342814718721bee4c29359dec44cc1df5f4aa097b308c59" Apr 17 08:06:10.864469 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:10.864446 2566 scope.go:117] "RemoveContainer" containerID="65eb2654987224e1f9321acbaf61800b07a361e3687223b324bb7898f0e8596e" Apr 17 08:06:10.872046 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:10.871978 2566 scope.go:117] "RemoveContainer" containerID="4645521fb198d1739b53ce59f7eacc0234235876482b277a912715d1ac2c5d71" Apr 17 08:06:10.875111 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:10.875088 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n"] Apr 17 08:06:10.878688 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:10.878662 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5c9568587l8n"] Apr 17 08:06:12.808496 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:12.808461 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e7adea4-3886-49a0-a706-af1339aea236" path="/var/lib/kubelet/pods/5e7adea4-3886-49a0-a706-af1339aea236/volumes" Apr 17 08:06:27.797394 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.797357 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls"] Apr 17 08:06:27.797928 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.797822 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e7adea4-3886-49a0-a706-af1339aea236" containerName="storage-initializer" Apr 17 08:06:27.797928 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.797836 2566 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="5e7adea4-3886-49a0-a706-af1339aea236" containerName="storage-initializer" Apr 17 08:06:27.797928 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.797845 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e7adea4-3886-49a0-a706-af1339aea236" containerName="main" Apr 17 08:06:27.797928 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.797851 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e7adea4-3886-49a0-a706-af1339aea236" containerName="main" Apr 17 08:06:27.797928 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.797872 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8" containerName="storage-initializer" Apr 17 08:06:27.797928 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.797881 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8" containerName="storage-initializer" Apr 17 08:06:27.797928 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.797901 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8" containerName="main" Apr 17 08:06:27.797928 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.797910 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8" containerName="main" Apr 17 08:06:27.797928 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.797920 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e7adea4-3886-49a0-a706-af1339aea236" containerName="tokenizer" Apr 17 08:06:27.797928 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.797930 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e7adea4-3886-49a0-a706-af1339aea236" containerName="tokenizer" Apr 17 08:06:27.798555 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.798027 2566 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="5e7adea4-3886-49a0-a706-af1339aea236" containerName="tokenizer" Apr 17 08:06:27.798555 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.798044 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e7adea4-3886-49a0-a706-af1339aea236" containerName="main" Apr 17 08:06:27.798555 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.798055 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ade2c3a-4e30-4cf0-a773-8106a6bf2ea8" containerName="main" Apr 17 08:06:27.801672 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.801651 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" Apr 17 08:06:27.804469 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.804439 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-jj9s6\"" Apr 17 08:06:27.805559 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.804793 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 17 08:06:27.813226 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.812943 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls"] Apr 17 08:06:27.824927 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.824894 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-model-cache\") pod \"precise-prefix-cache-test-kserve-c44d4d4cc-qfwls\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" Apr 17 08:06:27.825092 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.824955 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ead1cb28-0293-45e6-b6cf-5badcf7a401a-tls-certs\") pod \"precise-prefix-cache-test-kserve-c44d4d4cc-qfwls\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" Apr 17 08:06:27.825092 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.824986 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-home\") pod \"precise-prefix-cache-test-kserve-c44d4d4cc-qfwls\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" Apr 17 08:06:27.825092 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.825009 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-dshm\") pod \"precise-prefix-cache-test-kserve-c44d4d4cc-qfwls\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" Apr 17 08:06:27.825092 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.825036 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-c44d4d4cc-qfwls\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" Apr 17 08:06:27.825309 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.825117 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpwxh\" (UniqueName: 
\"kubernetes.io/projected/ead1cb28-0293-45e6-b6cf-5badcf7a401a-kube-api-access-tpwxh\") pod \"precise-prefix-cache-test-kserve-c44d4d4cc-qfwls\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" Apr 17 08:06:27.925889 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.925856 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-model-cache\") pod \"precise-prefix-cache-test-kserve-c44d4d4cc-qfwls\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" Apr 17 08:06:27.926106 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.925904 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ead1cb28-0293-45e6-b6cf-5badcf7a401a-tls-certs\") pod \"precise-prefix-cache-test-kserve-c44d4d4cc-qfwls\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" Apr 17 08:06:27.926106 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.925926 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-home\") pod \"precise-prefix-cache-test-kserve-c44d4d4cc-qfwls\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" Apr 17 08:06:27.926106 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.925944 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-dshm\") pod \"precise-prefix-cache-test-kserve-c44d4d4cc-qfwls\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" Apr 17 08:06:27.926106 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.925972 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-c44d4d4cc-qfwls\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" Apr 17 08:06:27.926106 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.926024 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpwxh\" (UniqueName: \"kubernetes.io/projected/ead1cb28-0293-45e6-b6cf-5badcf7a401a-kube-api-access-tpwxh\") pod \"precise-prefix-cache-test-kserve-c44d4d4cc-qfwls\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" Apr 17 08:06:27.926356 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.926332 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-model-cache\") pod \"precise-prefix-cache-test-kserve-c44d4d4cc-qfwls\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" Apr 17 08:06:27.926420 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.926380 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-home\") pod \"precise-prefix-cache-test-kserve-c44d4d4cc-qfwls\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" Apr 17 08:06:27.926474 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.926446 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-c44d4d4cc-qfwls\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" Apr 17 08:06:27.928323 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.928291 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-dshm\") pod \"precise-prefix-cache-test-kserve-c44d4d4cc-qfwls\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" Apr 17 08:06:27.928515 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.928499 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ead1cb28-0293-45e6-b6cf-5badcf7a401a-tls-certs\") pod \"precise-prefix-cache-test-kserve-c44d4d4cc-qfwls\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" Apr 17 08:06:27.933936 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:27.933916 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpwxh\" (UniqueName: \"kubernetes.io/projected/ead1cb28-0293-45e6-b6cf-5badcf7a401a-kube-api-access-tpwxh\") pod \"precise-prefix-cache-test-kserve-c44d4d4cc-qfwls\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" Apr 17 08:06:28.046764 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.046715 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"] Apr 17 08:06:28.050822 ip-10-0-128-217 kubenswrapper[2566]: 
I0417 08:06:28.050772 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:28.053881 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.053855 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-5cd7p\""
Apr 17 08:06:28.063828 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.063796 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"]
Apr 17 08:06:28.116540 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.116497 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls"
Apr 17 08:06:28.127833 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.127793 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:28.127991 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.127842 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:28.127991 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.127874 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljrtj\" (UniqueName: \"kubernetes.io/projected/43061f9b-79d8-43d6-9e00-80f818737384-kube-api-access-ljrtj\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:28.127991 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.127944 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:28.127991 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.127972 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/43061f9b-79d8-43d6-9e00-80f818737384-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:28.128222 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.128074 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:28.229590 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.229557 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:28.229798 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.229604 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/43061f9b-79d8-43d6-9e00-80f818737384-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:28.229798 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.229663 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:28.229798 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.229732 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:28.229798 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.229770 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:28.229798 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.229798 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljrtj\" (UniqueName: \"kubernetes.io/projected/43061f9b-79d8-43d6-9e00-80f818737384-kube-api-access-ljrtj\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:28.230073 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.230034 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:28.230327 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.230303 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:28.230434 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.230340 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:28.230542 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.230524 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:28.232591 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.232565 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/43061f9b-79d8-43d6-9e00-80f818737384-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:28.238843 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.238816 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljrtj\" (UniqueName: \"kubernetes.io/projected/43061f9b-79d8-43d6-9e00-80f818737384-kube-api-access-ljrtj\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:28.252960 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.252931 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls"]
Apr 17 08:06:28.255135 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:06:28.255108 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podead1cb28_0293_45e6_b6cf_5badcf7a401a.slice/crio-c3b4c9b6146059cab1be31702a60dae03e74cb7f9f787a4eb40d64725d6bc8fc WatchSource:0}: Error finding container c3b4c9b6146059cab1be31702a60dae03e74cb7f9f787a4eb40d64725d6bc8fc: Status 404 returned error can't find the container with id c3b4c9b6146059cab1be31702a60dae03e74cb7f9f787a4eb40d64725d6bc8fc
Apr 17 08:06:28.363048 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.363012 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:28.497761 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.497690 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"]
Apr 17 08:06:28.501856 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:06:28.501827 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43061f9b_79d8_43d6_9e00_80f818737384.slice/crio-e551c8593a6f66c10a922f958ff1a550797f53d0727577478e95b710b08eca6b WatchSource:0}: Error finding container e551c8593a6f66c10a922f958ff1a550797f53d0727577478e95b710b08eca6b: Status 404 returned error can't find the container with id e551c8593a6f66c10a922f958ff1a550797f53d0727577478e95b710b08eca6b
Apr 17 08:06:28.932304 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.932193 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" event={"ID":"ead1cb28-0293-45e6-b6cf-5badcf7a401a","Type":"ContainerStarted","Data":"9296ab5b184307ffd0b687aa18c950d659d08dab875ad973ce937d80105d5774"}
Apr 17 08:06:28.932304 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.932250 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" event={"ID":"ead1cb28-0293-45e6-b6cf-5badcf7a401a","Type":"ContainerStarted","Data":"c3b4c9b6146059cab1be31702a60dae03e74cb7f9f787a4eb40d64725d6bc8fc"}
Apr 17 08:06:28.934434 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.934406 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg" event={"ID":"43061f9b-79d8-43d6-9e00-80f818737384","Type":"ContainerStarted","Data":"af78f5a6eb4646133e405aa95a7faa3b30ac1253a5eaf9990db926a33754f44a"}
Apr 17 08:06:28.934570 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:28.934440 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg" event={"ID":"43061f9b-79d8-43d6-9e00-80f818737384","Type":"ContainerStarted","Data":"e551c8593a6f66c10a922f958ff1a550797f53d0727577478e95b710b08eca6b"}
Apr 17 08:06:29.941573 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:29.941539 2566 generic.go:358] "Generic (PLEG): container finished" podID="43061f9b-79d8-43d6-9e00-80f818737384" containerID="af78f5a6eb4646133e405aa95a7faa3b30ac1253a5eaf9990db926a33754f44a" exitCode=0
Apr 17 08:06:29.942072 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:29.941638 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg" event={"ID":"43061f9b-79d8-43d6-9e00-80f818737384","Type":"ContainerDied","Data":"af78f5a6eb4646133e405aa95a7faa3b30ac1253a5eaf9990db926a33754f44a"}
Apr 17 08:06:30.949845 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:30.949761 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg" event={"ID":"43061f9b-79d8-43d6-9e00-80f818737384","Type":"ContainerStarted","Data":"de528438b29bd5e57bb97cdae3b541bb84fadc29ad76d2ea681f467ac600ade0"}
Apr 17 08:06:30.949845 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:30.949804 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg" event={"ID":"43061f9b-79d8-43d6-9e00-80f818737384","Type":"ContainerStarted","Data":"a75203138f55ebabee1dcd9604a8a60915b4c66a91a8ee6e5eed73ec35782d32"}
Apr 17 08:06:30.950606 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:30.950108 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:30.970399 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:30.970338 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg" podStartSLOduration=2.970323382 podStartE2EDuration="2.970323382s" podCreationTimestamp="2026-04-17 08:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:06:30.968730111 +0000 UTC m=+894.756508714" watchObservedRunningTime="2026-04-17 08:06:30.970323382 +0000 UTC m=+894.758101984"
Apr 17 08:06:32.959044 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:32.958998 2566 generic.go:358] "Generic (PLEG): container finished" podID="ead1cb28-0293-45e6-b6cf-5badcf7a401a" containerID="9296ab5b184307ffd0b687aa18c950d659d08dab875ad973ce937d80105d5774" exitCode=0
Apr 17 08:06:32.959520 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:32.959070 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" event={"ID":"ead1cb28-0293-45e6-b6cf-5badcf7a401a","Type":"ContainerDied","Data":"9296ab5b184307ffd0b687aa18c950d659d08dab875ad973ce937d80105d5774"}
Apr 17 08:06:33.964840 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:33.964800 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" event={"ID":"ead1cb28-0293-45e6-b6cf-5badcf7a401a","Type":"ContainerStarted","Data":"8966cabbd568bc199d30d4da0e471a8550e3c6bdf10d3bdd9f5edb640e79cbae"}
Apr 17 08:06:33.983222 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:33.983171 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" podStartSLOduration=6.983155085 podStartE2EDuration="6.983155085s" podCreationTimestamp="2026-04-17 08:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:06:33.981814534 +0000 UTC m=+897.769593140" watchObservedRunningTime="2026-04-17 08:06:33.983155085 +0000 UTC m=+897.770933696"
Apr 17 08:06:36.770176 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:36.770146 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2blbh_69ef651a-e48e-4389-a551-48cc6423bcd0/console-operator/1.log"
Apr 17 08:06:36.773440 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:36.773417 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2blbh_69ef651a-e48e-4389-a551-48cc6423bcd0/console-operator/1.log"
Apr 17 08:06:36.775904 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:36.775878 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/ovn-acl-logging/0.log"
Apr 17 08:06:36.779302 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:36.779273 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/ovn-acl-logging/0.log"
Apr 17 08:06:38.117234 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:38.117192 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls"
Apr 17 08:06:38.117624 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:38.117250 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls"
Apr 17 08:06:38.130234 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:38.130208 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls"
Apr 17 08:06:38.363548 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:38.363510 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:38.363766 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:38.363741 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:38.364816 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:06:38.364796 2566 logging.go:55] [core] [Channel #53 SubChannel #54]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.43:9003", ServerName: "10.132.0.43:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.43:9003: connect: connection refused"
Apr 17 08:06:38.366125 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:38.366105 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:38.990089 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:38.990055 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:06:39.000284 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:39.000259 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls"
Apr 17 08:06:39.364757 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:39.364689 2566 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg" podUID="43061f9b-79d8-43d6-9e00-80f818737384" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.43:9003\" within 1s: context deadline exceeded"
Apr 17 08:06:48.363961 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:06:48.363927 2566 logging.go:55] [core] [Channel #55 SubChannel #56]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.43:9003", ServerName: "10.132.0.43:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.43:9003: connect: connection refused"
Apr 17 08:06:49.363631 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:06:49.363584 2566 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg" podUID="43061f9b-79d8-43d6-9e00-80f818737384" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.43:9003\" within 1s: context deadline exceeded"
Apr 17 08:07:00.997766 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:00.997736 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:07:02.431409 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.431369 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"]
Apr 17 08:07:02.432330 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.432282 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg" podUID="43061f9b-79d8-43d6-9e00-80f818737384" containerName="main" containerID="cri-o://a75203138f55ebabee1dcd9604a8a60915b4c66a91a8ee6e5eed73ec35782d32" gracePeriod=30
Apr 17 08:07:02.432846 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.432788 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg" podUID="43061f9b-79d8-43d6-9e00-80f818737384" containerName="tokenizer" containerID="cri-o://de528438b29bd5e57bb97cdae3b541bb84fadc29ad76d2ea681f467ac600ade0" gracePeriod=30
Apr 17 08:07:02.442079 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.442054 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls"]
Apr 17 08:07:02.442401 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.442374 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" podUID="ead1cb28-0293-45e6-b6cf-5badcf7a401a" containerName="main" containerID="cri-o://8966cabbd568bc199d30d4da0e471a8550e3c6bdf10d3bdd9f5edb640e79cbae" gracePeriod=30
Apr 17 08:07:02.699500 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.699477 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls"
Apr 17 08:07:02.773208 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.773169 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-home\") pod \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") "
Apr 17 08:07:02.773368 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.773226 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-kserve-provision-location\") pod \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") "
Apr 17 08:07:02.773368 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.773262 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-dshm\") pod \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") "
Apr 17 08:07:02.773368 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.773290 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ead1cb28-0293-45e6-b6cf-5badcf7a401a-tls-certs\") pod \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") "
Apr 17 08:07:02.773368 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.773350 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpwxh\" (UniqueName: \"kubernetes.io/projected/ead1cb28-0293-45e6-b6cf-5badcf7a401a-kube-api-access-tpwxh\") pod \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") "
Apr 17 08:07:02.773593 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.773437 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-model-cache\") pod \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\" (UID: \"ead1cb28-0293-45e6-b6cf-5badcf7a401a\") "
Apr 17 08:07:02.773593 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.773470 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-home" (OuterVolumeSpecName: "home") pod "ead1cb28-0293-45e6-b6cf-5badcf7a401a" (UID: "ead1cb28-0293-45e6-b6cf-5badcf7a401a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:07:02.773784 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.773763 2566 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-home\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 08:07:02.773876 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.773849 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-model-cache" (OuterVolumeSpecName: "model-cache") pod "ead1cb28-0293-45e6-b6cf-5badcf7a401a" (UID: "ead1cb28-0293-45e6-b6cf-5badcf7a401a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:07:02.775506 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.775473 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-dshm" (OuterVolumeSpecName: "dshm") pod "ead1cb28-0293-45e6-b6cf-5badcf7a401a" (UID: "ead1cb28-0293-45e6-b6cf-5badcf7a401a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:07:02.775823 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.775783 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ead1cb28-0293-45e6-b6cf-5badcf7a401a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ead1cb28-0293-45e6-b6cf-5badcf7a401a" (UID: "ead1cb28-0293-45e6-b6cf-5badcf7a401a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 08:07:02.776275 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.776249 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ead1cb28-0293-45e6-b6cf-5badcf7a401a-kube-api-access-tpwxh" (OuterVolumeSpecName: "kube-api-access-tpwxh") pod "ead1cb28-0293-45e6-b6cf-5badcf7a401a" (UID: "ead1cb28-0293-45e6-b6cf-5badcf7a401a"). InnerVolumeSpecName "kube-api-access-tpwxh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 08:07:02.829473 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.829410 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ead1cb28-0293-45e6-b6cf-5badcf7a401a" (UID: "ead1cb28-0293-45e6-b6cf-5badcf7a401a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:07:02.874925 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.874886 2566 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-model-cache\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 08:07:02.874925 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.874925 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 08:07:02.875135 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.874941 2566 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ead1cb28-0293-45e6-b6cf-5badcf7a401a-dshm\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 08:07:02.875135 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.874956 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ead1cb28-0293-45e6-b6cf-5badcf7a401a-tls-certs\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 08:07:02.875135 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:02.874970 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tpwxh\" (UniqueName: \"kubernetes.io/projected/ead1cb28-0293-45e6-b6cf-5badcf7a401a-kube-api-access-tpwxh\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 08:07:03.077455 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:03.077422 2566 generic.go:358] "Generic (PLEG): container finished" podID="ead1cb28-0293-45e6-b6cf-5badcf7a401a" containerID="8966cabbd568bc199d30d4da0e471a8550e3c6bdf10d3bdd9f5edb640e79cbae" exitCode=0
Apr 17 08:07:03.077637 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:03.077500 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls"
Apr 17 08:07:03.077637 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:03.077505 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" event={"ID":"ead1cb28-0293-45e6-b6cf-5badcf7a401a","Type":"ContainerDied","Data":"8966cabbd568bc199d30d4da0e471a8550e3c6bdf10d3bdd9f5edb640e79cbae"}
Apr 17 08:07:03.077637 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:03.077542 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls" event={"ID":"ead1cb28-0293-45e6-b6cf-5badcf7a401a","Type":"ContainerDied","Data":"c3b4c9b6146059cab1be31702a60dae03e74cb7f9f787a4eb40d64725d6bc8fc"}
Apr 17 08:07:03.077637 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:03.077561 2566 scope.go:117] "RemoveContainer" containerID="8966cabbd568bc199d30d4da0e471a8550e3c6bdf10d3bdd9f5edb640e79cbae"
Apr 17 08:07:03.079778 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:03.079751 2566 generic.go:358] "Generic (PLEG): container finished" podID="43061f9b-79d8-43d6-9e00-80f818737384" containerID="a75203138f55ebabee1dcd9604a8a60915b4c66a91a8ee6e5eed73ec35782d32" exitCode=0
Apr 17 08:07:03.079906 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:03.079804 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg" event={"ID":"43061f9b-79d8-43d6-9e00-80f818737384","Type":"ContainerDied","Data":"a75203138f55ebabee1dcd9604a8a60915b4c66a91a8ee6e5eed73ec35782d32"}
Apr 17 08:07:03.087141 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:03.087121 2566 scope.go:117] "RemoveContainer" containerID="9296ab5b184307ffd0b687aa18c950d659d08dab875ad973ce937d80105d5774"
Apr 17 08:07:03.097472 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:03.097450 2566 scope.go:117] "RemoveContainer" containerID="8966cabbd568bc199d30d4da0e471a8550e3c6bdf10d3bdd9f5edb640e79cbae"
Apr 17 08:07:03.098133 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:07:03.097944 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8966cabbd568bc199d30d4da0e471a8550e3c6bdf10d3bdd9f5edb640e79cbae\": container with ID starting with 8966cabbd568bc199d30d4da0e471a8550e3c6bdf10d3bdd9f5edb640e79cbae not found: ID does not exist" containerID="8966cabbd568bc199d30d4da0e471a8550e3c6bdf10d3bdd9f5edb640e79cbae"
Apr 17 08:07:03.098133 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:03.097993 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8966cabbd568bc199d30d4da0e471a8550e3c6bdf10d3bdd9f5edb640e79cbae"} err="failed to get container status \"8966cabbd568bc199d30d4da0e471a8550e3c6bdf10d3bdd9f5edb640e79cbae\": rpc error: code = NotFound desc = could not find container \"8966cabbd568bc199d30d4da0e471a8550e3c6bdf10d3bdd9f5edb640e79cbae\": container with ID starting with 8966cabbd568bc199d30d4da0e471a8550e3c6bdf10d3bdd9f5edb640e79cbae not found: ID does not exist"
Apr 17 08:07:03.098133 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:03.098020 2566 scope.go:117] "RemoveContainer" containerID="9296ab5b184307ffd0b687aa18c950d659d08dab875ad973ce937d80105d5774"
Apr 17 08:07:03.098376 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:07:03.098343 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9296ab5b184307ffd0b687aa18c950d659d08dab875ad973ce937d80105d5774\": container with ID starting with 9296ab5b184307ffd0b687aa18c950d659d08dab875ad973ce937d80105d5774 not found: ID does not exist" containerID="9296ab5b184307ffd0b687aa18c950d659d08dab875ad973ce937d80105d5774"
Apr 17 08:07:03.098433 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:03.098374 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9296ab5b184307ffd0b687aa18c950d659d08dab875ad973ce937d80105d5774"} err="failed to get container status \"9296ab5b184307ffd0b687aa18c950d659d08dab875ad973ce937d80105d5774\": rpc error: code = NotFound desc = could not find container \"9296ab5b184307ffd0b687aa18c950d659d08dab875ad973ce937d80105d5774\": container with ID starting with 9296ab5b184307ffd0b687aa18c950d659d08dab875ad973ce937d80105d5774 not found: ID does not exist"
Apr 17 08:07:03.100006 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:03.099979 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls"]
Apr 17 08:07:03.106887 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:03.106862 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-c44d4d4cc-qfwls"]
Apr 17 08:07:04.001947 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.001922 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
Apr 17 08:07:04.085756 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.085651 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-tokenizer-cache\") pod \"43061f9b-79d8-43d6-9e00-80f818737384\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") "
Apr 17 08:07:04.085756 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.085688 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/43061f9b-79d8-43d6-9e00-80f818737384-tls-certs\") pod \"43061f9b-79d8-43d6-9e00-80f818737384\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") "
Apr 17 08:07:04.085756 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.085729 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-tokenizer-uds\") pod \"43061f9b-79d8-43d6-9e00-80f818737384\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") "
Apr 17 08:07:04.086021 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.085800 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljrtj\" (UniqueName: \"kubernetes.io/projected/43061f9b-79d8-43d6-9e00-80f818737384-kube-api-access-ljrtj\") pod \"43061f9b-79d8-43d6-9e00-80f818737384\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") "
Apr 17 08:07:04.086021 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.085839 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-tokenizer-tmp\") pod \"43061f9b-79d8-43d6-9e00-80f818737384\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") "
Apr 17 08:07:04.086021 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.085872 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-kserve-provision-location\") pod \"43061f9b-79d8-43d6-9e00-80f818737384\" (UID: \"43061f9b-79d8-43d6-9e00-80f818737384\") "
Apr 17 08:07:04.086175 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.086019 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "43061f9b-79d8-43d6-9e00-80f818737384" (UID: "43061f9b-79d8-43d6-9e00-80f818737384"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:07:04.086175 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.086118 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "43061f9b-79d8-43d6-9e00-80f818737384" (UID: "43061f9b-79d8-43d6-9e00-80f818737384"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:07:04.086272 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.086245 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "43061f9b-79d8-43d6-9e00-80f818737384" (UID: "43061f9b-79d8-43d6-9e00-80f818737384"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:07:04.086440 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.086421 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-tokenizer-tmp\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 08:07:04.086503 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.086447 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-tokenizer-cache\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 08:07:04.086503 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.086472 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-tokenizer-uds\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 08:07:04.086744 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.086721 2566 generic.go:358] "Generic (PLEG): container finished" podID="43061f9b-79d8-43d6-9e00-80f818737384" containerID="de528438b29bd5e57bb97cdae3b541bb84fadc29ad76d2ea681f467ac600ade0" exitCode=0
Apr 17 08:07:04.086829 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.086752 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg" event={"ID":"43061f9b-79d8-43d6-9e00-80f818737384","Type":"ContainerDied","Data":"de528438b29bd5e57bb97cdae3b541bb84fadc29ad76d2ea681f467ac600ade0"}
Apr 17 08:07:04.086829 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.086777 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"
event={"ID":"43061f9b-79d8-43d6-9e00-80f818737384","Type":"ContainerDied","Data":"e551c8593a6f66c10a922f958ff1a550797f53d0727577478e95b710b08eca6b"} Apr 17 08:07:04.086829 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.086793 2566 scope.go:117] "RemoveContainer" containerID="de528438b29bd5e57bb97cdae3b541bb84fadc29ad76d2ea681f467ac600ade0" Apr 17 08:07:04.086829 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.086819 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg" Apr 17 08:07:04.087038 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.086820 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "43061f9b-79d8-43d6-9e00-80f818737384" (UID: "43061f9b-79d8-43d6-9e00-80f818737384"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:07:04.088198 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.088174 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43061f9b-79d8-43d6-9e00-80f818737384-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "43061f9b-79d8-43d6-9e00-80f818737384" (UID: "43061f9b-79d8-43d6-9e00-80f818737384"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:07:04.088319 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.088300 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43061f9b-79d8-43d6-9e00-80f818737384-kube-api-access-ljrtj" (OuterVolumeSpecName: "kube-api-access-ljrtj") pod "43061f9b-79d8-43d6-9e00-80f818737384" (UID: "43061f9b-79d8-43d6-9e00-80f818737384"). InnerVolumeSpecName "kube-api-access-ljrtj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:07:04.100575 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.100551 2566 scope.go:117] "RemoveContainer" containerID="a75203138f55ebabee1dcd9604a8a60915b4c66a91a8ee6e5eed73ec35782d32" Apr 17 08:07:04.108252 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.108236 2566 scope.go:117] "RemoveContainer" containerID="af78f5a6eb4646133e405aa95a7faa3b30ac1253a5eaf9990db926a33754f44a" Apr 17 08:07:04.115470 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.115454 2566 scope.go:117] "RemoveContainer" containerID="de528438b29bd5e57bb97cdae3b541bb84fadc29ad76d2ea681f467ac600ade0" Apr 17 08:07:04.115737 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:07:04.115717 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de528438b29bd5e57bb97cdae3b541bb84fadc29ad76d2ea681f467ac600ade0\": container with ID starting with de528438b29bd5e57bb97cdae3b541bb84fadc29ad76d2ea681f467ac600ade0 not found: ID does not exist" containerID="de528438b29bd5e57bb97cdae3b541bb84fadc29ad76d2ea681f467ac600ade0" Apr 17 08:07:04.115782 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.115749 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de528438b29bd5e57bb97cdae3b541bb84fadc29ad76d2ea681f467ac600ade0"} err="failed to get container status \"de528438b29bd5e57bb97cdae3b541bb84fadc29ad76d2ea681f467ac600ade0\": rpc error: code = NotFound desc = could not find container \"de528438b29bd5e57bb97cdae3b541bb84fadc29ad76d2ea681f467ac600ade0\": container with ID starting with de528438b29bd5e57bb97cdae3b541bb84fadc29ad76d2ea681f467ac600ade0 not found: ID does not exist" Apr 17 08:07:04.115782 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.115769 2566 scope.go:117] "RemoveContainer" containerID="a75203138f55ebabee1dcd9604a8a60915b4c66a91a8ee6e5eed73ec35782d32" Apr 17 08:07:04.116012 ip-10-0-128-217 
kubenswrapper[2566]: E0417 08:07:04.115990 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a75203138f55ebabee1dcd9604a8a60915b4c66a91a8ee6e5eed73ec35782d32\": container with ID starting with a75203138f55ebabee1dcd9604a8a60915b4c66a91a8ee6e5eed73ec35782d32 not found: ID does not exist" containerID="a75203138f55ebabee1dcd9604a8a60915b4c66a91a8ee6e5eed73ec35782d32" Apr 17 08:07:04.116078 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.116022 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a75203138f55ebabee1dcd9604a8a60915b4c66a91a8ee6e5eed73ec35782d32"} err="failed to get container status \"a75203138f55ebabee1dcd9604a8a60915b4c66a91a8ee6e5eed73ec35782d32\": rpc error: code = NotFound desc = could not find container \"a75203138f55ebabee1dcd9604a8a60915b4c66a91a8ee6e5eed73ec35782d32\": container with ID starting with a75203138f55ebabee1dcd9604a8a60915b4c66a91a8ee6e5eed73ec35782d32 not found: ID does not exist" Apr 17 08:07:04.116078 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.116047 2566 scope.go:117] "RemoveContainer" containerID="af78f5a6eb4646133e405aa95a7faa3b30ac1253a5eaf9990db926a33754f44a" Apr 17 08:07:04.116285 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:07:04.116267 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af78f5a6eb4646133e405aa95a7faa3b30ac1253a5eaf9990db926a33754f44a\": container with ID starting with af78f5a6eb4646133e405aa95a7faa3b30ac1253a5eaf9990db926a33754f44a not found: ID does not exist" containerID="af78f5a6eb4646133e405aa95a7faa3b30ac1253a5eaf9990db926a33754f44a" Apr 17 08:07:04.116336 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.116292 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af78f5a6eb4646133e405aa95a7faa3b30ac1253a5eaf9990db926a33754f44a"} 
err="failed to get container status \"af78f5a6eb4646133e405aa95a7faa3b30ac1253a5eaf9990db926a33754f44a\": rpc error: code = NotFound desc = could not find container \"af78f5a6eb4646133e405aa95a7faa3b30ac1253a5eaf9990db926a33754f44a\": container with ID starting with af78f5a6eb4646133e405aa95a7faa3b30ac1253a5eaf9990db926a33754f44a not found: ID does not exist" Apr 17 08:07:04.187328 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.187293 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ljrtj\" (UniqueName: \"kubernetes.io/projected/43061f9b-79d8-43d6-9e00-80f818737384-kube-api-access-ljrtj\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:07:04.187328 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.187321 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43061f9b-79d8-43d6-9e00-80f818737384-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:07:04.187328 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.187333 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/43061f9b-79d8-43d6-9e00-80f818737384-tls-certs\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:07:04.409214 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.409183 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"] Apr 17 08:07:04.413466 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.413438 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6f9ff6d7b8dgg"] Apr 17 08:07:04.810788 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.810757 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43061f9b-79d8-43d6-9e00-80f818737384" 
path="/var/lib/kubelet/pods/43061f9b-79d8-43d6-9e00-80f818737384/volumes" Apr 17 08:07:04.811260 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:04.811247 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ead1cb28-0293-45e6-b6cf-5badcf7a401a" path="/var/lib/kubelet/pods/ead1cb28-0293-45e6-b6cf-5badcf7a401a/volumes" Apr 17 08:07:22.229367 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.229330 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v"] Apr 17 08:07:22.229941 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.229924 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ead1cb28-0293-45e6-b6cf-5badcf7a401a" containerName="main" Apr 17 08:07:22.230021 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.229944 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ead1cb28-0293-45e6-b6cf-5badcf7a401a" containerName="main" Apr 17 08:07:22.230021 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.229959 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43061f9b-79d8-43d6-9e00-80f818737384" containerName="storage-initializer" Apr 17 08:07:22.230021 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.229967 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="43061f9b-79d8-43d6-9e00-80f818737384" containerName="storage-initializer" Apr 17 08:07:22.230021 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.229984 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43061f9b-79d8-43d6-9e00-80f818737384" containerName="tokenizer" Apr 17 08:07:22.230021 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.229992 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="43061f9b-79d8-43d6-9e00-80f818737384" containerName="tokenizer" Apr 17 08:07:22.230298 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.230028 2566 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="43061f9b-79d8-43d6-9e00-80f818737384" containerName="main" Apr 17 08:07:22.230298 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.230037 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="43061f9b-79d8-43d6-9e00-80f818737384" containerName="main" Apr 17 08:07:22.230298 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.230049 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ead1cb28-0293-45e6-b6cf-5badcf7a401a" containerName="storage-initializer" Apr 17 08:07:22.230298 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.230058 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ead1cb28-0293-45e6-b6cf-5badcf7a401a" containerName="storage-initializer" Apr 17 08:07:22.230298 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.230134 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="43061f9b-79d8-43d6-9e00-80f818737384" containerName="tokenizer" Apr 17 08:07:22.230298 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.230146 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="ead1cb28-0293-45e6-b6cf-5badcf7a401a" containerName="main" Apr 17 08:07:22.230298 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.230159 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="43061f9b-79d8-43d6-9e00-80f818737384" containerName="main" Apr 17 08:07:22.233952 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.233930 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:22.236553 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.236530 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-jj9s6\"" Apr 17 08:07:22.237498 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.237466 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-m4ndp\"" Apr 17 08:07:22.237728 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.237690 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 17 08:07:22.243766 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.243744 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v"] Apr 17 08:07:22.355687 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.355650 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:22.355883 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.355692 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" 
Apr 17 08:07:22.355883 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.355765 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:22.355883 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.355823 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:22.355883 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.355871 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:22.356022 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.355914 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xg59\" (UniqueName: \"kubernetes.io/projected/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-kube-api-access-9xg59\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 
08:07:22.456514 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.456470 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:22.456735 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.456611 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:22.456735 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.456666 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xg59\" (UniqueName: \"kubernetes.io/projected/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-kube-api-access-9xg59\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:22.456886 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.456772 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:22.456886 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.456806 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:22.456886 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.456856 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:22.457343 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.457302 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:22.457659 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.457629 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:22.458066 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.457916 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:22.458066 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.458015 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:22.460504 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.460481 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:22.464954 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.464920 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xg59\" (UniqueName: \"kubernetes.io/projected/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-kube-api-access-9xg59\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:22.545945 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.545872 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:22.677509 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:22.677474 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v"] Apr 17 08:07:22.682579 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:07:22.682543 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podead8cd33_bbc9_4f9a_ad0b_caba8e456a7d.slice/crio-fc258429afeaa7f95de9b9e2504ebbfba0fa54fa390a68dec2403987064cb59c WatchSource:0}: Error finding container fc258429afeaa7f95de9b9e2504ebbfba0fa54fa390a68dec2403987064cb59c: Status 404 returned error can't find the container with id fc258429afeaa7f95de9b9e2504ebbfba0fa54fa390a68dec2403987064cb59c Apr 17 08:07:23.157386 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:23.157352 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" event={"ID":"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d","Type":"ContainerStarted","Data":"6308339a3df2d1a3b240b5f301bfc74e1888ecbfbbb0e0695de9f1ee62cdb607"} Apr 17 08:07:23.157386 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:23.157390 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" event={"ID":"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d","Type":"ContainerStarted","Data":"fc258429afeaa7f95de9b9e2504ebbfba0fa54fa390a68dec2403987064cb59c"} Apr 17 08:07:24.161965 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:24.161926 2566 generic.go:358] "Generic (PLEG): container finished" podID="ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d" containerID="6308339a3df2d1a3b240b5f301bfc74e1888ecbfbbb0e0695de9f1ee62cdb607" exitCode=0 Apr 17 08:07:24.162385 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:24.162041 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" event={"ID":"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d","Type":"ContainerDied","Data":"6308339a3df2d1a3b240b5f301bfc74e1888ecbfbbb0e0695de9f1ee62cdb607"} Apr 17 08:07:25.167666 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:25.167622 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" event={"ID":"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d","Type":"ContainerStarted","Data":"5f075722426d6f2b0955667b7674b81022eafded924a8154c7d09ebce27a4c38"} Apr 17 08:07:25.167666 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:25.167666 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" event={"ID":"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d","Type":"ContainerStarted","Data":"bc52bb9b497a3ecc6924d6dc87e9fd2e76c3b7748c910c26a062df741b86e718"} Apr 17 08:07:25.168217 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:25.167737 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:25.190163 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:25.190106 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" podStartSLOduration=3.190091508 podStartE2EDuration="3.190091508s" podCreationTimestamp="2026-04-17 08:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:07:25.187073673 +0000 UTC m=+948.974852311" watchObservedRunningTime="2026-04-17 08:07:25.190091508 +0000 UTC m=+948.977870107" Apr 17 08:07:32.546088 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:32.546043 2566 kubelet.go:2658] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:32.546088 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:32.546085 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:32.548774 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:32.548748 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:33.196147 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:33.196115 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:07:54.199716 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:07:54.199622 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:09:21.667519 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:21.667484 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v"] Apr 17 08:09:21.668059 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:21.667836 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" podUID="ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d" containerName="main" containerID="cri-o://bc52bb9b497a3ecc6924d6dc87e9fd2e76c3b7748c910c26a062df741b86e718" gracePeriod=30 Apr 17 08:09:21.668059 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:21.667983 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" 
podUID="ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d" containerName="tokenizer" containerID="cri-o://5f075722426d6f2b0955667b7674b81022eafded924a8154c7d09ebce27a4c38" gracePeriod=30 Apr 17 08:09:22.590617 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:22.590583 2566 generic.go:358] "Generic (PLEG): container finished" podID="ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d" containerID="bc52bb9b497a3ecc6924d6dc87e9fd2e76c3b7748c910c26a062df741b86e718" exitCode=0 Apr 17 08:09:22.590803 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:22.590666 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" event={"ID":"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d","Type":"ContainerDied","Data":"bc52bb9b497a3ecc6924d6dc87e9fd2e76c3b7748c910c26a062df741b86e718"} Apr 17 08:09:23.033906 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.033884 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:09:23.107469 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.107432 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tokenizer-cache\") pod \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " Apr 17 08:09:23.107660 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.107499 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tokenizer-uds\") pod \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " Apr 17 08:09:23.107660 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.107534 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-9xg59\" (UniqueName: \"kubernetes.io/projected/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-kube-api-access-9xg59\") pod \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " Apr 17 08:09:23.107660 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.107576 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-kserve-provision-location\") pod \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " Apr 17 08:09:23.107876 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.107721 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tokenizer-tmp\") pod \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " Apr 17 08:09:23.107876 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.107776 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tls-certs\") pod \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\" (UID: \"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d\") " Apr 17 08:09:23.107876 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.107782 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d" (UID: "ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:09:23.107876 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.107807 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d" (UID: "ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:09:23.108091 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.108059 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d" (UID: "ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:09:23.108136 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.108112 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tokenizer-uds\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:09:23.108180 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.108134 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tokenizer-cache\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:09:23.108356 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.108337 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d" (UID: "ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:09:23.109731 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.109680 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d" (UID: "ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:09:23.109831 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.109792 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-kube-api-access-9xg59" (OuterVolumeSpecName: "kube-api-access-9xg59") pod "ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d" (UID: "ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d"). InnerVolumeSpecName "kube-api-access-9xg59". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:09:23.208797 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.208690 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tokenizer-tmp\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:09:23.208797 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.208742 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-tls-certs\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:09:23.208797 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.208752 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9xg59\" (UniqueName: \"kubernetes.io/projected/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-kube-api-access-9xg59\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:09:23.208797 
ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.208760 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:09:23.597240 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.597207 2566 generic.go:358] "Generic (PLEG): container finished" podID="ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d" containerID="5f075722426d6f2b0955667b7674b81022eafded924a8154c7d09ebce27a4c38" exitCode=0 Apr 17 08:09:23.597395 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.597282 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" Apr 17 08:09:23.597395 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.597296 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" event={"ID":"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d","Type":"ContainerDied","Data":"5f075722426d6f2b0955667b7674b81022eafded924a8154c7d09ebce27a4c38"} Apr 17 08:09:23.597395 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.597336 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v" event={"ID":"ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d","Type":"ContainerDied","Data":"fc258429afeaa7f95de9b9e2504ebbfba0fa54fa390a68dec2403987064cb59c"} Apr 17 08:09:23.597395 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.597352 2566 scope.go:117] "RemoveContainer" containerID="5f075722426d6f2b0955667b7674b81022eafded924a8154c7d09ebce27a4c38" Apr 17 08:09:23.608572 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.608539 2566 scope.go:117] "RemoveContainer" containerID="bc52bb9b497a3ecc6924d6dc87e9fd2e76c3b7748c910c26a062df741b86e718" Apr 17 08:09:23.616514 
ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.616490 2566 scope.go:117] "RemoveContainer" containerID="6308339a3df2d1a3b240b5f301bfc74e1888ecbfbbb0e0695de9f1ee62cdb607" Apr 17 08:09:23.619438 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.619414 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v"] Apr 17 08:09:23.624742 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.624721 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-t7g5v"] Apr 17 08:09:23.625877 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.625862 2566 scope.go:117] "RemoveContainer" containerID="5f075722426d6f2b0955667b7674b81022eafded924a8154c7d09ebce27a4c38" Apr 17 08:09:23.626138 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:09:23.626114 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f075722426d6f2b0955667b7674b81022eafded924a8154c7d09ebce27a4c38\": container with ID starting with 5f075722426d6f2b0955667b7674b81022eafded924a8154c7d09ebce27a4c38 not found: ID does not exist" containerID="5f075722426d6f2b0955667b7674b81022eafded924a8154c7d09ebce27a4c38" Apr 17 08:09:23.626189 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.626147 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f075722426d6f2b0955667b7674b81022eafded924a8154c7d09ebce27a4c38"} err="failed to get container status \"5f075722426d6f2b0955667b7674b81022eafded924a8154c7d09ebce27a4c38\": rpc error: code = NotFound desc = could not find container \"5f075722426d6f2b0955667b7674b81022eafded924a8154c7d09ebce27a4c38\": container with ID starting with 5f075722426d6f2b0955667b7674b81022eafded924a8154c7d09ebce27a4c38 not found: ID does not exist" Apr 17 08:09:23.626189 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.626166 2566 
scope.go:117] "RemoveContainer" containerID="bc52bb9b497a3ecc6924d6dc87e9fd2e76c3b7748c910c26a062df741b86e718" Apr 17 08:09:23.626409 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:09:23.626385 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc52bb9b497a3ecc6924d6dc87e9fd2e76c3b7748c910c26a062df741b86e718\": container with ID starting with bc52bb9b497a3ecc6924d6dc87e9fd2e76c3b7748c910c26a062df741b86e718 not found: ID does not exist" containerID="bc52bb9b497a3ecc6924d6dc87e9fd2e76c3b7748c910c26a062df741b86e718" Apr 17 08:09:23.626474 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.626420 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc52bb9b497a3ecc6924d6dc87e9fd2e76c3b7748c910c26a062df741b86e718"} err="failed to get container status \"bc52bb9b497a3ecc6924d6dc87e9fd2e76c3b7748c910c26a062df741b86e718\": rpc error: code = NotFound desc = could not find container \"bc52bb9b497a3ecc6924d6dc87e9fd2e76c3b7748c910c26a062df741b86e718\": container with ID starting with bc52bb9b497a3ecc6924d6dc87e9fd2e76c3b7748c910c26a062df741b86e718 not found: ID does not exist" Apr 17 08:09:23.626474 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.626444 2566 scope.go:117] "RemoveContainer" containerID="6308339a3df2d1a3b240b5f301bfc74e1888ecbfbbb0e0695de9f1ee62cdb607" Apr 17 08:09:23.626681 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:09:23.626663 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6308339a3df2d1a3b240b5f301bfc74e1888ecbfbbb0e0695de9f1ee62cdb607\": container with ID starting with 6308339a3df2d1a3b240b5f301bfc74e1888ecbfbbb0e0695de9f1ee62cdb607 not found: ID does not exist" containerID="6308339a3df2d1a3b240b5f301bfc74e1888ecbfbbb0e0695de9f1ee62cdb607" Apr 17 08:09:23.626742 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:23.626712 2566 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6308339a3df2d1a3b240b5f301bfc74e1888ecbfbbb0e0695de9f1ee62cdb607"} err="failed to get container status \"6308339a3df2d1a3b240b5f301bfc74e1888ecbfbbb0e0695de9f1ee62cdb607\": rpc error: code = NotFound desc = could not find container \"6308339a3df2d1a3b240b5f301bfc74e1888ecbfbbb0e0695de9f1ee62cdb607\": container with ID starting with 6308339a3df2d1a3b240b5f301bfc74e1888ecbfbbb0e0695de9f1ee62cdb607 not found: ID does not exist" Apr 17 08:09:24.807996 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:24.807962 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d" path="/var/lib/kubelet/pods/ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d/volumes" Apr 17 08:09:40.020840 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.020806 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g"] Apr 17 08:09:40.021255 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.021226 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d" containerName="tokenizer" Apr 17 08:09:40.021255 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.021243 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d" containerName="tokenizer" Apr 17 08:09:40.021353 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.021289 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d" containerName="storage-initializer" Apr 17 08:09:40.021353 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.021296 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d" containerName="storage-initializer" Apr 17 08:09:40.021353 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.021305 2566 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d" containerName="main" Apr 17 08:09:40.021353 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.021312 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d" containerName="main" Apr 17 08:09:40.021544 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.021407 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d" containerName="main" Apr 17 08:09:40.021544 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.021419 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="ead8cd33-bbc9-4f9a-ad0b-caba8e456a7d" containerName="tokenizer" Apr 17 08:09:40.026625 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.026601 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:40.029316 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.029293 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 17 08:09:40.029453 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.029293 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-jj9s6\"" Apr 17 08:09:40.030109 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.030090 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-rkhx8\"" Apr 17 08:09:40.033567 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.033517 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g"] Apr 17 08:09:40.066599 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.066559 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:40.066599 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.066596 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:40.066891 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.066691 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7875\" (UniqueName: \"kubernetes.io/projected/6734b3b2-5691-4df0-9783-307d76baa192-kube-api-access-f7875\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:40.066891 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.066777 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6734b3b2-5691-4df0-9783-307d76baa192-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:40.066891 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.066806 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:40.066891 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.066833 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:40.168258 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.168214 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7875\" (UniqueName: \"kubernetes.io/projected/6734b3b2-5691-4df0-9783-307d76baa192-kube-api-access-f7875\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:40.168439 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.168276 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6734b3b2-5691-4df0-9783-307d76baa192-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:40.168439 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.168301 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:40.168439 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.168334 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:40.168439 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.168432 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:40.168639 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.168459 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:40.168835 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.168806 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:40.168933 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.168830 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:40.168933 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.168864 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:40.168933 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.168873 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:40.170928 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.170906 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6734b3b2-5691-4df0-9783-307d76baa192-tls-certs\") pod 
\"stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:40.176051 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.176026 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7875\" (UniqueName: \"kubernetes.io/projected/6734b3b2-5691-4df0-9783-307d76baa192-kube-api-access-f7875\") pod \"stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:40.338341 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.338303 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:40.470114 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.469984 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g"] Apr 17 08:09:40.472543 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:09:40.472492 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6734b3b2_5691_4df0_9783_307d76baa192.slice/crio-7dbe9c5ff6dbf4edbb9cc67fecbd7fd297602970d1dae58f3415ff6dea60363a WatchSource:0}: Error finding container 7dbe9c5ff6dbf4edbb9cc67fecbd7fd297602970d1dae58f3415ff6dea60363a: Status 404 returned error can't find the container with id 7dbe9c5ff6dbf4edbb9cc67fecbd7fd297602970d1dae58f3415ff6dea60363a Apr 17 08:09:40.474739 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.474723 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:09:40.659329 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.659242 2566 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" event={"ID":"6734b3b2-5691-4df0-9783-307d76baa192","Type":"ContainerStarted","Data":"1a2f715e5bef5a067dc837b5395dc5e9a0739ea9834d422af6677181fa50dd62"} Apr 17 08:09:40.659329 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:40.659281 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" event={"ID":"6734b3b2-5691-4df0-9783-307d76baa192","Type":"ContainerStarted","Data":"7dbe9c5ff6dbf4edbb9cc67fecbd7fd297602970d1dae58f3415ff6dea60363a"} Apr 17 08:09:41.664152 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:41.664118 2566 generic.go:358] "Generic (PLEG): container finished" podID="6734b3b2-5691-4df0-9783-307d76baa192" containerID="1a2f715e5bef5a067dc837b5395dc5e9a0739ea9834d422af6677181fa50dd62" exitCode=0 Apr 17 08:09:41.664531 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:41.664201 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" event={"ID":"6734b3b2-5691-4df0-9783-307d76baa192","Type":"ContainerDied","Data":"1a2f715e5bef5a067dc837b5395dc5e9a0739ea9834d422af6677181fa50dd62"} Apr 17 08:09:42.670896 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:42.670863 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" event={"ID":"6734b3b2-5691-4df0-9783-307d76baa192","Type":"ContainerStarted","Data":"7861d6d2cc5c01ef4f10cce24c62a173a34e1093f62f994a92f22ea988d44ffa"} Apr 17 08:09:42.670896 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:42.670897 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" 
event={"ID":"6734b3b2-5691-4df0-9783-307d76baa192","Type":"ContainerStarted","Data":"b80bc6aae0ccb781ef8454613db85a9b4159eb0b60a9b56d757b0c22381d74aa"} Apr 17 08:09:42.671420 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:42.671005 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:42.690251 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:42.690200 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" podStartSLOduration=2.690183815 podStartE2EDuration="2.690183815s" podCreationTimestamp="2026-04-17 08:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:09:42.689330922 +0000 UTC m=+1086.477109537" watchObservedRunningTime="2026-04-17 08:09:42.690183815 +0000 UTC m=+1086.477962417" Apr 17 08:09:50.339142 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:50.339105 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:50.339142 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:50.339141 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:50.341741 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:50.341715 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 08:09:50.701626 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:09:50.701546 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" Apr 17 
08:10:11.705092 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:10:11.705057 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g"
Apr 17 08:11:31.217888 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:31.217852 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g"]
Apr 17 08:11:31.218878 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:31.218804 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" podUID="6734b3b2-5691-4df0-9783-307d76baa192" containerName="main" containerID="cri-o://b80bc6aae0ccb781ef8454613db85a9b4159eb0b60a9b56d757b0c22381d74aa" gracePeriod=30
Apr 17 08:11:31.218878 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:31.218846 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" podUID="6734b3b2-5691-4df0-9783-307d76baa192" containerName="tokenizer" containerID="cri-o://7861d6d2cc5c01ef4f10cce24c62a173a34e1093f62f994a92f22ea988d44ffa" gracePeriod=30
Apr 17 08:11:31.704777 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:11:31.704745 2566 logging.go:55] [core] [Channel #180 SubChannel #181]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.45:9003", ServerName: "10.132.0.45:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.45:9003: connect: connection refused"
Apr 17 08:11:32.068065 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.068031 2566 generic.go:358] "Generic (PLEG): container finished" podID="6734b3b2-5691-4df0-9783-307d76baa192" containerID="b80bc6aae0ccb781ef8454613db85a9b4159eb0b60a9b56d757b0c22381d74aa" exitCode=0
Apr 17 08:11:32.068279 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.068106 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" event={"ID":"6734b3b2-5691-4df0-9783-307d76baa192","Type":"ContainerDied","Data":"b80bc6aae0ccb781ef8454613db85a9b4159eb0b60a9b56d757b0c22381d74aa"}
Apr 17 08:11:32.578109 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.578075 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g"
Apr 17 08:11:32.645847 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.645755 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-tokenizer-tmp\") pod \"6734b3b2-5691-4df0-9783-307d76baa192\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") "
Apr 17 08:11:32.645847 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.645800 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-tokenizer-uds\") pod \"6734b3b2-5691-4df0-9783-307d76baa192\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") "
Apr 17 08:11:32.646051 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.645855 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-tokenizer-cache\") pod \"6734b3b2-5691-4df0-9783-307d76baa192\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") "
Apr 17 08:11:32.646051 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.645894 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6734b3b2-5691-4df0-9783-307d76baa192-tls-certs\") pod \"6734b3b2-5691-4df0-9783-307d76baa192\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") "
Apr 17 08:11:32.646051 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.645921 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7875\" (UniqueName: \"kubernetes.io/projected/6734b3b2-5691-4df0-9783-307d76baa192-kube-api-access-f7875\") pod \"6734b3b2-5691-4df0-9783-307d76baa192\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") "
Apr 17 08:11:32.646051 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.645961 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-kserve-provision-location\") pod \"6734b3b2-5691-4df0-9783-307d76baa192\" (UID: \"6734b3b2-5691-4df0-9783-307d76baa192\") "
Apr 17 08:11:32.646255 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.646100 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "6734b3b2-5691-4df0-9783-307d76baa192" (UID: "6734b3b2-5691-4df0-9783-307d76baa192"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:11:32.646255 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.646178 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "6734b3b2-5691-4df0-9783-307d76baa192" (UID: "6734b3b2-5691-4df0-9783-307d76baa192"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:11:32.646255 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.646199 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "6734b3b2-5691-4df0-9783-307d76baa192" (UID: "6734b3b2-5691-4df0-9783-307d76baa192"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:11:32.646425 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.646335 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-tokenizer-tmp\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 08:11:32.646425 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.646349 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-tokenizer-uds\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 08:11:32.646425 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.646358 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-tokenizer-cache\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 08:11:32.646830 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.646805 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6734b3b2-5691-4df0-9783-307d76baa192" (UID: "6734b3b2-5691-4df0-9783-307d76baa192"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:11:32.648079 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.648049 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6734b3b2-5691-4df0-9783-307d76baa192-kube-api-access-f7875" (OuterVolumeSpecName: "kube-api-access-f7875") pod "6734b3b2-5691-4df0-9783-307d76baa192" (UID: "6734b3b2-5691-4df0-9783-307d76baa192"). InnerVolumeSpecName "kube-api-access-f7875". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 08:11:32.648185 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.648100 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6734b3b2-5691-4df0-9783-307d76baa192-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6734b3b2-5691-4df0-9783-307d76baa192" (UID: "6734b3b2-5691-4df0-9783-307d76baa192"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 08:11:32.705242 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.705200 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" podUID="6734b3b2-5691-4df0-9783-307d76baa192" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.45:9003\" within 1s: context deadline exceeded"
Apr 17 08:11:32.747543 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.747509 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6734b3b2-5691-4df0-9783-307d76baa192-tls-certs\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 08:11:32.747543 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.747539 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f7875\" (UniqueName: \"kubernetes.io/projected/6734b3b2-5691-4df0-9783-307d76baa192-kube-api-access-f7875\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 08:11:32.747543 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:32.747549 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6734b3b2-5691-4df0-9783-307d76baa192-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 08:11:33.073189 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.073149 2566 generic.go:358] "Generic (PLEG): container finished" podID="6734b3b2-5691-4df0-9783-307d76baa192" containerID="7861d6d2cc5c01ef4f10cce24c62a173a34e1093f62f994a92f22ea988d44ffa" exitCode=0
Apr 17 08:11:33.073381 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.073231 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" event={"ID":"6734b3b2-5691-4df0-9783-307d76baa192","Type":"ContainerDied","Data":"7861d6d2cc5c01ef4f10cce24c62a173a34e1093f62f994a92f22ea988d44ffa"}
Apr 17 08:11:33.073381 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.073267 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g" event={"ID":"6734b3b2-5691-4df0-9783-307d76baa192","Type":"ContainerDied","Data":"7dbe9c5ff6dbf4edbb9cc67fecbd7fd297602970d1dae58f3415ff6dea60363a"}
Apr 17 08:11:33.073381 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.073237 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g"
Apr 17 08:11:33.073381 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.073334 2566 scope.go:117] "RemoveContainer" containerID="7861d6d2cc5c01ef4f10cce24c62a173a34e1093f62f994a92f22ea988d44ffa"
Apr 17 08:11:33.081669 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.081570 2566 scope.go:117] "RemoveContainer" containerID="b80bc6aae0ccb781ef8454613db85a9b4159eb0b60a9b56d757b0c22381d74aa"
Apr 17 08:11:33.089574 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.089549 2566 scope.go:117] "RemoveContainer" containerID="1a2f715e5bef5a067dc837b5395dc5e9a0739ea9834d422af6677181fa50dd62"
Apr 17 08:11:33.092284 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.092258 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g"]
Apr 17 08:11:33.094077 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.094054 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-d95dd49fb-zs84g"]
Apr 17 08:11:33.098402 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.098386 2566 scope.go:117] "RemoveContainer" containerID="7861d6d2cc5c01ef4f10cce24c62a173a34e1093f62f994a92f22ea988d44ffa"
Apr 17 08:11:33.098659 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:11:33.098640 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7861d6d2cc5c01ef4f10cce24c62a173a34e1093f62f994a92f22ea988d44ffa\": container with ID starting with 7861d6d2cc5c01ef4f10cce24c62a173a34e1093f62f994a92f22ea988d44ffa not found: ID does not exist" containerID="7861d6d2cc5c01ef4f10cce24c62a173a34e1093f62f994a92f22ea988d44ffa"
Apr 17 08:11:33.098725 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.098667 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7861d6d2cc5c01ef4f10cce24c62a173a34e1093f62f994a92f22ea988d44ffa"} err="failed to get container status \"7861d6d2cc5c01ef4f10cce24c62a173a34e1093f62f994a92f22ea988d44ffa\": rpc error: code = NotFound desc = could not find container \"7861d6d2cc5c01ef4f10cce24c62a173a34e1093f62f994a92f22ea988d44ffa\": container with ID starting with 7861d6d2cc5c01ef4f10cce24c62a173a34e1093f62f994a92f22ea988d44ffa not found: ID does not exist"
Apr 17 08:11:33.098725 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.098685 2566 scope.go:117] "RemoveContainer" containerID="b80bc6aae0ccb781ef8454613db85a9b4159eb0b60a9b56d757b0c22381d74aa"
Apr 17 08:11:33.098922 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:11:33.098907 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b80bc6aae0ccb781ef8454613db85a9b4159eb0b60a9b56d757b0c22381d74aa\": container with ID starting with b80bc6aae0ccb781ef8454613db85a9b4159eb0b60a9b56d757b0c22381d74aa not found: ID does not exist" containerID="b80bc6aae0ccb781ef8454613db85a9b4159eb0b60a9b56d757b0c22381d74aa"
Apr 17 08:11:33.098962 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.098927 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80bc6aae0ccb781ef8454613db85a9b4159eb0b60a9b56d757b0c22381d74aa"} err="failed to get container status \"b80bc6aae0ccb781ef8454613db85a9b4159eb0b60a9b56d757b0c22381d74aa\": rpc error: code = NotFound desc = could not find container \"b80bc6aae0ccb781ef8454613db85a9b4159eb0b60a9b56d757b0c22381d74aa\": container with ID starting with b80bc6aae0ccb781ef8454613db85a9b4159eb0b60a9b56d757b0c22381d74aa not found: ID does not exist"
Apr 17 08:11:33.098962 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.098943 2566 scope.go:117] "RemoveContainer" containerID="1a2f715e5bef5a067dc837b5395dc5e9a0739ea9834d422af6677181fa50dd62"
Apr 17 08:11:33.099205 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:11:33.099186 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a2f715e5bef5a067dc837b5395dc5e9a0739ea9834d422af6677181fa50dd62\": container with ID starting with 1a2f715e5bef5a067dc837b5395dc5e9a0739ea9834d422af6677181fa50dd62 not found: ID does not exist" containerID="1a2f715e5bef5a067dc837b5395dc5e9a0739ea9834d422af6677181fa50dd62"
Apr 17 08:11:33.099247 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.099210 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a2f715e5bef5a067dc837b5395dc5e9a0739ea9834d422af6677181fa50dd62"} err="failed to get container status \"1a2f715e5bef5a067dc837b5395dc5e9a0739ea9834d422af6677181fa50dd62\": rpc error: code = NotFound desc = could not find container \"1a2f715e5bef5a067dc837b5395dc5e9a0739ea9834d422af6677181fa50dd62\": container with ID starting with 1a2f715e5bef5a067dc837b5395dc5e9a0739ea9834d422af6677181fa50dd62 not found: ID does not exist"
Apr 17 08:11:33.334203 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.334119 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-bbd45769b-sbw58"]
Apr 17 08:11:33.334514 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.334502 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6734b3b2-5691-4df0-9783-307d76baa192" containerName="tokenizer"
Apr 17 08:11:33.334558 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.334516 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6734b3b2-5691-4df0-9783-307d76baa192" containerName="tokenizer"
Apr 17 08:11:33.334558 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.334529 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6734b3b2-5691-4df0-9783-307d76baa192" containerName="main"
Apr 17 08:11:33.334558 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.334535 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6734b3b2-5691-4df0-9783-307d76baa192" containerName="main"
Apr 17 08:11:33.334558 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.334541 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6734b3b2-5691-4df0-9783-307d76baa192" containerName="storage-initializer"
Apr 17 08:11:33.334558 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.334547 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6734b3b2-5691-4df0-9783-307d76baa192" containerName="storage-initializer"
Apr 17 08:11:33.334768 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.334606 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="6734b3b2-5691-4df0-9783-307d76baa192" containerName="main"
Apr 17 08:11:33.334768 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.334613 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="6734b3b2-5691-4df0-9783-307d76baa192" containerName="tokenizer"
Apr 17 08:11:33.339206 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.339184 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-bbd45769b-sbw58"
Apr 17 08:11:33.344314 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.344289 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-bbd45769b-sbw58"]
Apr 17 08:11:33.353453 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.353429 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hddjb\" (UniqueName: \"kubernetes.io/projected/e452f4ee-4679-46e5-9c7d-0e11885c9419-kube-api-access-hddjb\") pod \"llmisvc-controller-manager-bbd45769b-sbw58\" (UID: \"e452f4ee-4679-46e5-9c7d-0e11885c9419\") " pod="kserve/llmisvc-controller-manager-bbd45769b-sbw58"
Apr 17 08:11:33.353580 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.353470 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e452f4ee-4679-46e5-9c7d-0e11885c9419-cert\") pod \"llmisvc-controller-manager-bbd45769b-sbw58\" (UID: \"e452f4ee-4679-46e5-9c7d-0e11885c9419\") " pod="kserve/llmisvc-controller-manager-bbd45769b-sbw58"
Apr 17 08:11:33.454784 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.454751 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hddjb\" (UniqueName: \"kubernetes.io/projected/e452f4ee-4679-46e5-9c7d-0e11885c9419-kube-api-access-hddjb\") pod \"llmisvc-controller-manager-bbd45769b-sbw58\" (UID: \"e452f4ee-4679-46e5-9c7d-0e11885c9419\") " pod="kserve/llmisvc-controller-manager-bbd45769b-sbw58"
Apr 17 08:11:33.454938 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.454806 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e452f4ee-4679-46e5-9c7d-0e11885c9419-cert\") pod \"llmisvc-controller-manager-bbd45769b-sbw58\" (UID: \"e452f4ee-4679-46e5-9c7d-0e11885c9419\") " pod="kserve/llmisvc-controller-manager-bbd45769b-sbw58"
Apr 17 08:11:33.457101 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.457070 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e452f4ee-4679-46e5-9c7d-0e11885c9419-cert\") pod \"llmisvc-controller-manager-bbd45769b-sbw58\" (UID: \"e452f4ee-4679-46e5-9c7d-0e11885c9419\") " pod="kserve/llmisvc-controller-manager-bbd45769b-sbw58"
Apr 17 08:11:33.462662 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.462645 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hddjb\" (UniqueName: \"kubernetes.io/projected/e452f4ee-4679-46e5-9c7d-0e11885c9419-kube-api-access-hddjb\") pod \"llmisvc-controller-manager-bbd45769b-sbw58\" (UID: \"e452f4ee-4679-46e5-9c7d-0e11885c9419\") " pod="kserve/llmisvc-controller-manager-bbd45769b-sbw58"
Apr 17 08:11:33.651445 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.651349 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-bbd45769b-sbw58"
Apr 17 08:11:33.773493 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:33.773345 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-bbd45769b-sbw58"]
Apr 17 08:11:33.775863 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:11:33.775830 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode452f4ee_4679_46e5_9c7d_0e11885c9419.slice/crio-741b3a5d9f3d7d3d3c8019c93e78d8ca58daa66cf7a5c399f1a2a443726849d4 WatchSource:0}: Error finding container 741b3a5d9f3d7d3d3c8019c93e78d8ca58daa66cf7a5c399f1a2a443726849d4: Status 404 returned error can't find the container with id 741b3a5d9f3d7d3d3c8019c93e78d8ca58daa66cf7a5c399f1a2a443726849d4
Apr 17 08:11:34.077866 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:34.077828 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-bbd45769b-sbw58" event={"ID":"e452f4ee-4679-46e5-9c7d-0e11885c9419","Type":"ContainerStarted","Data":"741b3a5d9f3d7d3d3c8019c93e78d8ca58daa66cf7a5c399f1a2a443726849d4"}
Apr 17 08:11:34.807452 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:34.807420 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6734b3b2-5691-4df0-9783-307d76baa192" path="/var/lib/kubelet/pods/6734b3b2-5691-4df0-9783-307d76baa192/volumes"
Apr 17 08:11:35.083334 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:35.083236 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-bbd45769b-sbw58" event={"ID":"e452f4ee-4679-46e5-9c7d-0e11885c9419","Type":"ContainerStarted","Data":"b4635a458fb89428c3e698cbaf2b3c0686f1e9ddebcc7ea8b99ec3ea73690cca"}
Apr 17 08:11:35.083334 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:35.083286 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-bbd45769b-sbw58"
Apr 17 08:11:35.103824 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:35.103768 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-bbd45769b-sbw58" podStartSLOduration=1.577324721 podStartE2EDuration="2.103752942s" podCreationTimestamp="2026-04-17 08:11:33 +0000 UTC" firstStartedPulling="2026-04-17 08:11:33.77719041 +0000 UTC m=+1197.564968994" lastFinishedPulling="2026-04-17 08:11:34.303618631 +0000 UTC m=+1198.091397215" observedRunningTime="2026-04-17 08:11:35.102149913 +0000 UTC m=+1198.889928517" watchObservedRunningTime="2026-04-17 08:11:35.103752942 +0000 UTC m=+1198.891531545"
Apr 17 08:11:36.797501 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:36.797471 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2blbh_69ef651a-e48e-4389-a551-48cc6423bcd0/console-operator/1.log"
Apr 17 08:11:36.801390 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:36.801367 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2blbh_69ef651a-e48e-4389-a551-48cc6423bcd0/console-operator/1.log"
Apr 17 08:11:36.802523 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:36.802500 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/ovn-acl-logging/0.log"
Apr 17 08:11:36.807442 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:11:36.807425 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/ovn-acl-logging/0.log"
Apr 17 08:12:06.089010 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:06.088980 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-bbd45769b-sbw58"
Apr 17 08:12:06.137662 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:06.137628 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-847974b58-t4k6h"]
Apr 17 08:12:06.138018 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:06.137984 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-847974b58-t4k6h" podUID="0f96c13c-f21f-4805-8661-ab8ee2be0169" containerName="manager" containerID="cri-o://700e355edb7d09abf78e7853a6309cd88d97ae6bc2dac125a1843e5e0c942de4" gracePeriod=30
Apr 17 08:12:09.586397 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:09.586374 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-847974b58-t4k6h"
Apr 17 08:12:09.598710 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:09.598673 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f96c13c-f21f-4805-8661-ab8ee2be0169-cert\") pod \"0f96c13c-f21f-4805-8661-ab8ee2be0169\" (UID: \"0f96c13c-f21f-4805-8661-ab8ee2be0169\") "
Apr 17 08:12:09.598859 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:09.598725 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prq6j\" (UniqueName: \"kubernetes.io/projected/0f96c13c-f21f-4805-8661-ab8ee2be0169-kube-api-access-prq6j\") pod \"0f96c13c-f21f-4805-8661-ab8ee2be0169\" (UID: \"0f96c13c-f21f-4805-8661-ab8ee2be0169\") "
Apr 17 08:12:09.600802 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:09.600765 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f96c13c-f21f-4805-8661-ab8ee2be0169-cert" (OuterVolumeSpecName: "cert") pod "0f96c13c-f21f-4805-8661-ab8ee2be0169" (UID: "0f96c13c-f21f-4805-8661-ab8ee2be0169"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 08:12:09.600983 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:09.600948 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f96c13c-f21f-4805-8661-ab8ee2be0169-kube-api-access-prq6j" (OuterVolumeSpecName: "kube-api-access-prq6j") pod "0f96c13c-f21f-4805-8661-ab8ee2be0169" (UID: "0f96c13c-f21f-4805-8661-ab8ee2be0169"). InnerVolumeSpecName "kube-api-access-prq6j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 08:12:09.699797 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:09.699692 2566 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f96c13c-f21f-4805-8661-ab8ee2be0169-cert\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 08:12:09.699797 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:09.699744 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-prq6j\" (UniqueName: \"kubernetes.io/projected/0f96c13c-f21f-4805-8661-ab8ee2be0169-kube-api-access-prq6j\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 08:12:10.211823 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:10.211789 2566 generic.go:358] "Generic (PLEG): container finished" podID="0f96c13c-f21f-4805-8661-ab8ee2be0169" containerID="700e355edb7d09abf78e7853a6309cd88d97ae6bc2dac125a1843e5e0c942de4" exitCode=0
Apr 17 08:12:10.212033 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:10.211877 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-847974b58-t4k6h"
Apr 17 08:12:10.212033 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:10.211875 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-847974b58-t4k6h" event={"ID":"0f96c13c-f21f-4805-8661-ab8ee2be0169","Type":"ContainerDied","Data":"700e355edb7d09abf78e7853a6309cd88d97ae6bc2dac125a1843e5e0c942de4"}
Apr 17 08:12:10.212033 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:10.211989 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-847974b58-t4k6h" event={"ID":"0f96c13c-f21f-4805-8661-ab8ee2be0169","Type":"ContainerDied","Data":"b11036dabc6dc16743fbf0866da25d8df92b788dfddaa7b62ebb821289a8b933"}
Apr 17 08:12:10.212033 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:10.212010 2566 scope.go:117] "RemoveContainer" containerID="700e355edb7d09abf78e7853a6309cd88d97ae6bc2dac125a1843e5e0c942de4"
Apr 17 08:12:10.221059 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:10.221044 2566 scope.go:117] "RemoveContainer" containerID="700e355edb7d09abf78e7853a6309cd88d97ae6bc2dac125a1843e5e0c942de4"
Apr 17 08:12:10.221315 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:12:10.221297 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"700e355edb7d09abf78e7853a6309cd88d97ae6bc2dac125a1843e5e0c942de4\": container with ID starting with 700e355edb7d09abf78e7853a6309cd88d97ae6bc2dac125a1843e5e0c942de4 not found: ID does not exist" containerID="700e355edb7d09abf78e7853a6309cd88d97ae6bc2dac125a1843e5e0c942de4"
Apr 17 08:12:10.221376 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:10.221328 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"700e355edb7d09abf78e7853a6309cd88d97ae6bc2dac125a1843e5e0c942de4"} err="failed to get container status \"700e355edb7d09abf78e7853a6309cd88d97ae6bc2dac125a1843e5e0c942de4\": rpc error: code = NotFound desc = could not find container \"700e355edb7d09abf78e7853a6309cd88d97ae6bc2dac125a1843e5e0c942de4\": container with ID starting with 700e355edb7d09abf78e7853a6309cd88d97ae6bc2dac125a1843e5e0c942de4 not found: ID does not exist"
Apr 17 08:12:10.233152 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:10.233128 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-847974b58-t4k6h"]
Apr 17 08:12:10.237153 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:10.237128 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-847974b58-t4k6h"]
Apr 17 08:12:10.813195 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:10.813160 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f96c13c-f21f-4805-8661-ab8ee2be0169" path="/var/lib/kubelet/pods/0f96c13c-f21f-4805-8661-ab8ee2be0169/volumes"
Apr 17 08:12:43.519964 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.519929 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb"]
Apr 17 08:12:43.520627 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.520598 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f96c13c-f21f-4805-8661-ab8ee2be0169" containerName="manager"
Apr 17 08:12:43.520627 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.520628 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f96c13c-f21f-4805-8661-ab8ee2be0169" containerName="manager"
Apr 17 08:12:43.520866 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.520762 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f96c13c-f21f-4805-8661-ab8ee2be0169" containerName="manager"
Apr 17 08:12:43.524289 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.524271 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb"
Apr 17 08:12:43.526920 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.526896 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-wjppg\""
Apr 17 08:12:43.527857 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.527834 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-jj9s6\""
Apr 17 08:12:43.527975 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.527842 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\""
Apr 17 08:12:43.535493 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.535468 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb"]
Apr 17 08:12:43.612322 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.612281 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb"
Apr 17 08:12:43.612322 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.612330 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb"
Apr 17 08:12:43.612584 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.612352 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb"
Apr 17 08:12:43.612584 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.612435 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb"
Apr 17 08:12:43.612584 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.612479 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb"
Apr 17 08:12:43.612584 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.612552 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q52gv\" (UniqueName: \"kubernetes.io/projected/cc5a7693-bcbb-4fce-b764-a32fd851a77c-kube-api-access-q52gv\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb"
Apr 17 08:12:43.713337 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.713296 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb"
Apr 17 08:12:43.713525 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.713341 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb"
Apr 17 08:12:43.713525 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.713478 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb"
Apr 17 08:12:43.713644 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.713528 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb"
Apr 17 08:12:43.713644 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.713610 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q52gv\" (UniqueName: \"kubernetes.io/projected/cc5a7693-bcbb-4fce-b764-a32fd851a77c-kube-api-access-q52gv\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb"
Apr 17 08:12:43.713802 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.713689 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb"
Apr 17 08:12:43.713862 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.713828 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb"
Apr 17 08:12:43.713918 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.713873 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-kserve-provision-location\") pod
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" Apr 17 08:12:43.713958 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.713927 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" Apr 17 08:12:43.713997 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.713968 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" Apr 17 08:12:43.716165 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.716147 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" Apr 17 08:12:43.721333 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.721312 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q52gv\" (UniqueName: \"kubernetes.io/projected/cc5a7693-bcbb-4fce-b764-a32fd851a77c-kube-api-access-q52gv\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb\" (UID: 
\"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" Apr 17 08:12:43.834984 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.834944 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" Apr 17 08:12:43.973796 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:43.973765 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb"] Apr 17 08:12:43.975734 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:12:43.975682 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc5a7693_bcbb_4fce_b764_a32fd851a77c.slice/crio-5f0ce2adcef217db03409e3f50508fe67bfbd84447303127a363152eef73e7a4 WatchSource:0}: Error finding container 5f0ce2adcef217db03409e3f50508fe67bfbd84447303127a363152eef73e7a4: Status 404 returned error can't find the container with id 5f0ce2adcef217db03409e3f50508fe67bfbd84447303127a363152eef73e7a4 Apr 17 08:12:44.341366 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:44.341318 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" event={"ID":"cc5a7693-bcbb-4fce-b764-a32fd851a77c","Type":"ContainerStarted","Data":"885ce52ec00215e88ea9ac759ba28d751fbbd37a5f3fced0324be1ffa06fd467"} Apr 17 08:12:44.341366 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:44.341373 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" event={"ID":"cc5a7693-bcbb-4fce-b764-a32fd851a77c","Type":"ContainerStarted","Data":"5f0ce2adcef217db03409e3f50508fe67bfbd84447303127a363152eef73e7a4"} Apr 17 08:12:45.347326 ip-10-0-128-217 kubenswrapper[2566]: I0417 
08:12:45.347272 2566 generic.go:358] "Generic (PLEG): container finished" podID="cc5a7693-bcbb-4fce-b764-a32fd851a77c" containerID="885ce52ec00215e88ea9ac759ba28d751fbbd37a5f3fced0324be1ffa06fd467" exitCode=0 Apr 17 08:12:45.347840 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:45.347352 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" event={"ID":"cc5a7693-bcbb-4fce-b764-a32fd851a77c","Type":"ContainerDied","Data":"885ce52ec00215e88ea9ac759ba28d751fbbd37a5f3fced0324be1ffa06fd467"} Apr 17 08:12:46.354138 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:46.354100 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" event={"ID":"cc5a7693-bcbb-4fce-b764-a32fd851a77c","Type":"ContainerStarted","Data":"addd49edf533795ac1d5fb5677e110c85125f8f86aea824b09e507c3f0daa2a3"} Apr 17 08:12:46.354138 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:46.354137 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" event={"ID":"cc5a7693-bcbb-4fce-b764-a32fd851a77c","Type":"ContainerStarted","Data":"e117448b15082410f13dfc618ff4a252b6f20e6cbca50f04e78286eac5aba982"} Apr 17 08:12:46.354667 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:46.354231 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" Apr 17 08:12:46.376616 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:46.376561 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" podStartSLOduration=3.376547892 podStartE2EDuration="3.376547892s" podCreationTimestamp="2026-04-17 08:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:12:46.375308714 +0000 UTC m=+1270.163087318" watchObservedRunningTime="2026-04-17 08:12:46.376547892 +0000 UTC m=+1270.164326513" Apr 17 08:12:53.835479 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:53.835440 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" Apr 17 08:12:53.835479 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:53.835488 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" Apr 17 08:12:53.838499 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:53.838470 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" Apr 17 08:12:54.385306 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:12:54.385277 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" Apr 17 08:13:15.389174 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:13:15.389134 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" Apr 17 08:16:09.404800 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:09.404753 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb"] Apr 17 08:16:09.405396 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:09.405268 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" podUID="cc5a7693-bcbb-4fce-b764-a32fd851a77c" containerName="main" 
containerID="cri-o://e117448b15082410f13dfc618ff4a252b6f20e6cbca50f04e78286eac5aba982" gracePeriod=30 Apr 17 08:16:09.405756 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:09.405725 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" podUID="cc5a7693-bcbb-4fce-b764-a32fd851a77c" containerName="tokenizer" containerID="cri-o://addd49edf533795ac1d5fb5677e110c85125f8f86aea824b09e507c3f0daa2a3" gracePeriod=30 Apr 17 08:16:10.132524 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.132490 2566 generic.go:358] "Generic (PLEG): container finished" podID="cc5a7693-bcbb-4fce-b764-a32fd851a77c" containerID="e117448b15082410f13dfc618ff4a252b6f20e6cbca50f04e78286eac5aba982" exitCode=0 Apr 17 08:16:10.132759 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.132565 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" event={"ID":"cc5a7693-bcbb-4fce-b764-a32fd851a77c","Type":"ContainerDied","Data":"e117448b15082410f13dfc618ff4a252b6f20e6cbca50f04e78286eac5aba982"} Apr 17 08:16:10.658376 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.658351 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" Apr 17 08:16:10.714885 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.714855 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tokenizer-tmp\") pod \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " Apr 17 08:16:10.715085 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.714912 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-kserve-provision-location\") pod \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " Apr 17 08:16:10.715085 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.714952 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tokenizer-uds\") pod \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " Apr 17 08:16:10.715085 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.714979 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tokenizer-cache\") pod \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " Apr 17 08:16:10.715085 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.715014 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tls-certs\") pod \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " Apr 17 
08:16:10.715324 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.715112 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q52gv\" (UniqueName: \"kubernetes.io/projected/cc5a7693-bcbb-4fce-b764-a32fd851a77c-kube-api-access-q52gv\") pod \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\" (UID: \"cc5a7693-bcbb-4fce-b764-a32fd851a77c\") " Apr 17 08:16:10.715324 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.715259 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "cc5a7693-bcbb-4fce-b764-a32fd851a77c" (UID: "cc5a7693-bcbb-4fce-b764-a32fd851a77c"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:16:10.715324 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.715266 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "cc5a7693-bcbb-4fce-b764-a32fd851a77c" (UID: "cc5a7693-bcbb-4fce-b764-a32fd851a77c"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:16:10.715324 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.715282 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "cc5a7693-bcbb-4fce-b764-a32fd851a77c" (UID: "cc5a7693-bcbb-4fce-b764-a32fd851a77c"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:16:10.715579 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.715474 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tokenizer-tmp\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:16:10.715579 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.715491 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tokenizer-uds\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:16:10.715579 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.715500 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tokenizer-cache\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:16:10.715679 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.715649 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cc5a7693-bcbb-4fce-b764-a32fd851a77c" (UID: "cc5a7693-bcbb-4fce-b764-a32fd851a77c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:16:10.717180 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.717159 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "cc5a7693-bcbb-4fce-b764-a32fd851a77c" (UID: "cc5a7693-bcbb-4fce-b764-a32fd851a77c"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:16:10.717270 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.717251 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc5a7693-bcbb-4fce-b764-a32fd851a77c-kube-api-access-q52gv" (OuterVolumeSpecName: "kube-api-access-q52gv") pod "cc5a7693-bcbb-4fce-b764-a32fd851a77c" (UID: "cc5a7693-bcbb-4fce-b764-a32fd851a77c"). InnerVolumeSpecName "kube-api-access-q52gv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:16:10.816021 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.815996 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q52gv\" (UniqueName: \"kubernetes.io/projected/cc5a7693-bcbb-4fce-b764-a32fd851a77c-kube-api-access-q52gv\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:16:10.816134 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.816024 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc5a7693-bcbb-4fce-b764-a32fd851a77c-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:16:10.816134 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:10.816037 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc5a7693-bcbb-4fce-b764-a32fd851a77c-tls-certs\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:16:11.139158 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:11.139064 2566 generic.go:358] "Generic (PLEG): container finished" podID="cc5a7693-bcbb-4fce-b764-a32fd851a77c" containerID="addd49edf533795ac1d5fb5677e110c85125f8f86aea824b09e507c3f0daa2a3" exitCode=0 Apr 17 08:16:11.139158 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:11.139150 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" Apr 17 08:16:11.139390 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:11.139149 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" event={"ID":"cc5a7693-bcbb-4fce-b764-a32fd851a77c","Type":"ContainerDied","Data":"addd49edf533795ac1d5fb5677e110c85125f8f86aea824b09e507c3f0daa2a3"} Apr 17 08:16:11.139390 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:11.139193 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb" event={"ID":"cc5a7693-bcbb-4fce-b764-a32fd851a77c","Type":"ContainerDied","Data":"5f0ce2adcef217db03409e3f50508fe67bfbd84447303127a363152eef73e7a4"} Apr 17 08:16:11.139390 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:11.139217 2566 scope.go:117] "RemoveContainer" containerID="addd49edf533795ac1d5fb5677e110c85125f8f86aea824b09e507c3f0daa2a3" Apr 17 08:16:11.147444 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:11.147426 2566 scope.go:117] "RemoveContainer" containerID="e117448b15082410f13dfc618ff4a252b6f20e6cbca50f04e78286eac5aba982" Apr 17 08:16:11.154770 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:11.154752 2566 scope.go:117] "RemoveContainer" containerID="885ce52ec00215e88ea9ac759ba28d751fbbd37a5f3fced0324be1ffa06fd467" Apr 17 08:16:11.159205 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:11.159167 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb"] Apr 17 08:16:11.163744 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:11.163724 2566 scope.go:117] "RemoveContainer" containerID="addd49edf533795ac1d5fb5677e110c85125f8f86aea824b09e507c3f0daa2a3" Apr 17 08:16:11.164029 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:16:11.164001 2566 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"addd49edf533795ac1d5fb5677e110c85125f8f86aea824b09e507c3f0daa2a3\": container with ID starting with addd49edf533795ac1d5fb5677e110c85125f8f86aea824b09e507c3f0daa2a3 not found: ID does not exist" containerID="addd49edf533795ac1d5fb5677e110c85125f8f86aea824b09e507c3f0daa2a3" Apr 17 08:16:11.164120 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:11.164039 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"addd49edf533795ac1d5fb5677e110c85125f8f86aea824b09e507c3f0daa2a3"} err="failed to get container status \"addd49edf533795ac1d5fb5677e110c85125f8f86aea824b09e507c3f0daa2a3\": rpc error: code = NotFound desc = could not find container \"addd49edf533795ac1d5fb5677e110c85125f8f86aea824b09e507c3f0daa2a3\": container with ID starting with addd49edf533795ac1d5fb5677e110c85125f8f86aea824b09e507c3f0daa2a3 not found: ID does not exist" Apr 17 08:16:11.164120 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:11.164066 2566 scope.go:117] "RemoveContainer" containerID="e117448b15082410f13dfc618ff4a252b6f20e6cbca50f04e78286eac5aba982" Apr 17 08:16:11.164223 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:11.164209 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewxjwb"] Apr 17 08:16:11.164334 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:16:11.164315 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e117448b15082410f13dfc618ff4a252b6f20e6cbca50f04e78286eac5aba982\": container with ID starting with e117448b15082410f13dfc618ff4a252b6f20e6cbca50f04e78286eac5aba982 not found: ID does not exist" containerID="e117448b15082410f13dfc618ff4a252b6f20e6cbca50f04e78286eac5aba982" Apr 17 08:16:11.164399 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:11.164353 2566 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e117448b15082410f13dfc618ff4a252b6f20e6cbca50f04e78286eac5aba982"} err="failed to get container status \"e117448b15082410f13dfc618ff4a252b6f20e6cbca50f04e78286eac5aba982\": rpc error: code = NotFound desc = could not find container \"e117448b15082410f13dfc618ff4a252b6f20e6cbca50f04e78286eac5aba982\": container with ID starting with e117448b15082410f13dfc618ff4a252b6f20e6cbca50f04e78286eac5aba982 not found: ID does not exist" Apr 17 08:16:11.164399 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:11.164378 2566 scope.go:117] "RemoveContainer" containerID="885ce52ec00215e88ea9ac759ba28d751fbbd37a5f3fced0324be1ffa06fd467" Apr 17 08:16:11.164602 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:16:11.164586 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"885ce52ec00215e88ea9ac759ba28d751fbbd37a5f3fced0324be1ffa06fd467\": container with ID starting with 885ce52ec00215e88ea9ac759ba28d751fbbd37a5f3fced0324be1ffa06fd467 not found: ID does not exist" containerID="885ce52ec00215e88ea9ac759ba28d751fbbd37a5f3fced0324be1ffa06fd467" Apr 17 08:16:11.164643 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:11.164609 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"885ce52ec00215e88ea9ac759ba28d751fbbd37a5f3fced0324be1ffa06fd467"} err="failed to get container status \"885ce52ec00215e88ea9ac759ba28d751fbbd37a5f3fced0324be1ffa06fd467\": rpc error: code = NotFound desc = could not find container \"885ce52ec00215e88ea9ac759ba28d751fbbd37a5f3fced0324be1ffa06fd467\": container with ID starting with 885ce52ec00215e88ea9ac759ba28d751fbbd37a5f3fced0324be1ffa06fd467 not found: ID does not exist" Apr 17 08:16:12.812903 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:12.812865 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="cc5a7693-bcbb-4fce-b764-a32fd851a77c" path="/var/lib/kubelet/pods/cc5a7693-bcbb-4fce-b764-a32fd851a77c/volumes" Apr 17 08:16:35.644916 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.644879 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm"] Apr 17 08:16:35.645586 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.645563 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc5a7693-bcbb-4fce-b764-a32fd851a77c" containerName="tokenizer" Apr 17 08:16:35.645708 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.645588 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc5a7693-bcbb-4fce-b764-a32fd851a77c" containerName="tokenizer" Apr 17 08:16:35.645708 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.645608 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc5a7693-bcbb-4fce-b764-a32fd851a77c" containerName="storage-initializer" Apr 17 08:16:35.645708 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.645618 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc5a7693-bcbb-4fce-b764-a32fd851a77c" containerName="storage-initializer" Apr 17 08:16:35.645708 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.645639 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc5a7693-bcbb-4fce-b764-a32fd851a77c" containerName="main" Apr 17 08:16:35.645708 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.645647 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc5a7693-bcbb-4fce-b764-a32fd851a77c" containerName="main" Apr 17 08:16:35.645979 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.645798 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc5a7693-bcbb-4fce-b764-a32fd851a77c" containerName="main" Apr 17 08:16:35.645979 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.645816 2566 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="cc5a7693-bcbb-4fce-b764-a32fd851a77c" containerName="tokenizer" Apr 17 08:16:35.649295 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.649272 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:35.653215 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.653194 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 17 08:16:35.653344 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.653241 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-jj9s6\"" Apr 17 08:16:35.653344 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.653293 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-q7mjl\"" Apr 17 08:16:35.659679 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.659657 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm"] Apr 17 08:16:35.756357 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.756323 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7nlq\" (UniqueName: \"kubernetes.io/projected/6fe8610a-b55b-42e7-96cf-45b55731bd30-kube-api-access-k7nlq\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:35.756538 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.756364 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" 
(UniqueName: \"kubernetes.io/secret/6fe8610a-b55b-42e7-96cf-45b55731bd30-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:35.756538 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.756477 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:35.756538 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.756503 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:35.756538 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.756520 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:35.756723 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.756623 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:35.857376 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.857337 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:35.857578 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.857384 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:35.857578 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.857403 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:35.857578 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.857461 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:35.857578 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.857482 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7nlq\" (UniqueName: \"kubernetes.io/projected/6fe8610a-b55b-42e7-96cf-45b55731bd30-kube-api-access-k7nlq\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:35.857578 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.857500 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe8610a-b55b-42e7-96cf-45b55731bd30-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:35.857912 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.857877 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:35.858031 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.857917 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:35.858031 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.857949 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:35.858031 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.857978 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:35.860065 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.860047 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe8610a-b55b-42e7-96cf-45b55731bd30-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:35.865854 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.865832 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7nlq\" (UniqueName: 
\"kubernetes.io/projected/6fe8610a-b55b-42e7-96cf-45b55731bd30-kube-api-access-k7nlq\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:35.961943 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:35.961851 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:36.096152 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:36.096126 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm"] Apr 17 08:16:36.098431 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:16:36.098402 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fe8610a_b55b_42e7_96cf_45b55731bd30.slice/crio-76d623698153c0c5e67fa529d29e0a663b5822247e3a56de00d035c732fe61ac WatchSource:0}: Error finding container 76d623698153c0c5e67fa529d29e0a663b5822247e3a56de00d035c732fe61ac: Status 404 returned error can't find the container with id 76d623698153c0c5e67fa529d29e0a663b5822247e3a56de00d035c732fe61ac Apr 17 08:16:36.100819 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:36.100798 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:16:36.231314 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:36.231222 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" event={"ID":"6fe8610a-b55b-42e7-96cf-45b55731bd30","Type":"ContainerStarted","Data":"067e01931c3447995a453395595c334cce02e1f9f14bd0fbb059533202e6b491"} Apr 17 08:16:36.231314 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:36.231264 2566 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" event={"ID":"6fe8610a-b55b-42e7-96cf-45b55731bd30","Type":"ContainerStarted","Data":"76d623698153c0c5e67fa529d29e0a663b5822247e3a56de00d035c732fe61ac"} Apr 17 08:16:36.826025 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:36.825994 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2blbh_69ef651a-e48e-4389-a551-48cc6423bcd0/console-operator/1.log" Apr 17 08:16:36.834928 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:36.834903 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2blbh_69ef651a-e48e-4389-a551-48cc6423bcd0/console-operator/1.log" Apr 17 08:16:36.836218 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:36.836194 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/ovn-acl-logging/0.log" Apr 17 08:16:36.842759 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:36.842741 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/ovn-acl-logging/0.log" Apr 17 08:16:37.236306 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:37.236208 2566 generic.go:358] "Generic (PLEG): container finished" podID="6fe8610a-b55b-42e7-96cf-45b55731bd30" containerID="067e01931c3447995a453395595c334cce02e1f9f14bd0fbb059533202e6b491" exitCode=0 Apr 17 08:16:37.236460 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:37.236292 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" event={"ID":"6fe8610a-b55b-42e7-96cf-45b55731bd30","Type":"ContainerDied","Data":"067e01931c3447995a453395595c334cce02e1f9f14bd0fbb059533202e6b491"} Apr 17 08:16:38.242246 ip-10-0-128-217 
kubenswrapper[2566]: I0417 08:16:38.242210 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" event={"ID":"6fe8610a-b55b-42e7-96cf-45b55731bd30","Type":"ContainerStarted","Data":"21f056eb3f7c266b15806bc82df4c3309f1766343d25d6b6218cd15bd1bb7d2e"} Apr 17 08:16:38.242246 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:38.242251 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" event={"ID":"6fe8610a-b55b-42e7-96cf-45b55731bd30","Type":"ContainerStarted","Data":"fc58cf5b515cbc9fe6f61ceddedbd12b263cd77f38c22d009accc74a3097020c"} Apr 17 08:16:38.242673 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:38.242387 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:38.263502 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:38.263450 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" podStartSLOduration=3.263430531 podStartE2EDuration="3.263430531s" podCreationTimestamp="2026-04-17 08:16:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:16:38.261344682 +0000 UTC m=+1502.049123285" watchObservedRunningTime="2026-04-17 08:16:38.263430531 +0000 UTC m=+1502.051209135" Apr 17 08:16:45.962462 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:45.962358 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:45.962462 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:45.962405 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:45.965322 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:45.965296 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:16:46.274333 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:16:46.274249 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:17:07.278396 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:17:07.278366 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:19:27.200385 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:27.200344 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm"] Apr 17 08:19:27.200984 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:27.200667 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" podUID="6fe8610a-b55b-42e7-96cf-45b55731bd30" containerName="main" containerID="cri-o://fc58cf5b515cbc9fe6f61ceddedbd12b263cd77f38c22d009accc74a3097020c" gracePeriod=30 Apr 17 08:19:27.200984 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:27.200743 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" podUID="6fe8610a-b55b-42e7-96cf-45b55731bd30" containerName="tokenizer" containerID="cri-o://21f056eb3f7c266b15806bc82df4c3309f1766343d25d6b6218cd15bd1bb7d2e" gracePeriod=30 Apr 17 08:19:27.278393 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:19:27.278363 2566 
logging.go:55] [core] [Channel #389 SubChannel #390]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.48:9003", ServerName: "10.132.0.48:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.48:9003: connect: connection refused" Apr 17 08:19:27.868727 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:27.868668 2566 generic.go:358] "Generic (PLEG): container finished" podID="6fe8610a-b55b-42e7-96cf-45b55731bd30" containerID="fc58cf5b515cbc9fe6f61ceddedbd12b263cd77f38c22d009accc74a3097020c" exitCode=0 Apr 17 08:19:27.868913 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:27.868746 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" event={"ID":"6fe8610a-b55b-42e7-96cf-45b55731bd30","Type":"ContainerDied","Data":"fc58cf5b515cbc9fe6f61ceddedbd12b263cd77f38c22d009accc74a3097020c"} Apr 17 08:19:28.278891 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.278834 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" podUID="6fe8610a-b55b-42e7-96cf-45b55731bd30" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.48:9003\" within 1s: context deadline exceeded" Apr 17 08:19:28.767399 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.767373 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:19:28.819390 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.819356 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe8610a-b55b-42e7-96cf-45b55731bd30-tls-certs\") pod \"6fe8610a-b55b-42e7-96cf-45b55731bd30\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " Apr 17 08:19:28.819817 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.819398 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-tokenizer-cache\") pod \"6fe8610a-b55b-42e7-96cf-45b55731bd30\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " Apr 17 08:19:28.819817 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.819428 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-tokenizer-uds\") pod \"6fe8610a-b55b-42e7-96cf-45b55731bd30\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " Apr 17 08:19:28.819817 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.819455 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-tokenizer-tmp\") pod \"6fe8610a-b55b-42e7-96cf-45b55731bd30\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " Apr 17 08:19:28.819817 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.819489 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7nlq\" (UniqueName: \"kubernetes.io/projected/6fe8610a-b55b-42e7-96cf-45b55731bd30-kube-api-access-k7nlq\") pod \"6fe8610a-b55b-42e7-96cf-45b55731bd30\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " Apr 17 
08:19:28.819817 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.819584 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-kserve-provision-location\") pod \"6fe8610a-b55b-42e7-96cf-45b55731bd30\" (UID: \"6fe8610a-b55b-42e7-96cf-45b55731bd30\") " Apr 17 08:19:28.819817 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.819746 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "6fe8610a-b55b-42e7-96cf-45b55731bd30" (UID: "6fe8610a-b55b-42e7-96cf-45b55731bd30"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:19:28.819817 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.819838 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-tokenizer-cache\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:19:28.819817 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.819919 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "6fe8610a-b55b-42e7-96cf-45b55731bd30" (UID: "6fe8610a-b55b-42e7-96cf-45b55731bd30"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:19:28.819817 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.820095 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "6fe8610a-b55b-42e7-96cf-45b55731bd30" (UID: "6fe8610a-b55b-42e7-96cf-45b55731bd30"). 
InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:19:28.819817 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.820580 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6fe8610a-b55b-42e7-96cf-45b55731bd30" (UID: "6fe8610a-b55b-42e7-96cf-45b55731bd30"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:19:28.822162 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.822135 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe8610a-b55b-42e7-96cf-45b55731bd30-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6fe8610a-b55b-42e7-96cf-45b55731bd30" (UID: "6fe8610a-b55b-42e7-96cf-45b55731bd30"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:19:28.822282 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.822259 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe8610a-b55b-42e7-96cf-45b55731bd30-kube-api-access-k7nlq" (OuterVolumeSpecName: "kube-api-access-k7nlq") pod "6fe8610a-b55b-42e7-96cf-45b55731bd30" (UID: "6fe8610a-b55b-42e7-96cf-45b55731bd30"). InnerVolumeSpecName "kube-api-access-k7nlq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:19:28.874283 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.874195 2566 generic.go:358] "Generic (PLEG): container finished" podID="6fe8610a-b55b-42e7-96cf-45b55731bd30" containerID="21f056eb3f7c266b15806bc82df4c3309f1766343d25d6b6218cd15bd1bb7d2e" exitCode=0 Apr 17 08:19:28.874283 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.874272 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" Apr 17 08:19:28.874480 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.874290 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" event={"ID":"6fe8610a-b55b-42e7-96cf-45b55731bd30","Type":"ContainerDied","Data":"21f056eb3f7c266b15806bc82df4c3309f1766343d25d6b6218cd15bd1bb7d2e"} Apr 17 08:19:28.874480 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.874339 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm" event={"ID":"6fe8610a-b55b-42e7-96cf-45b55731bd30","Type":"ContainerDied","Data":"76d623698153c0c5e67fa529d29e0a663b5822247e3a56de00d035c732fe61ac"} Apr 17 08:19:28.874480 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.874363 2566 scope.go:117] "RemoveContainer" containerID="21f056eb3f7c266b15806bc82df4c3309f1766343d25d6b6218cd15bd1bb7d2e" Apr 17 08:19:28.889621 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.889587 2566 scope.go:117] "RemoveContainer" containerID="fc58cf5b515cbc9fe6f61ceddedbd12b263cd77f38c22d009accc74a3097020c" Apr 17 08:19:28.898602 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.898149 2566 scope.go:117] "RemoveContainer" containerID="067e01931c3447995a453395595c334cce02e1f9f14bd0fbb059533202e6b491" Apr 17 08:19:28.900968 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.900944 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm"] Apr 17 08:19:28.905446 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.905417 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-7f84cwxzkm"] Apr 17 08:19:28.908959 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.908233 2566 
scope.go:117] "RemoveContainer" containerID="21f056eb3f7c266b15806bc82df4c3309f1766343d25d6b6218cd15bd1bb7d2e" Apr 17 08:19:28.908959 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:19:28.908606 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21f056eb3f7c266b15806bc82df4c3309f1766343d25d6b6218cd15bd1bb7d2e\": container with ID starting with 21f056eb3f7c266b15806bc82df4c3309f1766343d25d6b6218cd15bd1bb7d2e not found: ID does not exist" containerID="21f056eb3f7c266b15806bc82df4c3309f1766343d25d6b6218cd15bd1bb7d2e" Apr 17 08:19:28.908959 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.908640 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21f056eb3f7c266b15806bc82df4c3309f1766343d25d6b6218cd15bd1bb7d2e"} err="failed to get container status \"21f056eb3f7c266b15806bc82df4c3309f1766343d25d6b6218cd15bd1bb7d2e\": rpc error: code = NotFound desc = could not find container \"21f056eb3f7c266b15806bc82df4c3309f1766343d25d6b6218cd15bd1bb7d2e\": container with ID starting with 21f056eb3f7c266b15806bc82df4c3309f1766343d25d6b6218cd15bd1bb7d2e not found: ID does not exist" Apr 17 08:19:28.908959 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.908679 2566 scope.go:117] "RemoveContainer" containerID="fc58cf5b515cbc9fe6f61ceddedbd12b263cd77f38c22d009accc74a3097020c" Apr 17 08:19:28.909213 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:19:28.908947 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc58cf5b515cbc9fe6f61ceddedbd12b263cd77f38c22d009accc74a3097020c\": container with ID starting with fc58cf5b515cbc9fe6f61ceddedbd12b263cd77f38c22d009accc74a3097020c not found: ID does not exist" containerID="fc58cf5b515cbc9fe6f61ceddedbd12b263cd77f38c22d009accc74a3097020c" Apr 17 08:19:28.909213 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.908989 2566 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc58cf5b515cbc9fe6f61ceddedbd12b263cd77f38c22d009accc74a3097020c"} err="failed to get container status \"fc58cf5b515cbc9fe6f61ceddedbd12b263cd77f38c22d009accc74a3097020c\": rpc error: code = NotFound desc = could not find container \"fc58cf5b515cbc9fe6f61ceddedbd12b263cd77f38c22d009accc74a3097020c\": container with ID starting with fc58cf5b515cbc9fe6f61ceddedbd12b263cd77f38c22d009accc74a3097020c not found: ID does not exist" Apr 17 08:19:28.909213 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.909009 2566 scope.go:117] "RemoveContainer" containerID="067e01931c3447995a453395595c334cce02e1f9f14bd0fbb059533202e6b491" Apr 17 08:19:28.909378 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:19:28.909263 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"067e01931c3447995a453395595c334cce02e1f9f14bd0fbb059533202e6b491\": container with ID starting with 067e01931c3447995a453395595c334cce02e1f9f14bd0fbb059533202e6b491 not found: ID does not exist" containerID="067e01931c3447995a453395595c334cce02e1f9f14bd0fbb059533202e6b491" Apr 17 08:19:28.909378 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.909329 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"067e01931c3447995a453395595c334cce02e1f9f14bd0fbb059533202e6b491"} err="failed to get container status \"067e01931c3447995a453395595c334cce02e1f9f14bd0fbb059533202e6b491\": rpc error: code = NotFound desc = could not find container \"067e01931c3447995a453395595c334cce02e1f9f14bd0fbb059533202e6b491\": container with ID starting with 067e01931c3447995a453395595c334cce02e1f9f14bd0fbb059533202e6b491 not found: ID does not exist" Apr 17 08:19:28.921107 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.921073 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k7nlq\" (UniqueName: 
\"kubernetes.io/projected/6fe8610a-b55b-42e7-96cf-45b55731bd30-kube-api-access-k7nlq\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:19:28.921107 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.921107 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:19:28.921294 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.921121 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe8610a-b55b-42e7-96cf-45b55731bd30-tls-certs\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:19:28.921294 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.921132 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-tokenizer-uds\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:19:28.921294 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:28.921141 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6fe8610a-b55b-42e7-96cf-45b55731bd30-tokenizer-tmp\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:19:30.808459 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:30.808423 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fe8610a-b55b-42e7-96cf-45b55731bd30" path="/var/lib/kubelet/pods/6fe8610a-b55b-42e7-96cf-45b55731bd30/volumes" Apr 17 08:19:37.786035 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:37.786001 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj"] Apr 17 08:19:37.786424 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:37.786404 2566 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="6fe8610a-b55b-42e7-96cf-45b55731bd30" containerName="tokenizer" Apr 17 08:19:37.786424 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:37.786422 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe8610a-b55b-42e7-96cf-45b55731bd30" containerName="tokenizer" Apr 17 08:19:37.786424 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:37.786431 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fe8610a-b55b-42e7-96cf-45b55731bd30" containerName="storage-initializer" Apr 17 08:19:37.786618 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:37.786436 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe8610a-b55b-42e7-96cf-45b55731bd30" containerName="storage-initializer" Apr 17 08:19:37.786618 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:37.786449 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fe8610a-b55b-42e7-96cf-45b55731bd30" containerName="main" Apr 17 08:19:37.786618 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:37.786456 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe8610a-b55b-42e7-96cf-45b55731bd30" containerName="main" Apr 17 08:19:37.786618 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:37.786546 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="6fe8610a-b55b-42e7-96cf-45b55731bd30" containerName="main" Apr 17 08:19:37.786618 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:37.786562 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="6fe8610a-b55b-42e7-96cf-45b55731bd30" containerName="tokenizer" Apr 17 08:19:37.791805 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:37.791783 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:37.795619 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:37.795590 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-hbpgs\"" Apr 17 08:19:37.795793 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:37.795658 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-jj9s6\"" Apr 17 08:19:37.795793 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:37.795598 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 17 08:19:37.798101 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:37.798079 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj"] Apr 17 08:19:37.901310 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:37.901274 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:37.901499 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:37.901358 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:37.901499 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:37.901384 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:37.901499 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:37.901413 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:37.901499 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:37.901453 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:37.901499 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:37.901483 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtpj2\" (UniqueName: \"kubernetes.io/projected/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-kube-api-access-vtpj2\") pod \"router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj\" 
(UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:38.002867 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:38.002829 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:38.002867 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:38.002869 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:38.003138 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:38.002987 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:38.003138 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:38.003012 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj\" (UID: 
\"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:38.003138 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:38.003039 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtpj2\" (UniqueName: \"kubernetes.io/projected/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-kube-api-access-vtpj2\") pod \"router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:38.003138 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:38.003063 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:38.003356 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:38.003331 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:38.003442 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:38.003420 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:38.003442 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:38.003434 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:38.003537 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:38.003519 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:38.005515 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:38.005489 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:38.011883 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:38.011860 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtpj2\" (UniqueName: \"kubernetes.io/projected/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-kube-api-access-vtpj2\") pod \"router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:38.102592 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:38.102495 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:38.250927 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:38.250901 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj"] Apr 17 08:19:38.253261 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:19:38.253232 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode89a7abe_f3a8_4787_873e_5fd2d4cf67d8.slice/crio-fdd413ecee5ab77d04db3d996ec95ccad2b6fe76cc4920bef9f215da7d5aac24 WatchSource:0}: Error finding container fdd413ecee5ab77d04db3d996ec95ccad2b6fe76cc4920bef9f215da7d5aac24: Status 404 returned error can't find the container with id fdd413ecee5ab77d04db3d996ec95ccad2b6fe76cc4920bef9f215da7d5aac24 Apr 17 08:19:38.913988 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:38.913951 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" event={"ID":"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8","Type":"ContainerStarted","Data":"da552767d15052bab048266678b95250f057875890964dd66a9775c4a25cecda"} Apr 17 08:19:38.913988 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:38.913989 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" event={"ID":"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8","Type":"ContainerStarted","Data":"fdd413ecee5ab77d04db3d996ec95ccad2b6fe76cc4920bef9f215da7d5aac24"} Apr 17 08:19:39.918364 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:39.918321 2566 generic.go:358] "Generic (PLEG): 
container finished" podID="e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" containerID="da552767d15052bab048266678b95250f057875890964dd66a9775c4a25cecda" exitCode=0 Apr 17 08:19:39.918770 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:39.918406 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" event={"ID":"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8","Type":"ContainerDied","Data":"da552767d15052bab048266678b95250f057875890964dd66a9775c4a25cecda"} Apr 17 08:19:40.924848 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:40.924814 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" event={"ID":"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8","Type":"ContainerStarted","Data":"e02c8c9ec3c80c79524c2aebfd4c58f5132cb00a550ad3a1b08024bd0b5a686d"} Apr 17 08:19:40.924848 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:40.924849 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" event={"ID":"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8","Type":"ContainerStarted","Data":"1f1d28eeb7a343a9c81f5be11f8a8b79f1186e9718bf9bf7293d492c10e74003"} Apr 17 08:19:40.925269 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:40.924953 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:40.946780 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:40.946730 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" podStartSLOduration=3.946714385 podStartE2EDuration="3.946714385s" podCreationTimestamp="2026-04-17 08:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-04-17 08:19:40.945856909 +0000 UTC m=+1684.733635526" watchObservedRunningTime="2026-04-17 08:19:40.946714385 +0000 UTC m=+1684.734492979" Apr 17 08:19:48.103474 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:48.103381 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:48.103474 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:48.103444 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:48.106333 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:48.106307 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:19:48.956514 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:19:48.956484 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:20:09.960975 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:20:09.960945 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:21:36.858927 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:21:36.858895 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2blbh_69ef651a-e48e-4389-a551-48cc6423bcd0/console-operator/1.log" Apr 17 08:21:36.864070 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:21:36.864049 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/ovn-acl-logging/0.log" Apr 17 08:21:36.864352 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:21:36.864333 
2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2blbh_69ef651a-e48e-4389-a551-48cc6423bcd0/console-operator/1.log" Apr 17 08:21:36.869102 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:21:36.869081 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/ovn-acl-logging/0.log" Apr 17 08:22:08.822745 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:08.822714 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj"] Apr 17 08:22:08.823195 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:08.823018 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" podUID="e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" containerName="main" containerID="cri-o://1f1d28eeb7a343a9c81f5be11f8a8b79f1186e9718bf9bf7293d492c10e74003" gracePeriod=30 Apr 17 08:22:08.823195 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:08.823039 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" podUID="e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" containerName="tokenizer" containerID="cri-o://e02c8c9ec3c80c79524c2aebfd4c58f5132cb00a550ad3a1b08024bd0b5a686d" gracePeriod=30 Apr 17 08:22:08.956154 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:08.956105 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" podUID="e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.49:8082/healthz\": dial tcp 10.132.0.49:8082: connect: connection refused" Apr 17 08:22:09.456622 ip-10-0-128-217 kubenswrapper[2566]: I0417 
08:22:09.456585 2566 generic.go:358] "Generic (PLEG): container finished" podID="e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" containerID="1f1d28eeb7a343a9c81f5be11f8a8b79f1186e9718bf9bf7293d492c10e74003" exitCode=0 Apr 17 08:22:09.456854 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:09.456654 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" event={"ID":"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8","Type":"ContainerDied","Data":"1f1d28eeb7a343a9c81f5be11f8a8b79f1186e9718bf9bf7293d492c10e74003"} Apr 17 08:22:09.960523 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:22:09.960485 2566 logging.go:55] [core] [Channel #472 SubChannel #473]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.49:9003", ServerName: "10.132.0.49:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.49:9003: connect: connection refused" Apr 17 08:22:10.172806 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.172779 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:22:10.337894 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.337863 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tls-certs\") pod \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " Apr 17 08:22:10.338082 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.337927 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtpj2\" (UniqueName: \"kubernetes.io/projected/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-kube-api-access-vtpj2\") pod \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " Apr 17 08:22:10.338082 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.338030 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tokenizer-cache\") pod \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " Apr 17 08:22:10.338082 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.338057 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tokenizer-tmp\") pod \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " Apr 17 08:22:10.338261 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.338087 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-kserve-provision-location\") pod \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") 
" Apr 17 08:22:10.338261 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.338132 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tokenizer-uds\") pod \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\" (UID: \"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8\") " Apr 17 08:22:10.338368 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.338322 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" (UID: "e89a7abe-f3a8-4787-873e-5fd2d4cf67d8"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:22:10.338491 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.338470 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tokenizer-cache\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:22:10.338570 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.338470 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" (UID: "e89a7abe-f3a8-4787-873e-5fd2d4cf67d8"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:22:10.338570 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.338486 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" (UID: "e89a7abe-f3a8-4787-873e-5fd2d4cf67d8"). 
InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:22:10.338881 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.338857 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" (UID: "e89a7abe-f3a8-4787-873e-5fd2d4cf67d8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:22:10.340094 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.340071 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" (UID: "e89a7abe-f3a8-4787-873e-5fd2d4cf67d8"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:22:10.340197 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.340073 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-kube-api-access-vtpj2" (OuterVolumeSpecName: "kube-api-access-vtpj2") pod "e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" (UID: "e89a7abe-f3a8-4787-873e-5fd2d4cf67d8"). InnerVolumeSpecName "kube-api-access-vtpj2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:22:10.440006 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.439970 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tokenizer-tmp\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:22:10.440006 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.440000 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:22:10.440006 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.440011 2566 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tokenizer-uds\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:22:10.440006 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.440020 2566 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-tls-certs\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:22:10.440286 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.440028 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vtpj2\" (UniqueName: \"kubernetes.io/projected/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8-kube-api-access-vtpj2\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 08:22:10.461831 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.461794 2566 generic.go:358] "Generic (PLEG): container finished" podID="e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" containerID="e02c8c9ec3c80c79524c2aebfd4c58f5132cb00a550ad3a1b08024bd0b5a686d" exitCode=0 Apr 17 08:22:10.461990 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.461867 2566 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" Apr 17 08:22:10.461990 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.461865 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" event={"ID":"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8","Type":"ContainerDied","Data":"e02c8c9ec3c80c79524c2aebfd4c58f5132cb00a550ad3a1b08024bd0b5a686d"} Apr 17 08:22:10.461990 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.461914 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" event={"ID":"e89a7abe-f3a8-4787-873e-5fd2d4cf67d8","Type":"ContainerDied","Data":"fdd413ecee5ab77d04db3d996ec95ccad2b6fe76cc4920bef9f215da7d5aac24"} Apr 17 08:22:10.461990 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.461931 2566 scope.go:117] "RemoveContainer" containerID="e02c8c9ec3c80c79524c2aebfd4c58f5132cb00a550ad3a1b08024bd0b5a686d" Apr 17 08:22:10.476487 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.476464 2566 scope.go:117] "RemoveContainer" containerID="1f1d28eeb7a343a9c81f5be11f8a8b79f1186e9718bf9bf7293d492c10e74003" Apr 17 08:22:10.487370 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.487343 2566 scope.go:117] "RemoveContainer" containerID="da552767d15052bab048266678b95250f057875890964dd66a9775c4a25cecda" Apr 17 08:22:10.490460 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.490434 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj"] Apr 17 08:22:10.492404 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.492376 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj"] Apr 17 08:22:10.496109 
ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.496089 2566 scope.go:117] "RemoveContainer" containerID="e02c8c9ec3c80c79524c2aebfd4c58f5132cb00a550ad3a1b08024bd0b5a686d" Apr 17 08:22:10.496397 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:22:10.496377 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e02c8c9ec3c80c79524c2aebfd4c58f5132cb00a550ad3a1b08024bd0b5a686d\": container with ID starting with e02c8c9ec3c80c79524c2aebfd4c58f5132cb00a550ad3a1b08024bd0b5a686d not found: ID does not exist" containerID="e02c8c9ec3c80c79524c2aebfd4c58f5132cb00a550ad3a1b08024bd0b5a686d" Apr 17 08:22:10.496450 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.496405 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e02c8c9ec3c80c79524c2aebfd4c58f5132cb00a550ad3a1b08024bd0b5a686d"} err="failed to get container status \"e02c8c9ec3c80c79524c2aebfd4c58f5132cb00a550ad3a1b08024bd0b5a686d\": rpc error: code = NotFound desc = could not find container \"e02c8c9ec3c80c79524c2aebfd4c58f5132cb00a550ad3a1b08024bd0b5a686d\": container with ID starting with e02c8c9ec3c80c79524c2aebfd4c58f5132cb00a550ad3a1b08024bd0b5a686d not found: ID does not exist" Apr 17 08:22:10.496450 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.496425 2566 scope.go:117] "RemoveContainer" containerID="1f1d28eeb7a343a9c81f5be11f8a8b79f1186e9718bf9bf7293d492c10e74003" Apr 17 08:22:10.496674 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:22:10.496651 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f1d28eeb7a343a9c81f5be11f8a8b79f1186e9718bf9bf7293d492c10e74003\": container with ID starting with 1f1d28eeb7a343a9c81f5be11f8a8b79f1186e9718bf9bf7293d492c10e74003 not found: ID does not exist" containerID="1f1d28eeb7a343a9c81f5be11f8a8b79f1186e9718bf9bf7293d492c10e74003" Apr 17 08:22:10.496813 ip-10-0-128-217 
kubenswrapper[2566]: I0417 08:22:10.496686 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f1d28eeb7a343a9c81f5be11f8a8b79f1186e9718bf9bf7293d492c10e74003"} err="failed to get container status \"1f1d28eeb7a343a9c81f5be11f8a8b79f1186e9718bf9bf7293d492c10e74003\": rpc error: code = NotFound desc = could not find container \"1f1d28eeb7a343a9c81f5be11f8a8b79f1186e9718bf9bf7293d492c10e74003\": container with ID starting with 1f1d28eeb7a343a9c81f5be11f8a8b79f1186e9718bf9bf7293d492c10e74003 not found: ID does not exist" Apr 17 08:22:10.496896 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.496818 2566 scope.go:117] "RemoveContainer" containerID="da552767d15052bab048266678b95250f057875890964dd66a9775c4a25cecda" Apr 17 08:22:10.497052 ip-10-0-128-217 kubenswrapper[2566]: E0417 08:22:10.497036 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da552767d15052bab048266678b95250f057875890964dd66a9775c4a25cecda\": container with ID starting with da552767d15052bab048266678b95250f057875890964dd66a9775c4a25cecda not found: ID does not exist" containerID="da552767d15052bab048266678b95250f057875890964dd66a9775c4a25cecda" Apr 17 08:22:10.497101 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.497058 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da552767d15052bab048266678b95250f057875890964dd66a9775c4a25cecda"} err="failed to get container status \"da552767d15052bab048266678b95250f057875890964dd66a9775c4a25cecda\": rpc error: code = NotFound desc = could not find container \"da552767d15052bab048266678b95250f057875890964dd66a9775c4a25cecda\": container with ID starting with da552767d15052bab048266678b95250f057875890964dd66a9775c4a25cecda not found: ID does not exist" Apr 17 08:22:10.807981 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.807951 2566 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" path="/var/lib/kubelet/pods/e89a7abe-f3a8-4787-873e-5fd2d4cf67d8/volumes" Apr 17 08:22:10.960862 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:10.960818 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-9bd5f9f9fccclj" podUID="e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.49:9003\" within 1s: context deadline exceeded" Apr 17 08:22:24.405138 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:24.405108 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-j7t8k_ae3a165d-26cd-42e0-801f-5520ea324e30/istio-proxy/0.log" Apr 17 08:22:25.433707 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:25.433666 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-j7t8k_ae3a165d-26cd-42e0-801f-5520ea324e30/istio-proxy/0.log" Apr 17 08:22:26.469038 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:26.469012 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-j7t8k_ae3a165d-26cd-42e0-801f-5520ea324e30/istio-proxy/0.log" Apr 17 08:22:27.461123 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:27.461093 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-j7t8k_ae3a165d-26cd-42e0-801f-5520ea324e30/istio-proxy/0.log" Apr 17 08:22:28.448151 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:28.448121 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-j7t8k_ae3a165d-26cd-42e0-801f-5520ea324e30/istio-proxy/0.log" Apr 17 08:22:29.483014 ip-10-0-128-217 kubenswrapper[2566]: I0417 
08:22:29.482977 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-j7t8k_ae3a165d-26cd-42e0-801f-5520ea324e30/istio-proxy/0.log" Apr 17 08:22:30.460667 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:30.460636 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-j7t8k_ae3a165d-26cd-42e0-801f-5520ea324e30/istio-proxy/0.log" Apr 17 08:22:31.463549 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:31.463523 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-j7t8k_ae3a165d-26cd-42e0-801f-5520ea324e30/istio-proxy/0.log" Apr 17 08:22:32.467577 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:32.467547 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-j7t8k_ae3a165d-26cd-42e0-801f-5520ea324e30/istio-proxy/0.log" Apr 17 08:22:33.466175 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:33.466146 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-j7t8k_ae3a165d-26cd-42e0-801f-5520ea324e30/istio-proxy/0.log" Apr 17 08:22:34.469587 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:34.469556 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-j7t8k_ae3a165d-26cd-42e0-801f-5520ea324e30/istio-proxy/0.log" Apr 17 08:22:35.487331 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:35.487302 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-j7t8k_ae3a165d-26cd-42e0-801f-5520ea324e30/istio-proxy/0.log" Apr 17 08:22:36.472943 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:36.472911 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-j7t8k_ae3a165d-26cd-42e0-801f-5520ea324e30/istio-proxy/0.log" Apr 17 08:22:37.529948 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:37.529902 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-j7t8k_ae3a165d-26cd-42e0-801f-5520ea324e30/istio-proxy/0.log" Apr 17 08:22:38.606466 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:38.606438 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-hwfxc_a35ffed7-a2c6-4c47-8b54-34a98ed44373/discovery/0.log" Apr 17 08:22:38.620534 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:38.620507 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-22t5t_b23ca74c-0031-4851-93e2-a6d737874ca7/istio-proxy/0.log" Apr 17 08:22:38.639298 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:38.639274 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-78488888b4-7t88f_beafd2af-5cb4-49eb-994d-2cdebba6b9ea/router/0.log" Apr 17 08:22:39.402620 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:39.402587 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-hwfxc_a35ffed7-a2c6-4c47-8b54-34a98ed44373/discovery/0.log" Apr 17 08:22:39.415132 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:39.415106 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-22t5t_b23ca74c-0031-4851-93e2-a6d737874ca7/istio-proxy/0.log" Apr 17 08:22:39.434664 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:39.434639 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-78488888b4-7t88f_beafd2af-5cb4-49eb-994d-2cdebba6b9ea/router/0.log" Apr 17 
08:22:40.239325 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:40.239296 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-dcs2n_fd2ce257-5495-4514-a17d-3c85c0dcb68f/kuadrant-console-plugin/0.log" Apr 17 08:22:40.274501 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:40.274472 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-shxl7_5924881c-7eaf-4d8e-8d87-5c56cda01132/manager/0.log" Apr 17 08:22:45.241985 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:45.241904 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-5rcmt_792796b1-b737-4fd3-820e-247120d1de83/global-pull-secret-syncer/0.log" Apr 17 08:22:45.323260 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:45.323230 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-42f72_7886ccc8-69b5-457b-8e19-3ac5c6d1153c/konnectivity-agent/0.log" Apr 17 08:22:45.395748 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:45.395716 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-217.ec2.internal_57a323f22f4a50ec542cb175406e5b82/haproxy/0.log" Apr 17 08:22:49.494083 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:49.494051 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-dcs2n_fd2ce257-5495-4514-a17d-3c85c0dcb68f/kuadrant-console-plugin/0.log" Apr 17 08:22:49.539891 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:49.539857 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-shxl7_5924881c-7eaf-4d8e-8d87-5c56cda01132/manager/0.log" Apr 17 08:22:50.548863 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:50.548834 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_33f771cd-9a20-42f3-b071-e8907d147087/alertmanager/0.log" Apr 17 08:22:50.573331 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:50.573303 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_33f771cd-9a20-42f3-b071-e8907d147087/config-reloader/0.log" Apr 17 08:22:50.592743 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:50.592717 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_33f771cd-9a20-42f3-b071-e8907d147087/kube-rbac-proxy-web/0.log" Apr 17 08:22:50.628150 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:50.628124 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_33f771cd-9a20-42f3-b071-e8907d147087/kube-rbac-proxy/0.log" Apr 17 08:22:50.647688 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:50.647667 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_33f771cd-9a20-42f3-b071-e8907d147087/kube-rbac-proxy-metric/0.log" Apr 17 08:22:50.669288 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:50.669267 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_33f771cd-9a20-42f3-b071-e8907d147087/prom-label-proxy/0.log" Apr 17 08:22:50.688115 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:50.688093 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_33f771cd-9a20-42f3-b071-e8907d147087/init-config-reloader/0.log" Apr 17 08:22:50.928602 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:50.928529 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pg89w_97ea7ab8-8bd4-4f02-af41-c58d6ec791ea/node-exporter/0.log" Apr 17 08:22:50.949267 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:50.949238 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-pg89w_97ea7ab8-8bd4-4f02-af41-c58d6ec791ea/kube-rbac-proxy/0.log" Apr 17 08:22:50.969757 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:50.969732 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pg89w_97ea7ab8-8bd4-4f02-af41-c58d6ec791ea/init-textfile/0.log" Apr 17 08:22:51.321846 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:51.321819 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-26bv9_3a40a7c0-c66f-4136-8ca0-2d73c7171bd4/prometheus-operator/0.log" Apr 17 08:22:51.341026 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:51.341000 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-26bv9_3a40a7c0-c66f-4136-8ca0-2d73c7171bd4/kube-rbac-proxy/0.log" Apr 17 08:22:51.396151 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:51.396125 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6c7b978f98-zt7cg_d45b2571-f6bf-4eba-8a2f-db3a12922e55/telemeter-client/0.log" Apr 17 08:22:51.418302 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:51.418268 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6c7b978f98-zt7cg_d45b2571-f6bf-4eba-8a2f-db3a12922e55/reload/0.log" Apr 17 08:22:51.439295 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:51.439273 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6c7b978f98-zt7cg_d45b2571-f6bf-4eba-8a2f-db3a12922e55/kube-rbac-proxy/0.log" Apr 17 08:22:53.308414 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:53.308387 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2blbh_69ef651a-e48e-4389-a551-48cc6423bcd0/console-operator/1.log" Apr 17 08:22:53.312317 ip-10-0-128-217 kubenswrapper[2566]: 
I0417 08:22:53.312294 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2blbh_69ef651a-e48e-4389-a551-48cc6423bcd0/console-operator/2.log" Apr 17 08:22:53.730412 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:53.730329 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-859974d654-wrtjn_70d0b567-6353-4d11-b0e2-d07d0a107d59/console/0.log" Apr 17 08:22:53.758586 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:53.758558 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-z9zkf_ca4bdb34-07d3-4fee-8a96-eb085c419679/download-server/0.log" Apr 17 08:22:53.894724 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:53.894663 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz"] Apr 17 08:22:53.895240 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:53.895217 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" containerName="tokenizer" Apr 17 08:22:53.895240 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:53.895239 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" containerName="tokenizer" Apr 17 08:22:53.895454 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:53.895262 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" containerName="main" Apr 17 08:22:53.895454 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:53.895271 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" containerName="main" Apr 17 08:22:53.895454 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:53.895320 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" 
containerName="storage-initializer" Apr 17 08:22:53.895454 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:53.895332 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" containerName="storage-initializer" Apr 17 08:22:53.895454 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:53.895449 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" containerName="tokenizer" Apr 17 08:22:53.895614 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:53.895465 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="e89a7abe-f3a8-4787-873e-5fd2d4cf67d8" containerName="main" Apr 17 08:22:53.898373 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:53.898358 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" Apr 17 08:22:53.900864 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:53.900842 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jc8j6\"/\"openshift-service-ca.crt\"" Apr 17 08:22:53.900969 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:53.900847 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jc8j6\"/\"kube-root-ca.crt\"" Apr 17 08:22:53.901918 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:53.901898 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jc8j6\"/\"default-dockercfg-tn4lx\"" Apr 17 08:22:53.907247 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:53.907223 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz"] Apr 17 08:22:54.041444 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.041410 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/5d0c0dda-5dc6-46db-b4de-68b920069930-lib-modules\") pod \"perf-node-gather-daemonset-wzxvz\" (UID: \"5d0c0dda-5dc6-46db-b4de-68b920069930\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" Apr 17 08:22:54.041608 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.041454 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5d0c0dda-5dc6-46db-b4de-68b920069930-podres\") pod \"perf-node-gather-daemonset-wzxvz\" (UID: \"5d0c0dda-5dc6-46db-b4de-68b920069930\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" Apr 17 08:22:54.041608 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.041552 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvh5h\" (UniqueName: \"kubernetes.io/projected/5d0c0dda-5dc6-46db-b4de-68b920069930-kube-api-access-dvh5h\") pod \"perf-node-gather-daemonset-wzxvz\" (UID: \"5d0c0dda-5dc6-46db-b4de-68b920069930\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" Apr 17 08:22:54.041608 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.041582 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5d0c0dda-5dc6-46db-b4de-68b920069930-proc\") pod \"perf-node-gather-daemonset-wzxvz\" (UID: \"5d0c0dda-5dc6-46db-b4de-68b920069930\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" Apr 17 08:22:54.041770 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.041662 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5d0c0dda-5dc6-46db-b4de-68b920069930-sys\") pod \"perf-node-gather-daemonset-wzxvz\" (UID: \"5d0c0dda-5dc6-46db-b4de-68b920069930\") " 
pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" Apr 17 08:22:54.142516 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.142484 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5d0c0dda-5dc6-46db-b4de-68b920069930-lib-modules\") pod \"perf-node-gather-daemonset-wzxvz\" (UID: \"5d0c0dda-5dc6-46db-b4de-68b920069930\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" Apr 17 08:22:54.142516 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.142521 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5d0c0dda-5dc6-46db-b4de-68b920069930-podres\") pod \"perf-node-gather-daemonset-wzxvz\" (UID: \"5d0c0dda-5dc6-46db-b4de-68b920069930\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" Apr 17 08:22:54.142778 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.142643 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5d0c0dda-5dc6-46db-b4de-68b920069930-podres\") pod \"perf-node-gather-daemonset-wzxvz\" (UID: \"5d0c0dda-5dc6-46db-b4de-68b920069930\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" Apr 17 08:22:54.142778 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.142656 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5d0c0dda-5dc6-46db-b4de-68b920069930-lib-modules\") pod \"perf-node-gather-daemonset-wzxvz\" (UID: \"5d0c0dda-5dc6-46db-b4de-68b920069930\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" Apr 17 08:22:54.142778 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.142660 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvh5h\" (UniqueName: 
\"kubernetes.io/projected/5d0c0dda-5dc6-46db-b4de-68b920069930-kube-api-access-dvh5h\") pod \"perf-node-gather-daemonset-wzxvz\" (UID: \"5d0c0dda-5dc6-46db-b4de-68b920069930\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" Apr 17 08:22:54.142778 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.142778 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5d0c0dda-5dc6-46db-b4de-68b920069930-proc\") pod \"perf-node-gather-daemonset-wzxvz\" (UID: \"5d0c0dda-5dc6-46db-b4de-68b920069930\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" Apr 17 08:22:54.142983 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.142825 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5d0c0dda-5dc6-46db-b4de-68b920069930-sys\") pod \"perf-node-gather-daemonset-wzxvz\" (UID: \"5d0c0dda-5dc6-46db-b4de-68b920069930\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" Apr 17 08:22:54.142983 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.142879 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5d0c0dda-5dc6-46db-b4de-68b920069930-proc\") pod \"perf-node-gather-daemonset-wzxvz\" (UID: \"5d0c0dda-5dc6-46db-b4de-68b920069930\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" Apr 17 08:22:54.142983 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.142899 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5d0c0dda-5dc6-46db-b4de-68b920069930-sys\") pod \"perf-node-gather-daemonset-wzxvz\" (UID: \"5d0c0dda-5dc6-46db-b4de-68b920069930\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" Apr 17 08:22:54.151431 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.151400 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dvh5h\" (UniqueName: \"kubernetes.io/projected/5d0c0dda-5dc6-46db-b4de-68b920069930-kube-api-access-dvh5h\") pod \"perf-node-gather-daemonset-wzxvz\" (UID: \"5d0c0dda-5dc6-46db-b4de-68b920069930\") " pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" Apr 17 08:22:54.203419 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.203393 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-jsm28_2d59fab7-6249-4bfb-844f-8cdca44acef2/volume-data-source-validator/0.log" Apr 17 08:22:54.209682 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.209665 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" Apr 17 08:22:54.329228 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.329090 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz"] Apr 17 08:22:54.331977 ip-10-0-128-217 kubenswrapper[2566]: W0417 08:22:54.331947 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5d0c0dda_5dc6_46db_b4de_68b920069930.slice/crio-801e28159a9234e73f297866abbf7c3309d0be4f7bdaf818a16f3925114b9bd6 WatchSource:0}: Error finding container 801e28159a9234e73f297866abbf7c3309d0be4f7bdaf818a16f3925114b9bd6: Status 404 returned error can't find the container with id 801e28159a9234e73f297866abbf7c3309d0be4f7bdaf818a16f3925114b9bd6 Apr 17 08:22:54.333820 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.333802 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:22:54.629270 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.629174 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" 
event={"ID":"5d0c0dda-5dc6-46db-b4de-68b920069930","Type":"ContainerStarted","Data":"4fc78c3fc763093fee5189c4b0a82a7a41c5620e9261f50d8b2bcf3b2121f779"} Apr 17 08:22:54.629270 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.629210 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" event={"ID":"5d0c0dda-5dc6-46db-b4de-68b920069930","Type":"ContainerStarted","Data":"801e28159a9234e73f297866abbf7c3309d0be4f7bdaf818a16f3925114b9bd6"} Apr 17 08:22:54.629462 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.629273 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" Apr 17 08:22:54.644098 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.644046 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" podStartSLOduration=1.644031314 podStartE2EDuration="1.644031314s" podCreationTimestamp="2026-04-17 08:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:22:54.643495467 +0000 UTC m=+1878.431274071" watchObservedRunningTime="2026-04-17 08:22:54.644031314 +0000 UTC m=+1878.431809918" Apr 17 08:22:54.957253 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.957185 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rltvj_b863c142-c069-46ab-9031-2f50beeb3f53/dns/0.log" Apr 17 08:22:54.975096 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.975069 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rltvj_b863c142-c069-46ab-9031-2f50beeb3f53/kube-rbac-proxy/0.log" Apr 17 08:22:54.994600 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:54.994572 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-qndp2_c9597c98-928c-4e9e-9f6a-20399532f672/dns-node-resolver/0.log" Apr 17 08:22:55.535795 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:55.535762 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-s2dk9_f847aabc-a956-47e9-91e9-a380ac142ed4/node-ca/0.log" Apr 17 08:22:56.349644 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:56.349610 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-hwfxc_a35ffed7-a2c6-4c47-8b54-34a98ed44373/discovery/0.log" Apr 17 08:22:56.373525 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:56.373485 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-22t5t_b23ca74c-0031-4851-93e2-a6d737874ca7/istio-proxy/0.log" Apr 17 08:22:56.401788 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:56.401751 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-78488888b4-7t88f_beafd2af-5cb4-49eb-994d-2cdebba6b9ea/router/0.log" Apr 17 08:22:56.889666 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:56.889627 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qvwww_2354a35f-6d75-4e6e-a614-0e68c4002cb7/serve-healthcheck-canary/0.log" Apr 17 08:22:57.299379 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:57.299344 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-kqzwd_86c75543-1a40-464a-bce1-5fe690add66f/insights-operator/0.log" Apr 17 08:22:57.300394 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:57.300375 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-kqzwd_86c75543-1a40-464a-bce1-5fe690add66f/insights-operator/1.log" Apr 17 08:22:57.381581 ip-10-0-128-217 kubenswrapper[2566]: I0417 
08:22:57.381541 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m79hw_62634055-259b-4f42-804e-3c91faf18087/kube-rbac-proxy/0.log" Apr 17 08:22:57.399747 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:57.399724 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m79hw_62634055-259b-4f42-804e-3c91faf18087/exporter/0.log" Apr 17 08:22:57.419103 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:22:57.419060 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m79hw_62634055-259b-4f42-804e-3c91faf18087/extractor/0.log" Apr 17 08:23:00.031443 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:00.031410 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-bq9gz_84f87d67-a0cd-4783-ae58-8c3cf22700be/openshift-lws-operator/0.log" Apr 17 08:23:00.510656 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:00.510582 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-558564fd68-6vcd4_eb92f7ad-35c9-40ee-8463-f11c96eeb371/manager/0.log" Apr 17 08:23:00.556970 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:00.556939 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-bbd45769b-sbw58_e452f4ee-4679-46e5-9c7d-0e11885c9419/manager/0.log" Apr 17 08:23:00.645317 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:00.645289 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jc8j6/perf-node-gather-daemonset-wzxvz" Apr 17 08:23:05.658945 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:05.658852 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-dxszq_f5648bed-82b5-4b80-8d08-2e781e7705fc/kube-storage-version-migrator-operator/1.log" Apr 17 08:23:05.659671 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:05.659636 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-dxszq_f5648bed-82b5-4b80-8d08-2e781e7705fc/kube-storage-version-migrator-operator/0.log" Apr 17 08:23:06.707068 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:06.707039 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b24kh_595aad98-ad8c-469e-ae13-798099e8e67b/kube-multus-additional-cni-plugins/0.log" Apr 17 08:23:06.726348 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:06.726309 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b24kh_595aad98-ad8c-469e-ae13-798099e8e67b/egress-router-binary-copy/0.log" Apr 17 08:23:06.746237 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:06.746210 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b24kh_595aad98-ad8c-469e-ae13-798099e8e67b/cni-plugins/0.log" Apr 17 08:23:06.763251 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:06.763228 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b24kh_595aad98-ad8c-469e-ae13-798099e8e67b/bond-cni-plugin/0.log" Apr 17 08:23:06.780502 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:06.780480 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b24kh_595aad98-ad8c-469e-ae13-798099e8e67b/routeoverride-cni/0.log" Apr 17 08:23:06.798537 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:06.798515 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b24kh_595aad98-ad8c-469e-ae13-798099e8e67b/whereabouts-cni-bincopy/0.log" Apr 17 08:23:06.818448 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:06.818427 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b24kh_595aad98-ad8c-469e-ae13-798099e8e67b/whereabouts-cni/0.log" Apr 17 08:23:06.989361 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:06.989287 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fr7nk_49cea292-6e01-4497-a4a8-a9cdff76850e/kube-multus/0.log" Apr 17 08:23:07.106957 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:07.106930 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jt4rj_4fbea707-d5c2-4c45-82e5-089d272aa922/network-metrics-daemon/0.log" Apr 17 08:23:07.125300 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:07.125275 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jt4rj_4fbea707-d5c2-4c45-82e5-089d272aa922/kube-rbac-proxy/0.log" Apr 17 08:23:08.254201 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:08.254173 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/ovn-controller/0.log" Apr 17 08:23:08.270268 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:08.270221 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/ovn-acl-logging/0.log" Apr 17 08:23:08.279310 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:08.279281 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/ovn-acl-logging/1.log" Apr 17 08:23:08.296857 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:08.296818 2566 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/kube-rbac-proxy-node/0.log" Apr 17 08:23:08.319101 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:08.319063 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 08:23:08.337035 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:08.337007 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/northd/0.log" Apr 17 08:23:08.354751 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:08.354725 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/nbdb/0.log" Apr 17 08:23:08.373501 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:08.373469 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/sbdb/0.log" Apr 17 08:23:08.478440 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:08.478407 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6vls_69d806e6-4ecb-42c4-b3a9-57107400f8d5/ovnkube-controller/0.log" Apr 17 08:23:09.804663 ip-10-0-128-217 kubenswrapper[2566]: I0417 08:23:09.804633 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-vbl24_648c562e-66a8-4487-995e-2e06a13a92a5/network-check-target-container/0.log"