Apr 22 18:34:53.622493 ip-10-0-129-26 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 18:34:53.622506 ip-10-0-129-26 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 18:34:53.622513 ip-10-0-129-26 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 18:34:53.622817 ip-10-0-129-26 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 18:35:03.802330 ip-10-0-129-26 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 18:35:03.802347 ip-10-0-129-26 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 83bdfea81cf04db0ab6cd6a0ff8199f7 --
Apr 22 18:37:25.290419 ip-10-0-129-26 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:37:25.671322 ip-10-0-129-26 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:37:25.671322 ip-10-0-129-26 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:37:25.671322 ip-10-0-129-26 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:37:25.671322 ip-10-0-129-26 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:37:25.671322 ip-10-0-129-26 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:37:25.672826 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.672754 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:37:25.675709 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675695 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:37:25.675709 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675710 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:37:25.675776 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675714 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:37:25.675776 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675717 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:37:25.675776 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675720 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:37:25.675776 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675723 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:37:25.675776 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675726 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:37:25.675776 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675729 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:37:25.675776 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675732 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:37:25.675776 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675734 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:37:25.675776 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675737 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:37:25.675776 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675739 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:37:25.675776 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675742 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:37:25.675776 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675744 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:37:25.675776 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675747 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:37:25.675776 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675750 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:37:25.675776 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675754 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:37:25.675776 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675756 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:37:25.675776 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675759 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:37:25.675776 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675762 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:37:25.675776 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675764 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:37:25.675776 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675766 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:37:25.676295 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675769 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:37:25.676295 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675771 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:37:25.676295 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675774 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:37:25.676295 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675777 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:37:25.676295 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675780 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:37:25.676295 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675782 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:37:25.676295 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675785 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:37:25.676295 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675788 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:37:25.676295 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675791 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:37:25.676295 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675793 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:37:25.676295 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675796 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:37:25.676295 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675798 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:37:25.676295 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675802 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:37:25.676295 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675804 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:37:25.676295 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675807 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:37:25.676295 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675809 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:37:25.676295 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675812 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:37:25.676295 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675814 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:37:25.676295 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675816 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:37:25.676295 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675819 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:37:25.676769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675821 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:37:25.676769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675824 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:37:25.676769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675826 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:37:25.676769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675829 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:37:25.676769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675834 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:37:25.676769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675838 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:37:25.676769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675841 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:37:25.676769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675847 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:37:25.676769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675850 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:37:25.676769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675853 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:37:25.676769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675856 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:37:25.676769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675859 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:37:25.676769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675861 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:37:25.676769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675864 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:37:25.676769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675867 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:37:25.676769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675870 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:37:25.676769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675872 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:37:25.676769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675875 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:37:25.676769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675877 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675880 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675883 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675885 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675888 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675890 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675893 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675895 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675898 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675900 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675902 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675905 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675907 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675909 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675912 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675914 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675917 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675919 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675923 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675925 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675927 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:37:25.677261 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675933 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:37:25.677769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675936 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:37:25.677769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675939 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:37:25.677769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675941 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:37:25.677769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.675943 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:37:25.677769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676386 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:37:25.677769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676393 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:37:25.677769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676396 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:37:25.677769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676399 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:37:25.677769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676402 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:37:25.677769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676405 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:37:25.677769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676408 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:37:25.677769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676410 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:37:25.677769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676413 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:37:25.677769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676415 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:37:25.677769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676418 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:37:25.677769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676422 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:37:25.677769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676426 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:37:25.677769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676429 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:37:25.677769 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676431 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:37:25.678432 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676434 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:37:25.678432 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676436 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:37:25.678432 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676439 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:37:25.678432 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676441 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:37:25.678432 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676443 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:37:25.678432 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676446 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:37:25.678432 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676448 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:37:25.678432 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676451 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:37:25.678432 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676453 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:37:25.678432 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676456 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:37:25.678432 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676458 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:37:25.678432 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676460 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:37:25.678432 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676466 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:37:25.678432 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676468 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:37:25.678432 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676471 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:37:25.678432 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676473 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:37:25.678432 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676475 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:37:25.678432 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676478 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:37:25.678432 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676481 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:37:25.678432 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676484 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:37:25.679202 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676486 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:37:25.679202 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676488 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:37:25.679202 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676491 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:37:25.679202 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676493 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:37:25.679202 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676496 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:37:25.679202 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676498 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:37:25.679202 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676501 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:37:25.679202 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676503 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:37:25.679202 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676505 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:37:25.679202 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676508 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:37:25.679202 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676510 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:37:25.679202 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676512 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:37:25.679202 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676515 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:37:25.679202 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676517 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:37:25.679202 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676520 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:37:25.679202 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676523 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:37:25.679202 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676525 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:37:25.679202 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676528 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:37:25.679202 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676530 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:37:25.679702 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676533 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:37:25.679702 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676535 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:37:25.679702 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676538 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:37:25.679702 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676540 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:37:25.679702 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676542 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:37:25.679702 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676546 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:37:25.679702 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676550 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:37:25.679702 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676553 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:37:25.679702 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676556 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:37:25.679702 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676558 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:37:25.679702 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676561 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:37:25.679702 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676563 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:37:25.679702 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676565 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:37:25.679702 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676568 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:37:25.679702 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676570 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:37:25.679702 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676572 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:37:25.679702 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676575 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:37:25.679702 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676577 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:37:25.679702 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676580 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676583 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676585 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676588 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676590 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676593 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676595 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676598 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676600 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676603 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676608 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676611 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676613 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.676616 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677289 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677298 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677304 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677309 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677313 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677316 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677321 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:37:25.680178 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677326 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677329 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677332 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677371 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677377 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677381 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677385 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677388 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677392 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677395 2572 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677398 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677401 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677407 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677410 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677413 2572 flags.go:64] FLAG: --config-dir=""
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677416 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677419 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677424 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677429 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677432 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677435 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677438 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677441 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677444 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677448 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:37:25.680698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677450 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677455 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677458 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677461 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677464 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677467 2572 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677470 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677475 2572 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677478 2572 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677481 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677485 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677488 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677492 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677495 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677498 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677501 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677504 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677507 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677510 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677512 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677515 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677518 2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677521 2572 flags.go:64] FLAG: --feature-gates=""
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677525 2572 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677527 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 18:37:25.681309 ip-10-0-129-26 kubenswrapper[2572]:
I0422 18:37:25.677530 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677534 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677537 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677540 2572 flags.go:64] FLAG: --help="false" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677543 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-129-26.ec2.internal" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677546 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677549 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677552 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677555 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677559 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677561 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677564 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677567 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677570 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:37:25.681886 
ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677573 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677576 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677579 2572 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677582 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677585 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677588 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677591 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677594 2572 flags.go:64] FLAG: --lock-file="" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677597 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677600 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:37:25.681886 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677603 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677608 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677611 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677614 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677617 2572 flags.go:64] FLAG: 
--logging-format="text" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677620 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677623 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677626 2572 flags.go:64] FLAG: --manifest-url="" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677629 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677634 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677638 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677642 2572 flags.go:64] FLAG: --max-pods="110" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677646 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677648 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677651 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677654 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677657 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677660 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677663 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 
18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677671 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677674 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677678 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677681 2572 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:37:25.682479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677683 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677689 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677692 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677696 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677699 2572 flags.go:64] FLAG: --port="10250" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677702 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677705 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0032aa64fbd7c4157" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677708 2572 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677711 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677714 2572 flags.go:64] FLAG: --register-node="true" Apr 22 18:37:25.683021 ip-10-0-129-26 
kubenswrapper[2572]: I0422 18:37:25.677717 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677720 2572 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677724 2572 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677726 2572 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677729 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677732 2572 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677735 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677738 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677741 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677744 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677747 2572 flags.go:64] FLAG: --runonce="false" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677750 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677753 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677756 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677759 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:37:25.683021 ip-10-0-129-26 
kubenswrapper[2572]: I0422 18:37:25.677761 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:37:25.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677764 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677767 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677770 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677774 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677777 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677780 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677783 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677786 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677789 2572 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677796 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677802 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677805 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677808 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677812 2572 flags.go:64] FLAG: 
--tls-min-version="" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677814 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677817 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677820 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677823 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677826 2572 flags.go:64] FLAG: --v="2" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677830 2572 flags.go:64] FLAG: --version="false" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677834 2572 flags.go:64] FLAG: --vmodule="" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677839 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.677841 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.677941 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.677945 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:37:25.683675 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.677949 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:37:25.684399 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.677952 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:37:25.684399 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.677988 2572 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNS Apr 22 18:37:25.684399 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678045 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:37:25.684399 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678051 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:37:25.684399 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678056 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:37:25.684399 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678059 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:37:25.684399 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678064 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:37:25.684399 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678067 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:37:25.684399 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678069 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:37:25.684399 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678072 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:37:25.684399 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678074 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:37:25.684399 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678077 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:37:25.684399 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678080 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:37:25.684399 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678084 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:37:25.684399 ip-10-0-129-26 kubenswrapper[2572]: W0422 
18:37:25.678089 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:37:25.684399 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678100 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:37:25.684399 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678104 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:37:25.684399 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678109 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:37:25.684399 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678113 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:37:25.684399 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678117 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:37:25.685343 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678192 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:37:25.685343 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678206 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:37:25.685343 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678212 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:37:25.685343 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678217 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:37:25.685343 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678222 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:37:25.685343 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678228 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:37:25.685343 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678233 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 
18:37:25.685343 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678237 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:37:25.685343 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678242 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:37:25.685343 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678246 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:37:25.685343 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678250 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:37:25.685343 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678255 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:37:25.685343 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678259 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:37:25.685343 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678270 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:37:25.685343 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678275 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:37:25.685343 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678279 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:37:25.685343 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678284 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:37:25.685343 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678288 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:37:25.685343 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678292 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:37:25.685343 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678296 2572 feature_gate.go:328] unrecognized feature gate: 
AzureClusterHostedDNSInstall Apr 22 18:37:25.686220 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678300 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:37:25.686220 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678304 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:37:25.686220 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678309 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:37:25.686220 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678313 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:37:25.686220 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678318 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:37:25.686220 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678328 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:37:25.686220 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678335 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:37:25.686220 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678341 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:37:25.686220 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678346 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:37:25.686220 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678351 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:37:25.686220 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678357 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:37:25.686220 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678361 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:37:25.686220 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678366 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:37:25.686220 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678370 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:37:25.686220 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678376 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:37:25.686220 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678383 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:37:25.686220 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678387 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:37:25.686220 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678396 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678401 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678405 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678410 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678414 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678418 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678422 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678426 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678431 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678434 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678439 2572 feature_gate.go:328] unrecognized feature gate: 
NetworkLiveMigration
Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678443 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678452 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678457 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678460 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678464 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678469 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678473 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678477 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678481 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678485 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:37:25.686992 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678489 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:37:25.687813 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678493 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:37:25.687813 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678497 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:37:25.687813 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678501 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:37:25.687813 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.678510 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:37:25.687813 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.679104 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:37:25.687813 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.686341 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 18:37:25.687813 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.686361 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 18:37:25.687813 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686429 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:37:25.687813 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686437 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:37:25.687813 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686442 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:37:25.687813 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686447 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:37:25.687813 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686451 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:37:25.687813 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686456 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:37:25.687813 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686461 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:37:25.687813 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686465 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:37:25.688291 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686469 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:37:25.688291 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686473 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:37:25.688291 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686477 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:37:25.688291 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686481 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:37:25.688291 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686485 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:37:25.688291 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686489 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:37:25.688291 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686494 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:37:25.688291 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686498 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:37:25.688291 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686503 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:37:25.688291 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686507 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:37:25.688291 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686511 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:37:25.688291 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686516 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:37:25.688291 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686520 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:37:25.688291 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686524 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:37:25.688291 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686528 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:37:25.688291 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686533 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:37:25.688291 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686537 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:37:25.688291 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686542 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:37:25.688291 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686547 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:37:25.688291 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686551 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:37:25.688862 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686555 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:37:25.688862 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686559 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:37:25.688862 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686589 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:37:25.688862 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686597 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:37:25.688862 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686602 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:37:25.688862 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686607 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:37:25.688862 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686611 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:37:25.688862 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686616 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:37:25.688862 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686621 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:37:25.688862 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686625 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:37:25.688862 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686629 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:37:25.688862 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686633 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:37:25.688862 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686637 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:37:25.688862 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686642 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:37:25.688862 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686646 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:37:25.688862 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686650 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:37:25.688862 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686654 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:37:25.688862 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686660 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:37:25.688862 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686667 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:37:25.689662 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686674 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:37:25.689662 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686680 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:37:25.689662 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686684 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:37:25.689662 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686689 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:37:25.689662 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686693 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:37:25.689662 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686697 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:37:25.689662 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686702 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:37:25.689662 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686706 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:37:25.689662 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686710 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:37:25.689662 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686715 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:37:25.689662 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686719 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:37:25.689662 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686723 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:37:25.689662 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686728 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:37:25.689662 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686732 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:37:25.689662 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686736 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:37:25.689662 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686741 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:37:25.689662 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686746 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:37:25.689662 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686751 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:37:25.689662 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686755 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:37:25.690426 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686760 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:37:25.690426 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686764 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:37:25.690426 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686768 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:37:25.690426 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686772 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:37:25.690426 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686776 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:37:25.690426 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686781 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:37:25.690426 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686785 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:37:25.690426 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686789 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:37:25.690426 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686793 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:37:25.690426 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686798 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:37:25.690426 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686802 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:37:25.690426 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686807 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:37:25.690426 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686811 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:37:25.690426 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686815 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:37:25.690426 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686819 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:37:25.690426 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686823 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:37:25.690426 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686827 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:37:25.690426 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686831 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:37:25.690426 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686835 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:37:25.690426 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.686840 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:37:25.691043 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.686848 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:37:25.691043 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687000 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:37:25.691043 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687009 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:37:25.691043 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687013 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:37:25.691043 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687018 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:37:25.691043 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687023 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:37:25.691043 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687027 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:37:25.691043 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687032 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:37:25.691043 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687036 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:37:25.691043 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687041 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:37:25.691043 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687045 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:37:25.691043 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687049 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:37:25.691043 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687053 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:37:25.691043 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687057 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:37:25.691043 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687062 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:37:25.691043 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687066 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:37:25.691479 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687071 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:37:25.691479 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687075 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:37:25.691479 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687079 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:37:25.691479 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687083 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:37:25.691479 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687087 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:37:25.691479 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687092 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:37:25.691479 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687096 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:37:25.691479 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687100 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:37:25.691479 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687104 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:37:25.691479 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687108 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:37:25.691479 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687113 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:37:25.691479 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687117 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:37:25.691479 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687140 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:37:25.691479 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687156 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:37:25.691479 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687160 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:37:25.691479 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687165 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:37:25.691479 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687168 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:37:25.691479 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687172 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:37:25.691479 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687176 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:37:25.691976 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687179 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:37:25.691976 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687183 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:37:25.691976 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687187 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:37:25.691976 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687191 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:37:25.691976 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687195 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:37:25.691976 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687199 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:37:25.691976 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687203 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:37:25.691976 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687207 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:37:25.691976 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687211 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:37:25.691976 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687215 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:37:25.691976 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687219 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:37:25.691976 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687224 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:37:25.691976 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687228 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:37:25.691976 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687231 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:37:25.691976 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687235 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:37:25.691976 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687239 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:37:25.691976 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687244 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:37:25.691976 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687248 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:37:25.691976 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687252 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:37:25.691976 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687256 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:37:25.692516 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687261 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:37:25.692516 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687265 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:37:25.692516 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687269 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:37:25.692516 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687275 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:37:25.692516 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687279 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:37:25.692516 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687286 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:37:25.692516 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687292 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:37:25.692516 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687297 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:37:25.692516 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687302 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:37:25.692516 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687307 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:37:25.692516 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687311 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:37:25.692516 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687315 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:37:25.692516 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687319 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:37:25.692516 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687323 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:37:25.692516 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687328 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:37:25.692516 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687333 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:37:25.692516 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687337 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:37:25.692516 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687341 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:37:25.692516 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687345 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:37:25.692968 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687349 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:37:25.692968 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687353 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:37:25.692968 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687357 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:37:25.692968 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687361 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:37:25.692968 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687366 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:37:25.692968 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687370 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:37:25.692968 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687374 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:37:25.692968 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687377 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:37:25.692968 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687381 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:37:25.692968 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687385 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:37:25.692968 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687389 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:37:25.692968 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687393 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:37:25.692968 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:25.687397 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:37:25.692968 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.687405 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:37:25.692968 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.688090 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 18:37:25.693469 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.690801 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 18:37:25.693469 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.691719 2572 server.go:1019] "Starting client certificate rotation"
Apr 22 18:37:25.693469 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.691820 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:37:25.693469 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.691852 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:37:25.711696 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.711673 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:37:25.714005 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.713988 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:37:25.724720 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.724701 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 22 18:37:25.730258 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.730234 2572 log.go:25] "Validated CRI v1 image API"
Apr 22 18:37:25.732720 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.732704 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 18:37:25.736672 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.736653 2572 fs.go:135] Filesystem UUIDs: map[7987be41-c760-4824-ad70-69e906d2fd31:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 8d0df785-6433-40b0-a5fe-733d26669cad:/dev/nvme0n1p3]
Apr 22 18:37:25.736724 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.736674 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 18:37:25.740671 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.740652 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:37:25.744524 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.744420 2572 manager.go:217] Machine: {Timestamp:2026-04-22 18:37:25.742696907 +0000 UTC m=+0.346850902 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3106266 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec262fce4a3a063672d29681b77eca6d SystemUUID:ec262fce-4a3a-0636-72d2-9681b77eca6d BootID:83bdfea8-1cf0-4db0-ab6c-d6a0ff8199f7 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f1:8a:46:05:5b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f1:8a:46:05:5b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:aa:38:1a:cf:5d:3a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 18:37:25.744524 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.744519 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 18:37:25.744631 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.744589 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 18:37:25.746167 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.746142 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 18:37:25.746294 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.746170 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-26.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 18:37:25.746340 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.746303 2572 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 18:37:25.746340 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.746311 2572 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 18:37:25.746340 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.746324 2572 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:37:25.746966 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.746956 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:37:25.748308 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.748298 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:37:25.748417 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.748408 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:37:25.750405 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.750396 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:37:25.750436 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.750409 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:37:25.750436 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.750420 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:37:25.750436 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.750428 2572 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:37:25.750521 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.750439 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 18:37:25.751353 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.751342 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:37:25.751391 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.751360 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:37:25.753726 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.753706 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:37:25.755411 ip-10-0-129-26 
kubenswrapper[2572]: I0422 18:37:25.755399 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:37:25.756538 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.756525 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:37:25.756570 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.756548 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:37:25.756570 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.756558 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:37:25.756570 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.756567 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:37:25.756675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.756574 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:37:25.756675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.756583 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:37:25.756675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.756593 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:37:25.756675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.756601 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:37:25.756675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.756609 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:37:25.756675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.756619 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:37:25.756675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.756628 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:37:25.756675 
ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.756637 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:37:25.757319 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.757310 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:37:25.757319 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.757319 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:37:25.760642 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.760629 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:37:25.760818 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.760806 2572 server.go:1295] "Started kubelet" Apr 22 18:37:25.760971 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.760851 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:37:25.761246 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.761039 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:37:25.761455 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.761432 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:37:25.761758 ip-10-0-129-26 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 18:37:25.763114 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.763081 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:37:25.763441 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.763425 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-26.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:37:25.763532 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:25.763510 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-26.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:37:25.763581 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.763537 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:37:25.763652 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:25.763529 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:37:25.767964 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.767946 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:37:25.768496 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.768478 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:37:25.768600 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:25.768508 2572 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:37:25.768600 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:25.767718 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-26.ec2.internal.18a8c1b73dfa44b0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-26.ec2.internal,UID:ip-10-0-129-26.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-26.ec2.internal,},FirstTimestamp:2026-04-22 18:37:25.760648368 +0000 UTC m=+0.364802363,LastTimestamp:2026-04-22 18:37:25.760648368 +0000 UTC m=+0.364802363,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-26.ec2.internal,}" Apr 22 18:37:25.769057 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.769039 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:37:25.769057 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.769043 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:37:25.769183 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.769067 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:37:25.769251 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.769239 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:37:25.769298 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.769268 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:37:25.769432 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:25.769412 2572 kubelet_node_status.go:515] "Error getting the current node from 
lister" err="node \"ip-10-0-129-26.ec2.internal\" not found" Apr 22 18:37:25.769916 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.769902 2572 factory.go:55] Registering systemd factory Apr 22 18:37:25.769965 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.769943 2572 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:37:25.770174 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.770163 2572 factory.go:153] Registering CRI-O factory Apr 22 18:37:25.770211 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.770177 2572 factory.go:223] Registration of the crio container factory successfully Apr 22 18:37:25.770258 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.770217 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:37:25.770258 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.770233 2572 factory.go:103] Registering Raw factory Apr 22 18:37:25.770258 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.770242 2572 manager.go:1196] Started watching for new ooms in manager Apr 22 18:37:25.771221 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.771206 2572 manager.go:319] Starting recovery of all containers Apr 22 18:37:25.775721 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:25.775692 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 18:37:25.775881 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:25.775841 2572 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-26.ec2.internal\" is 
forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 18:37:25.784072 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.783925 2572 manager.go:324] Recovery completed Apr 22 18:37:25.788224 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.788212 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:37:25.790548 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.790525 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-26.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:37:25.790600 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.790560 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-26.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:37:25.790600 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.790571 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-26.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:37:25.790972 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.790959 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:37:25.790972 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.790970 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:37:25.791049 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.790984 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:37:25.792642 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:25.792552 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-26.ec2.internal.18a8c1b73fc281aa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-26.ec2.internal,UID:ip-10-0-129-26.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-26.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-26.ec2.internal,},FirstTimestamp:2026-04-22 18:37:25.790548394 +0000 UTC m=+0.394702400,LastTimestamp:2026-04-22 18:37:25.790548394 +0000 UTC m=+0.394702400,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-26.ec2.internal,}" Apr 22 18:37:25.792715 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.792700 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9k4nt" Apr 22 18:37:25.793149 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.793137 2572 policy_none.go:49] "None policy: Start" Apr 22 18:37:25.793207 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.793156 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:37:25.793207 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.793169 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:37:25.798388 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.798371 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9k4nt" Apr 22 18:37:25.834514 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.830295 2572 manager.go:341] "Starting Device Plugin manager" Apr 22 18:37:25.834514 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:25.830321 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:37:25.834514 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.830330 2572 server.go:85] "Starting device plugin registration server" 
Apr 22 18:37:25.834514 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.830531 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:37:25.834514 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.830544 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:37:25.834514 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.830627 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:37:25.834514 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.830711 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:37:25.834514 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.830721 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:37:25.834514 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:25.831342 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 18:37:25.834514 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:25.831382 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-26.ec2.internal\" not found" Apr 22 18:37:25.861996 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.861968 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:37:25.863266 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.863253 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:37:25.863327 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.863277 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:37:25.863327 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.863292 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 18:37:25.863327 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.863298 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:37:25.863327 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:25.863325 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:37:25.865564 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.865548 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:37:25.931650 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.931606 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:37:25.933375 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.933362 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-26.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:37:25.933442 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.933388 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-26.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:37:25.933442 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.933399 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-26.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:37:25.933442 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.933422 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-26.ec2.internal" Apr 22 18:37:25.942247 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.942231 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-26.ec2.internal" Apr 22 18:37:25.942299 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:25.942250 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-26.ec2.internal\": node \"ip-10-0-129-26.ec2.internal\" not found" Apr 22 18:37:25.964291 
ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.964256 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-26.ec2.internal"] Apr 22 18:37:25.964367 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.964329 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:37:25.965661 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.965648 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-26.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:37:25.965718 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.965674 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-26.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:37:25.965718 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.965683 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-26.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:37:25.966722 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.966709 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:37:25.966937 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.966924 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal" Apr 22 18:37:25.966980 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.966976 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:37:25.967389 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.967369 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-26.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:37:25.967389 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.967389 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-26.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:37:25.967528 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.967402 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-26.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:37:25.967612 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.967600 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-26.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:37:25.967649 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.967625 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-26.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:37:25.967649 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.967637 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-26.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:37:25.968382 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.968369 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-26.ec2.internal" Apr 22 18:37:25.968427 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.968399 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:37:25.969418 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.969403 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-26.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:37:25.969515 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.969427 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-26.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:37:25.969515 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:25.969440 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-26.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:37:25.977041 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:25.977026 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-26.ec2.internal\" not found" Apr 22 18:37:25.996729 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:25.996710 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-26.ec2.internal\" not found" node="ip-10-0-129-26.ec2.internal" Apr 22 18:37:26.000866 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:26.000852 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-26.ec2.internal\" not found" node="ip-10-0-129-26.ec2.internal" Apr 22 18:37:26.070012 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.069991 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/32c72d2e359afc7dfc63c772815179d7-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal\" (UID: \"32c72d2e359afc7dfc63c772815179d7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal" Apr 22 18:37:26.070093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.070028 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32c72d2e359afc7dfc63c772815179d7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal\" (UID: \"32c72d2e359afc7dfc63c772815179d7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal" Apr 22 18:37:26.070093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.070067 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/172d194e4a77f6eb5607ce1c62ccac15-config\") pod \"kube-apiserver-proxy-ip-10-0-129-26.ec2.internal\" (UID: \"172d194e4a77f6eb5607ce1c62ccac15\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-26.ec2.internal" Apr 22 18:37:26.077410 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:26.077396 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-26.ec2.internal\" not found" Apr 22 18:37:26.170277 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.170253 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/32c72d2e359afc7dfc63c772815179d7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal\" (UID: \"32c72d2e359afc7dfc63c772815179d7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal" Apr 22 18:37:26.170277 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.170284 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/32c72d2e359afc7dfc63c772815179d7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal\" (UID: \"32c72d2e359afc7dfc63c772815179d7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal" Apr 22 18:37:26.170392 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.170301 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/172d194e4a77f6eb5607ce1c62ccac15-config\") pod \"kube-apiserver-proxy-ip-10-0-129-26.ec2.internal\" (UID: \"172d194e4a77f6eb5607ce1c62ccac15\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-26.ec2.internal" Apr 22 18:37:26.170392 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.170324 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/172d194e4a77f6eb5607ce1c62ccac15-config\") pod \"kube-apiserver-proxy-ip-10-0-129-26.ec2.internal\" (UID: \"172d194e4a77f6eb5607ce1c62ccac15\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-26.ec2.internal" Apr 22 18:37:26.170392 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.170338 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/32c72d2e359afc7dfc63c772815179d7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal\" (UID: \"32c72d2e359afc7dfc63c772815179d7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal" Apr 22 18:37:26.170392 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.170349 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32c72d2e359afc7dfc63c772815179d7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal\" (UID: \"32c72d2e359afc7dfc63c772815179d7\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal" Apr 22 18:37:26.178354 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:26.178339 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-26.ec2.internal\" not found" Apr 22 18:37:26.279182 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:26.279138 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-26.ec2.internal\" not found" Apr 22 18:37:26.298312 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.298297 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal" Apr 22 18:37:26.302641 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.302623 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-26.ec2.internal" Apr 22 18:37:26.380255 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:26.380228 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-26.ec2.internal\" not found" Apr 22 18:37:26.480706 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:26.480678 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-26.ec2.internal\" not found" Apr 22 18:37:26.581247 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:26.581228 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-26.ec2.internal\" not found" Apr 22 18:37:26.681822 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:26.681799 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-26.ec2.internal\" not found" Apr 22 18:37:26.691965 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.691946 2572 transport.go:147] "Certificate rotation detected, shutting down client connections 
to start using new credentials" Apr 22 18:37:26.692112 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.692096 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:37:26.768211 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.768181 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 18:37:26.782211 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:26.782184 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-26.ec2.internal\" not found" Apr 22 18:37:26.784832 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.784805 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:37:26.800007 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.799954 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:32:25 +0000 UTC" deadline="2027-10-13 00:45:44.952146186 +0000 UTC" Apr 22 18:37:26.800110 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.800011 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12918h8m18.152142085s" Apr 22 18:37:26.809364 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.809346 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-4lt4z" Apr 22 18:37:26.812574 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.812557 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" 
reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:37:26.817932 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.817915 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-4lt4z" Apr 22 18:37:26.882599 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:26.882517 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-26.ec2.internal\" not found" Apr 22 18:37:26.903156 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:26.903116 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32c72d2e359afc7dfc63c772815179d7.slice/crio-d0dd74c4d31c1968454b36b23f1b4d589f24eda4ca6aba17574e8e350792a894 WatchSource:0}: Error finding container d0dd74c4d31c1968454b36b23f1b4d589f24eda4ca6aba17574e8e350792a894: Status 404 returned error can't find the container with id d0dd74c4d31c1968454b36b23f1b4d589f24eda4ca6aba17574e8e350792a894 Apr 22 18:37:26.903463 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:26.903442 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod172d194e4a77f6eb5607ce1c62ccac15.slice/crio-9a3a7c47577ebe78e0dee67424960a3e51f66ecd1d024616ffc83e34eccdc797 WatchSource:0}: Error finding container 9a3a7c47577ebe78e0dee67424960a3e51f66ecd1d024616ffc83e34eccdc797: Status 404 returned error can't find the container with id 9a3a7c47577ebe78e0dee67424960a3e51f66ecd1d024616ffc83e34eccdc797 Apr 22 18:37:26.907369 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:26.907355 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:37:26.983559 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:26.983517 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-26.ec2.internal\" not found" Apr 22 18:37:27.084046 ip-10-0-129-26 
kubenswrapper[2572]: E0422 18:37:27.084010 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-26.ec2.internal\" not found" Apr 22 18:37:27.087631 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.087616 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:37:27.169782 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.169718 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal" Apr 22 18:37:27.186043 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.186020 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:37:27.187490 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.187475 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-26.ec2.internal" Apr 22 18:37:27.193763 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.193747 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:37:27.345985 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.345955 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:37:27.751612 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.751583 2572 apiserver.go:52] "Watching apiserver" Apr 22 18:37:27.758461 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.758436 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 18:37:27.758851 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.758822 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-dns/node-resolver-7svrr","openshift-image-registry/node-ca-nx5jx","openshift-multus/multus-f8f9s","openshift-network-diagnostics/network-check-target-4fnlv","openshift-network-operator/iptables-alerter-mzkcf","kube-system/kube-apiserver-proxy-ip-10-0-129-26.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq","openshift-cluster-node-tuning-operator/tuned-ksw4k","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal","openshift-multus/multus-additional-cni-plugins-9md8j","openshift-multus/network-metrics-daemon-9cw24","openshift-ovn-kubernetes/ovnkube-node-gwnpr","kube-system/global-pull-secret-syncer-k99dx","kube-system/konnectivity-agent-vtzpc"] Apr 22 18:37:27.761993 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.761972 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.762975 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.762958 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.763939 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.763918 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4fnlv" Apr 22 18:37:27.764031 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:27.763986 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4fnlv" podUID="2bfdae94-a7e1-48cd-8cad-b09cbb620c26" Apr 22 18:37:27.764370 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.764339 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-zn9tj\"" Apr 22 18:37:27.764632 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.764347 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 18:37:27.764804 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.764461 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 18:37:27.764870 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.764462 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 18:37:27.765076 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.765058 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:37:27.765296 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.765281 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 18:37:27.765370 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.765305 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-kb59n\"" Apr 22 18:37:27.769281 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.766035 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-mzkcf" Apr 22 18:37:27.769281 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.766426 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7svrr" Apr 22 18:37:27.769281 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.768154 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:37:27.769281 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.768246 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.769281 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.768918 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:37:27.775666 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.775645 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nx5jx" Apr 22 18:37:27.775768 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.775679 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.778158 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.777356 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9cw24" Apr 22 18:37:27.778158 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:27.777449 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9cw24" podUID="3445bb02-600f-40d2-95c6-d4616e5630c5" Apr 22 18:37:27.778158 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.777982 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-r2fpj\"" Apr 22 18:37:27.778158 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.778096 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 18:37:27.778953 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.778905 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7f7j\" (UniqueName: \"kubernetes.io/projected/d748c1b5-2936-4573-b185-b6e9da61d104-kube-api-access-z7f7j\") pod \"node-ca-nx5jx\" (UID: \"d748c1b5-2936-4573-b185-b6e9da61d104\") " pod="openshift-image-registry/node-ca-nx5jx" Apr 22 18:37:27.779034 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.778970 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5acb96e2-7093-4837-af37-db4c13f5ea38-sys-fs\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.779034 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.779006 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 18:37:27.779106 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.779098 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-pchrd\"" Apr 22 18:37:27.779206 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.779190 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 18:37:27.779244 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.779008 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b646830-7c62-445d-8fa2-df72174ec0c3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.779293 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.779272 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-etc-modprobe-d\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.779327 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.779305 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/02a11437-8f2d-4ead-9089-d300b113a946-tmp\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.779406 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.779326 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-multus-socket-dir-parent\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.779406 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.779359 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-host-var-lib-cni-multus\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.779478 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.779402 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b646830-7c62-445d-8fa2-df72174ec0c3-system-cni-dir\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.779514 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.779495 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5acb96e2-7093-4837-af37-db4c13f5ea38-device-dir\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.779547 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.779528 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-sys\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.779584 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.779559 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rmbl\" (UniqueName: \"kubernetes.io/projected/02a11437-8f2d-4ead-9089-d300b113a946-kube-api-access-2rmbl\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.779584 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.779559 
2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-zjrcb\"" Apr 22 18:37:27.779648 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.779616 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjhjw\" (UniqueName: \"kubernetes.io/projected/482a486e-af9c-4b33-8137-5fbd716f3fc2-kube-api-access-kjhjw\") pod \"node-resolver-7svrr\" (UID: \"482a486e-af9c-4b33-8137-5fbd716f3fc2\") " pod="openshift-dns/node-resolver-7svrr" Apr 22 18:37:27.779689 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.779657 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 18:37:27.779837 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.779823 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.779874 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.779862 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 18:37:27.779944 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.779661 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-cnibin\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.780013 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.779999 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-multus-conf-dir\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.780050 
ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.780036 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-etc-systemd\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.781093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.780290 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c12a961-b77e-409b-8712-88752c895e8c-host-slash\") pod \"iptables-alerter-mzkcf\" (UID: \"0c12a961-b77e-409b-8712-88752c895e8c\") " pod="openshift-network-operator/iptables-alerter-mzkcf" Apr 22 18:37:27.781093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.780346 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 18:37:27.781093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.780429 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:37:27.781093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.780338 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-system-cni-dir\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.781093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.780474 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-host-var-lib-kubelet\") pod \"multus-f8f9s\" (UID: 
\"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.781093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.780523 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ecd4872f-6321-4b86-9954-d839a3dff00d-multus-daemon-config\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.781093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.780561 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92lfc\" (UniqueName: \"kubernetes.io/projected/ecd4872f-6321-4b86-9954-d839a3dff00d-kube-api-access-92lfc\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.781093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.780593 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-host\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.781093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.780674 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/482a486e-af9c-4b33-8137-5fbd716f3fc2-tmp-dir\") pod \"node-resolver-7svrr\" (UID: \"482a486e-af9c-4b33-8137-5fbd716f3fc2\") " pod="openshift-dns/node-resolver-7svrr" Apr 22 18:37:27.781093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.780705 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/ecd4872f-6321-4b86-9954-d839a3dff00d-cni-binary-copy\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.781093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.780747 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:37:27.781093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.780752 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b646830-7c62-445d-8fa2-df72174ec0c3-cnibin\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.781093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.780805 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s82jd\" (UniqueName: \"kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd\") pod \"network-check-target-4fnlv\" (UID: \"2bfdae94-a7e1-48cd-8cad-b09cbb620c26\") " pod="openshift-network-diagnostics/network-check-target-4fnlv" Apr 22 18:37:27.781093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.780840 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5acb96e2-7093-4837-af37-db4c13f5ea38-registration-dir\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.781093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.780901 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-etc-kubernetes\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.781093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.780915 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zclmc\"" Apr 22 18:37:27.781093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.780925 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-multus-cni-dir\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.781093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.780958 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-etc-sysconfig\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.781093 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.781019 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-lib-modules\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.782456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.781030 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 18:37:27.782456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.781151 2572 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 18:37:27.782456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.781231 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn6q6\" (UniqueName: \"kubernetes.io/projected/0c12a961-b77e-409b-8712-88752c895e8c-kube-api-access-sn6q6\") pod \"iptables-alerter-mzkcf\" (UID: \"0c12a961-b77e-409b-8712-88752c895e8c\") " pod="openshift-network-operator/iptables-alerter-mzkcf" Apr 22 18:37:27.782456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.780748 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pjkgq\"" Apr 22 18:37:27.782456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.781284 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b646830-7c62-445d-8fa2-df72174ec0c3-os-release\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.782456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.781340 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-var-lib-kubelet\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.782456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.781383 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b646830-7c62-445d-8fa2-df72174ec0c3-cni-binary-copy\") pod \"multus-additional-cni-plugins-9md8j\" (UID: 
\"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.782456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.781432 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5b646830-7c62-445d-8fa2-df72174ec0c3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.782456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.781462 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5acb96e2-7093-4837-af37-db4c13f5ea38-etc-selinux\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.782456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.781464 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 18:37:27.782456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.781490 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-etc-sysctl-d\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.782456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.781521 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/02a11437-8f2d-4ead-9089-d300b113a946-etc-tuned\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.782456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.781579 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-etc-kubernetes\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.782456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.781655 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 18:37:27.782456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.781882 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5acb96e2-7093-4837-af37-db4c13f5ea38-kubelet-dir\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.782456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.781919 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-run\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.782456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.781981 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-os-release\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.782456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.782038 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-host-run-netns\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.782456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.782221 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k99dx" Apr 22 18:37:27.783349 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:27.782627 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k99dx" podUID="af564c1a-d59c-4ed2-ae28-bef2ffad4877" Apr 22 18:37:27.783349 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.783096 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-host-var-lib-cni-bin\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.783349 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.783179 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 18:37:27.783505 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.783367 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-hostroot\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " 
pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.783505 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.783419 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-host-run-k8s-cni-cncf-io\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.783505 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.783461 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d748c1b5-2936-4573-b185-b6e9da61d104-serviceca\") pod \"node-ca-nx5jx\" (UID: \"d748c1b5-2936-4573-b185-b6e9da61d104\") " pod="openshift-image-registry/node-ca-nx5jx" Apr 22 18:37:27.783505 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.783486 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5acb96e2-7093-4837-af37-db4c13f5ea38-socket-dir\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.783505 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.783490 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 18:37:27.783733 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.783579 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2wpvl\"" Apr 22 18:37:27.783733 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.783665 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj9vg\" (UniqueName: 
\"kubernetes.io/projected/5acb96e2-7093-4837-af37-db4c13f5ea38-kube-api-access-rj9vg\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.783733 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.783691 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-etc-sysctl-conf\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.783733 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.783730 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 18:37:27.783927 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.783735 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/482a486e-af9c-4b33-8137-5fbd716f3fc2-hosts-file\") pod \"node-resolver-7svrr\" (UID: \"482a486e-af9c-4b33-8137-5fbd716f3fc2\") " pod="openshift-dns/node-resolver-7svrr" Apr 22 18:37:27.783927 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.783762 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5b646830-7c62-445d-8fa2-df72174ec0c3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.783927 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.783789 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfmvz\" (UniqueName: 
\"kubernetes.io/projected/5b646830-7c62-445d-8fa2-df72174ec0c3-kube-api-access-vfmvz\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.783927 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.783815 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-host-run-multus-certs\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.783927 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.783862 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d748c1b5-2936-4573-b185-b6e9da61d104-host\") pod \"node-ca-nx5jx\" (UID: \"d748c1b5-2936-4573-b185-b6e9da61d104\") " pod="openshift-image-registry/node-ca-nx5jx" Apr 22 18:37:27.783927 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.783870 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-vtzpc" Apr 22 18:37:27.783927 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.783887 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0c12a961-b77e-409b-8712-88752c895e8c-iptables-alerter-script\") pod \"iptables-alerter-mzkcf\" (UID: \"0c12a961-b77e-409b-8712-88752c895e8c\") " pod="openshift-network-operator/iptables-alerter-mzkcf" Apr 22 18:37:27.786192 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.785866 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 18:37:27.786192 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.786086 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 18:37:27.786309 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.786267 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 18:37:27.786546 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.786532 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dmt8l\"" Apr 22 18:37:27.786774 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.786757 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 18:37:27.786774 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.786773 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 18:37:27.818604 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.818577 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 
18:32:26 +0000 UTC" deadline="2027-12-08 21:19:21.767011851 +0000 UTC" Apr 22 18:37:27.818690 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.818604 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14282h41m53.948411685s" Apr 22 18:37:27.868202 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.868155 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-26.ec2.internal" event={"ID":"172d194e4a77f6eb5607ce1c62ccac15","Type":"ContainerStarted","Data":"9a3a7c47577ebe78e0dee67424960a3e51f66ecd1d024616ffc83e34eccdc797"} Apr 22 18:37:27.869395 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.869366 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal" event={"ID":"32c72d2e359afc7dfc63c772815179d7","Type":"ContainerStarted","Data":"d0dd74c4d31c1968454b36b23f1b4d589f24eda4ca6aba17574e8e350792a894"} Apr 22 18:37:27.869996 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.869955 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:37:27.884128 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884105 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-multus-socket-dir-parent\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.884241 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884151 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-host-var-lib-cni-multus\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " 
pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.884241 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884180 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b646830-7c62-445d-8fa2-df72174ec0c3-system-cni-dir\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.884241 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5acb96e2-7093-4837-af37-db4c13f5ea38-device-dir\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.884241 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884224 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-sys\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.884413 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rmbl\" (UniqueName: \"kubernetes.io/projected/02a11437-8f2d-4ead-9089-d300b113a946-kube-api-access-2rmbl\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.884413 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884261 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-host-var-lib-cni-multus\") pod \"multus-f8f9s\" (UID: 
\"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.884413 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884278 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-multus-socket-dir-parent\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.884413 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884288 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-kubelet\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.884413 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884317 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjhjw\" (UniqueName: \"kubernetes.io/projected/482a486e-af9c-4b33-8137-5fbd716f3fc2-kube-api-access-kjhjw\") pod \"node-resolver-7svrr\" (UID: \"482a486e-af9c-4b33-8137-5fbd716f3fc2\") " pod="openshift-dns/node-resolver-7svrr" Apr 22 18:37:27.884413 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884341 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-cnibin\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.884413 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884343 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-sys\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.884413 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884383 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b646830-7c62-445d-8fa2-df72174ec0c3-system-cni-dir\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.884413 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884399 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-cnibin\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.884770 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884426 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5acb96e2-7093-4837-af37-db4c13f5ea38-device-dir\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.884770 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884519 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-node-log\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.884770 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884554 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfz9s\" (UniqueName: \"kubernetes.io/projected/3445bb02-600f-40d2-95c6-d4616e5630c5-kube-api-access-lfz9s\") pod 
\"network-metrics-daemon-9cw24\" (UID: \"3445bb02-600f-40d2-95c6-d4616e5630c5\") " pod="openshift-multus/network-metrics-daemon-9cw24" Apr 22 18:37:27.884770 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884597 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-multus-conf-dir\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.884770 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884623 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-etc-systemd\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.884770 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884649 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-etc-openvswitch\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.884770 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884696 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-multus-conf-dir\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.884770 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884713 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-etc-systemd\") pod \"tuned-ksw4k\" 
(UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.884770 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884740 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f31d871d-61d4-4d54-a317-0264d5e9f432-ovn-node-metrics-cert\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.885216 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884772 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c12a961-b77e-409b-8712-88752c895e8c-host-slash\") pod \"iptables-alerter-mzkcf\" (UID: \"0c12a961-b77e-409b-8712-88752c895e8c\") " pod="openshift-network-operator/iptables-alerter-mzkcf" Apr 22 18:37:27.885216 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884796 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-system-cni-dir\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.885216 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884820 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-host-var-lib-kubelet\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.885216 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884843 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/ecd4872f-6321-4b86-9954-d839a3dff00d-multus-daemon-config\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.885216 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884858 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c12a961-b77e-409b-8712-88752c895e8c-host-slash\") pod \"iptables-alerter-mzkcf\" (UID: \"0c12a961-b77e-409b-8712-88752c895e8c\") " pod="openshift-network-operator/iptables-alerter-mzkcf" Apr 22 18:37:27.885216 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884867 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-system-cni-dir\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.885216 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884869 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92lfc\" (UniqueName: \"kubernetes.io/projected/ecd4872f-6321-4b86-9954-d839a3dff00d-kube-api-access-92lfc\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.885216 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884904 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-host\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.885216 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884918 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-host-var-lib-kubelet\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.885216 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884931 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.885216 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884957 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwfrk\" (UniqueName: \"kubernetes.io/projected/f31d871d-61d4-4d54-a317-0264d5e9f432-kube-api-access-gwfrk\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.885216 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884965 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-host\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.885216 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.884983 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/482a486e-af9c-4b33-8137-5fbd716f3fc2-tmp-dir\") pod \"node-resolver-7svrr\" (UID: \"482a486e-af9c-4b33-8137-5fbd716f3fc2\") " pod="openshift-dns/node-resolver-7svrr" Apr 22 18:37:27.885216 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.885007 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ecd4872f-6321-4b86-9954-d839a3dff00d-cni-binary-copy\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.885216 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.885031 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b646830-7c62-445d-8fa2-df72174ec0c3-cnibin\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.885216 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.885055 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s82jd\" (UniqueName: \"kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd\") pod \"network-check-target-4fnlv\" (UID: \"2bfdae94-a7e1-48cd-8cad-b09cbb620c26\") " pod="openshift-network-diagnostics/network-check-target-4fnlv" Apr 22 18:37:27.885216 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.885081 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5acb96e2-7093-4837-af37-db4c13f5ea38-registration-dir\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.885978 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.885106 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-etc-kubernetes\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.885978 ip-10-0-129-26 kubenswrapper[2572]: I0422 
18:37:27.885153 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/892816cf-4bb8-447c-8d94-74a892f6bd4c-konnectivity-ca\") pod \"konnectivity-agent-vtzpc\" (UID: \"892816cf-4bb8-447c-8d94-74a892f6bd4c\") " pod="kube-system/konnectivity-agent-vtzpc" Apr 22 18:37:27.885978 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.885169 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b646830-7c62-445d-8fa2-df72174ec0c3-cnibin\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.885978 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.885203 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f31d871d-61d4-4d54-a317-0264d5e9f432-ovnkube-config\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.885978 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.885238 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-etc-kubernetes\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.885978 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.885457 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/482a486e-af9c-4b33-8137-5fbd716f3fc2-tmp-dir\") pod \"node-resolver-7svrr\" (UID: \"482a486e-af9c-4b33-8137-5fbd716f3fc2\") " pod="openshift-dns/node-resolver-7svrr" Apr 22 18:37:27.885978 ip-10-0-129-26 
kubenswrapper[2572]: I0422 18:37:27.885640 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ecd4872f-6321-4b86-9954-d839a3dff00d-multus-daemon-config\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.885978 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.885731 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5acb96e2-7093-4837-af37-db4c13f5ea38-registration-dir\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.885978 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.885731 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ecd4872f-6321-4b86-9954-d839a3dff00d-cni-binary-copy\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.885978 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.885801 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-multus-cni-dir\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.885978 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.885838 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-etc-sysconfig\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.885978 ip-10-0-129-26 kubenswrapper[2572]: I0422 
18:37:27.885854 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-multus-cni-dir\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.885978 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.885864 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-lib-modules\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.885978 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.885907 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f31d871d-61d4-4d54-a317-0264d5e9f432-env-overrides\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.885978 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.885921 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-etc-sysconfig\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.885978 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.885935 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f31d871d-61d4-4d54-a317-0264d5e9f432-ovnkube-script-lib\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.885978 ip-10-0-129-26 
kubenswrapper[2572]: I0422 18:37:27.885961 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/af564c1a-d59c-4ed2-ae28-bef2ffad4877-kubelet-config\") pod \"global-pull-secret-syncer-k99dx\" (UID: \"af564c1a-d59c-4ed2-ae28-bef2ffad4877\") " pod="kube-system/global-pull-secret-syncer-k99dx" Apr 22 18:37:27.886765 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.885992 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn6q6\" (UniqueName: \"kubernetes.io/projected/0c12a961-b77e-409b-8712-88752c895e8c-kube-api-access-sn6q6\") pod \"iptables-alerter-mzkcf\" (UID: \"0c12a961-b77e-409b-8712-88752c895e8c\") " pod="openshift-network-operator/iptables-alerter-mzkcf" Apr 22 18:37:27.886765 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886020 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b646830-7c62-445d-8fa2-df72174ec0c3-os-release\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.886765 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886030 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-lib-modules\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.886765 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886045 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-var-lib-kubelet\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.886765 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886073 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-run-ovn-kubernetes\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.886765 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886091 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b646830-7c62-445d-8fa2-df72174ec0c3-os-release\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.886765 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886103 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/892816cf-4bb8-447c-8d94-74a892f6bd4c-agent-certs\") pod \"konnectivity-agent-vtzpc\" (UID: \"892816cf-4bb8-447c-8d94-74a892f6bd4c\") " pod="kube-system/konnectivity-agent-vtzpc" Apr 22 18:37:27.886765 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886146 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b646830-7c62-445d-8fa2-df72174ec0c3-cni-binary-copy\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.886765 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886167 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-var-lib-kubelet\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.886765 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886172 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5b646830-7c62-445d-8fa2-df72174ec0c3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.886765 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886195 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5acb96e2-7093-4837-af37-db4c13f5ea38-etc-selinux\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.886765 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886230 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-etc-sysctl-d\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.886765 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886248 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/02a11437-8f2d-4ead-9089-d300b113a946-etc-tuned\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.886765 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886263 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-systemd-units\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.886765 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886277 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-slash\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.886765 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886290 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-log-socket\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.886765 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886310 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-etc-kubernetes\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.887607 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886341 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5acb96e2-7093-4837-af37-db4c13f5ea38-kubelet-dir\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.887607 ip-10-0-129-26 kubenswrapper[2572]: 
I0422 18:37:27.886355 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-run\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.887607 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886370 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-run-ovn\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.887607 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886390 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/af564c1a-d59c-4ed2-ae28-bef2ffad4877-dbus\") pod \"global-pull-secret-syncer-k99dx\" (UID: \"af564c1a-d59c-4ed2-ae28-bef2ffad4877\") " pod="kube-system/global-pull-secret-syncer-k99dx" Apr 22 18:37:27.887607 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886392 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-etc-sysctl-d\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.887607 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886404 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-os-release\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.887607 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886422 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-host-run-netns\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.887607 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886436 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-host-var-lib-cni-bin\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.887607 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886450 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-hostroot\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.887607 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886472 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-host-run-k8s-cni-cncf-io\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.887607 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886469 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5acb96e2-7093-4837-af37-db4c13f5ea38-etc-selinux\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.887607 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886501 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-run\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.887607 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886523 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d748c1b5-2936-4573-b185-b6e9da61d104-serviceca\") pod \"node-ca-nx5jx\" (UID: \"d748c1b5-2936-4573-b185-b6e9da61d104\") " pod="openshift-image-registry/node-ca-nx5jx" Apr 22 18:37:27.887607 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886534 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-host-run-netns\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.887607 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886571 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5acb96e2-7093-4837-af37-db4c13f5ea38-socket-dir\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.887607 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886591 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rj9vg\" (UniqueName: \"kubernetes.io/projected/5acb96e2-7093-4837-af37-db4c13f5ea38-kube-api-access-rj9vg\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.887607 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886587 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-host-var-lib-cni-bin\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.887607 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886610 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-etc-sysctl-conf\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.888246 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886627 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-run-netns\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.888246 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886643 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-var-lib-openvswitch\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.888246 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886659 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b646830-7c62-445d-8fa2-df72174ec0c3-cni-binary-copy\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.888246 ip-10-0-129-26 
kubenswrapper[2572]: I0422 18:37:27.886666 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5b646830-7c62-445d-8fa2-df72174ec0c3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.888246 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886628 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-hostroot\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.888246 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886676 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/482a486e-af9c-4b33-8137-5fbd716f3fc2-hosts-file\") pod \"node-resolver-7svrr\" (UID: \"482a486e-af9c-4b33-8137-5fbd716f3fc2\") " pod="openshift-dns/node-resolver-7svrr" Apr 22 18:37:27.888246 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886737 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-host-run-k8s-cni-cncf-io\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.888246 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886746 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-os-release\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.888246 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886709 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/482a486e-af9c-4b33-8137-5fbd716f3fc2-hosts-file\") pod \"node-resolver-7svrr\" (UID: \"482a486e-af9c-4b33-8137-5fbd716f3fc2\") " pod="openshift-dns/node-resolver-7svrr" Apr 22 18:37:27.888246 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886751 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5b646830-7c62-445d-8fa2-df72174ec0c3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.888246 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfmvz\" (UniqueName: \"kubernetes.io/projected/5b646830-7c62-445d-8fa2-df72174ec0c3-kube-api-access-vfmvz\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.888246 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886871 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:37:27.888246 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886828 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-run-systemd\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.888246 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886919 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-run-openvswitch\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.888246 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886955 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-cni-bin\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.888246 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886963 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5acb96e2-7093-4837-af37-db4c13f5ea38-socket-dir\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.888246 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886971 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs\") pod \"network-metrics-daemon-9cw24\" (UID: \"3445bb02-600f-40d2-95c6-d4616e5630c5\") " pod="openshift-multus/network-metrics-daemon-9cw24" Apr 22 18:37:27.888918 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.887003 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-host-run-multus-certs\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.888918 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.887043 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-host-run-multus-certs\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.888918 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.887051 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d748c1b5-2936-4573-b185-b6e9da61d104-serviceca\") pod \"node-ca-nx5jx\" (UID: \"d748c1b5-2936-4573-b185-b6e9da61d104\") " pod="openshift-image-registry/node-ca-nx5jx" Apr 22 18:37:27.888918 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.887107 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d748c1b5-2936-4573-b185-b6e9da61d104-host\") pod \"node-ca-nx5jx\" (UID: \"d748c1b5-2936-4573-b185-b6e9da61d104\") " pod="openshift-image-registry/node-ca-nx5jx" Apr 22 18:37:27.888918 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.887145 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ecd4872f-6321-4b86-9954-d839a3dff00d-etc-kubernetes\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s" Apr 22 18:37:27.888918 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.887173 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret\") pod \"global-pull-secret-syncer-k99dx\" (UID: \"af564c1a-d59c-4ed2-ae28-bef2ffad4877\") " pod="kube-system/global-pull-secret-syncer-k99dx" Apr 22 18:37:27.888918 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.887183 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5acb96e2-7093-4837-af37-db4c13f5ea38-kubelet-dir\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.888918 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.887195 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0c12a961-b77e-409b-8712-88752c895e8c-iptables-alerter-script\") pod \"iptables-alerter-mzkcf\" (UID: \"0c12a961-b77e-409b-8712-88752c895e8c\") " pod="openshift-network-operator/iptables-alerter-mzkcf" Apr 22 18:37:27.888918 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.887207 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d748c1b5-2936-4573-b185-b6e9da61d104-host\") pod \"node-ca-nx5jx\" (UID: \"d748c1b5-2936-4573-b185-b6e9da61d104\") " pod="openshift-image-registry/node-ca-nx5jx" Apr 22 18:37:27.888918 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.887213 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-z7f7j\" (UniqueName: \"kubernetes.io/projected/d748c1b5-2936-4573-b185-b6e9da61d104-kube-api-access-z7f7j\") pod \"node-ca-nx5jx\" (UID: \"d748c1b5-2936-4573-b185-b6e9da61d104\") " pod="openshift-image-registry/node-ca-nx5jx" Apr 22 18:37:27.888918 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.887235 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5acb96e2-7093-4837-af37-db4c13f5ea38-sys-fs\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.888918 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.887351 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b646830-7c62-445d-8fa2-df72174ec0c3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.888918 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.887366 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-etc-modprobe-d\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.888918 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.887387 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/02a11437-8f2d-4ead-9089-d300b113a946-tmp\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.888918 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.887412 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-cni-netd\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:27.888918 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.887425 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5acb96e2-7093-4837-af37-db4c13f5ea38-sys-fs\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" Apr 22 18:37:27.888918 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.887470 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5b646830-7c62-445d-8fa2-df72174ec0c3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 18:37:27.889717 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.887489 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-etc-modprobe-d\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" Apr 22 18:37:27.889717 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.887565 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b646830-7c62-445d-8fa2-df72174ec0c3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j" Apr 22 
18:37:27.889717 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.886871 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/02a11437-8f2d-4ead-9089-d300b113a946-etc-sysctl-conf\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k"
Apr 22 18:37:27.889717 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.887724 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0c12a961-b77e-409b-8712-88752c895e8c-iptables-alerter-script\") pod \"iptables-alerter-mzkcf\" (UID: \"0c12a961-b77e-409b-8712-88752c895e8c\") " pod="openshift-network-operator/iptables-alerter-mzkcf"
Apr 22 18:37:27.890190 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.890147 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/02a11437-8f2d-4ead-9089-d300b113a946-etc-tuned\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k"
Apr 22 18:37:27.890272 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.890234 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/02a11437-8f2d-4ead-9089-d300b113a946-tmp\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k"
Apr 22 18:37:27.891183 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:27.891106 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:37:27.891183 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:27.891151 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:37:27.891183 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:27.891165 2572 projected.go:194] Error preparing data for projected volume kube-api-access-s82jd for pod openshift-network-diagnostics/network-check-target-4fnlv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:27.891604 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:27.891253 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd podName:2bfdae94-a7e1-48cd-8cad-b09cbb620c26 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:28.391224392 +0000 UTC m=+2.995378398 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s82jd" (UniqueName: "kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd") pod "network-check-target-4fnlv" (UID: "2bfdae94-a7e1-48cd-8cad-b09cbb620c26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:27.893532 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.893464 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjhjw\" (UniqueName: \"kubernetes.io/projected/482a486e-af9c-4b33-8137-5fbd716f3fc2-kube-api-access-kjhjw\") pod \"node-resolver-7svrr\" (UID: \"482a486e-af9c-4b33-8137-5fbd716f3fc2\") " pod="openshift-dns/node-resolver-7svrr"
Apr 22 18:37:27.900149 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.898740 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92lfc\" (UniqueName: \"kubernetes.io/projected/ecd4872f-6321-4b86-9954-d839a3dff00d-kube-api-access-92lfc\") pod \"multus-f8f9s\" (UID: \"ecd4872f-6321-4b86-9954-d839a3dff00d\") " pod="openshift-multus/multus-f8f9s"
Apr 22 18:37:27.900149 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.899179 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfmvz\" (UniqueName: \"kubernetes.io/projected/5b646830-7c62-445d-8fa2-df72174ec0c3-kube-api-access-vfmvz\") pod \"multus-additional-cni-plugins-9md8j\" (UID: \"5b646830-7c62-445d-8fa2-df72174ec0c3\") " pod="openshift-multus/multus-additional-cni-plugins-9md8j"
Apr 22 18:37:27.900149 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.899575 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7f7j\" (UniqueName: \"kubernetes.io/projected/d748c1b5-2936-4573-b185-b6e9da61d104-kube-api-access-z7f7j\") pod \"node-ca-nx5jx\" (UID: \"d748c1b5-2936-4573-b185-b6e9da61d104\") " pod="openshift-image-registry/node-ca-nx5jx"
Apr 22 18:37:27.900149 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.900019 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj9vg\" (UniqueName: \"kubernetes.io/projected/5acb96e2-7093-4837-af37-db4c13f5ea38-kube-api-access-rj9vg\") pod \"aws-ebs-csi-driver-node-g4rmq\" (UID: \"5acb96e2-7093-4837-af37-db4c13f5ea38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq"
Apr 22 18:37:27.900436 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.900397 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn6q6\" (UniqueName: \"kubernetes.io/projected/0c12a961-b77e-409b-8712-88752c895e8c-kube-api-access-sn6q6\") pod \"iptables-alerter-mzkcf\" (UID: \"0c12a961-b77e-409b-8712-88752c895e8c\") " pod="openshift-network-operator/iptables-alerter-mzkcf"
Apr 22 18:37:27.904082 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.904035 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rmbl\" (UniqueName: \"kubernetes.io/projected/02a11437-8f2d-4ead-9089-d300b113a946-kube-api-access-2rmbl\") pod \"tuned-ksw4k\" (UID: \"02a11437-8f2d-4ead-9089-d300b113a946\") " pod="openshift-cluster-node-tuning-operator/tuned-ksw4k"
Apr 22 18:37:27.988079 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988046 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-run-netns\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.988271 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988089 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-var-lib-openvswitch\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.988271 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988118 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-run-systemd\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.988271 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988158 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-run-openvswitch\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.988271 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988182 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-cni-bin\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.988271 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988183 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-run-netns\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.988271 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988183 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-run-systemd\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.988271 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988190 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-var-lib-openvswitch\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.988271 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988215 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs\") pod \"network-metrics-daemon-9cw24\" (UID: \"3445bb02-600f-40d2-95c6-d4616e5630c5\") " pod="openshift-multus/network-metrics-daemon-9cw24"
Apr 22 18:37:27.988271 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988237 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-run-openvswitch\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.988271 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988250 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret\") pod \"global-pull-secret-syncer-k99dx\" (UID: \"af564c1a-d59c-4ed2-ae28-bef2ffad4877\") " pod="kube-system/global-pull-secret-syncer-k99dx"
Apr 22 18:37:27.988271 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988273 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-cni-bin\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.988771 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988281 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-cni-netd\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.988771 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:27.988293 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:27.988771 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988314 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-kubelet\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.988771 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:27.988333 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:37:27.988771 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988333 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-cni-netd\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.988771 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988340 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-node-log\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.988771 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:27.988364 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs podName:3445bb02-600f-40d2-95c6-d4616e5630c5 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:28.488344933 +0000 UTC m=+3.092498932 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs") pod "network-metrics-daemon-9cw24" (UID: "3445bb02-600f-40d2-95c6-d4616e5630c5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:27.988771 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:27.988393 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret podName:af564c1a-d59c-4ed2-ae28-bef2ffad4877 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:28.488384302 +0000 UTC m=+3.092538297 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret") pod "global-pull-secret-syncer-k99dx" (UID: "af564c1a-d59c-4ed2-ae28-bef2ffad4877") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:37:27.988771 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988394 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-node-log\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.988771 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988390 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-kubelet\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.988771 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988408 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfz9s\" (UniqueName: \"kubernetes.io/projected/3445bb02-600f-40d2-95c6-d4616e5630c5-kube-api-access-lfz9s\") pod \"network-metrics-daemon-9cw24\" (UID: \"3445bb02-600f-40d2-95c6-d4616e5630c5\") " pod="openshift-multus/network-metrics-daemon-9cw24"
Apr 22 18:37:27.988771 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988436 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-etc-openvswitch\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.988771 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988460 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f31d871d-61d4-4d54-a317-0264d5e9f432-ovn-node-metrics-cert\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.988771 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988501 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.988771 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988511 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-etc-openvswitch\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.988771 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988527 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwfrk\" (UniqueName: \"kubernetes.io/projected/f31d871d-61d4-4d54-a317-0264d5e9f432-kube-api-access-gwfrk\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.989561 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988552 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.989561 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/892816cf-4bb8-447c-8d94-74a892f6bd4c-konnectivity-ca\") pod \"konnectivity-agent-vtzpc\" (UID: \"892816cf-4bb8-447c-8d94-74a892f6bd4c\") " pod="kube-system/konnectivity-agent-vtzpc"
Apr 22 18:37:27.989561 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988608 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f31d871d-61d4-4d54-a317-0264d5e9f432-ovnkube-config\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.989561 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988636 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f31d871d-61d4-4d54-a317-0264d5e9f432-env-overrides\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.989561 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988659 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f31d871d-61d4-4d54-a317-0264d5e9f432-ovnkube-script-lib\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.989561 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988683 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/af564c1a-d59c-4ed2-ae28-bef2ffad4877-kubelet-config\") pod \"global-pull-secret-syncer-k99dx\" (UID: \"af564c1a-d59c-4ed2-ae28-bef2ffad4877\") " pod="kube-system/global-pull-secret-syncer-k99dx"
Apr 22 18:37:27.989561 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988714 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-run-ovn-kubernetes\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.989561 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988738 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/892816cf-4bb8-447c-8d94-74a892f6bd4c-agent-certs\") pod \"konnectivity-agent-vtzpc\" (UID: \"892816cf-4bb8-447c-8d94-74a892f6bd4c\") " pod="kube-system/konnectivity-agent-vtzpc"
Apr 22 18:37:27.989561 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988765 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-systemd-units\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.989561 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988786 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-slash\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.989561 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988807 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-log-socket\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.989561 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988852 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-run-ovn\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.989561 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.988875 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/af564c1a-d59c-4ed2-ae28-bef2ffad4877-dbus\") pod \"global-pull-secret-syncer-k99dx\" (UID: \"af564c1a-d59c-4ed2-ae28-bef2ffad4877\") " pod="kube-system/global-pull-secret-syncer-k99dx"
Apr 22 18:37:27.989561 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.989030 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/af564c1a-d59c-4ed2-ae28-bef2ffad4877-dbus\") pod \"global-pull-secret-syncer-k99dx\" (UID: \"af564c1a-d59c-4ed2-ae28-bef2ffad4877\") " pod="kube-system/global-pull-secret-syncer-k99dx"
Apr 22 18:37:27.989561 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.989073 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-systemd-units\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.989561 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.989107 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/892816cf-4bb8-447c-8d94-74a892f6bd4c-konnectivity-ca\") pod \"konnectivity-agent-vtzpc\" (UID: \"892816cf-4bb8-447c-8d94-74a892f6bd4c\") " pod="kube-system/konnectivity-agent-vtzpc"
Apr 22 18:37:27.989561 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.989166 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-run-ovn\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.990402 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.989111 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-slash\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.990402 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.989182 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-log-socket\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.990402 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.989235 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f31d871d-61d4-4d54-a317-0264d5e9f432-host-run-ovn-kubernetes\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.990402 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.989246 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/af564c1a-d59c-4ed2-ae28-bef2ffad4877-kubelet-config\") pod \"global-pull-secret-syncer-k99dx\" (UID: \"af564c1a-d59c-4ed2-ae28-bef2ffad4877\") " pod="kube-system/global-pull-secret-syncer-k99dx"
Apr 22 18:37:27.990402 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.989295 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f31d871d-61d4-4d54-a317-0264d5e9f432-ovnkube-script-lib\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.990402 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.989864 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f31d871d-61d4-4d54-a317-0264d5e9f432-env-overrides\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.990402 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.990346 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f31d871d-61d4-4d54-a317-0264d5e9f432-ovnkube-config\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.991371 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.991351 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/892816cf-4bb8-447c-8d94-74a892f6bd4c-agent-certs\") pod \"konnectivity-agent-vtzpc\" (UID: \"892816cf-4bb8-447c-8d94-74a892f6bd4c\") " pod="kube-system/konnectivity-agent-vtzpc"
Apr 22 18:37:27.992799 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.992781 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f31d871d-61d4-4d54-a317-0264d5e9f432-ovn-node-metrics-cert\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.997399 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.997373 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwfrk\" (UniqueName: \"kubernetes.io/projected/f31d871d-61d4-4d54-a317-0264d5e9f432-kube-api-access-gwfrk\") pod \"ovnkube-node-gwnpr\" (UID: \"f31d871d-61d4-4d54-a317-0264d5e9f432\") " pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:27.997501 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:27.997462 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfz9s\" (UniqueName: \"kubernetes.io/projected/3445bb02-600f-40d2-95c6-d4616e5630c5-kube-api-access-lfz9s\") pod \"network-metrics-daemon-9cw24\" (UID: \"3445bb02-600f-40d2-95c6-d4616e5630c5\") " pod="openshift-multus/network-metrics-daemon-9cw24"
Apr 22 18:37:28.080813 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.080781 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq"
Apr 22 18:37:28.090750 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.090723 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ksw4k"
Apr 22 18:37:28.099389 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.099366 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-mzkcf"
Apr 22 18:37:28.107776 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.107756 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7svrr"
Apr 22 18:37:28.115293 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.115277 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-f8f9s"
Apr 22 18:37:28.121791 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.121775 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nx5jx"
Apr 22 18:37:28.127268 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.127252 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9md8j"
Apr 22 18:37:28.136213 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.136191 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr"
Apr 22 18:37:28.141746 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.141731 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vtzpc"
Apr 22 18:37:28.283683 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.283648 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:37:28.392004 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.391937 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s82jd\" (UniqueName: \"kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd\") pod \"network-check-target-4fnlv\" (UID: \"2bfdae94-a7e1-48cd-8cad-b09cbb620c26\") " pod="openshift-network-diagnostics/network-check-target-4fnlv"
Apr 22 18:37:28.392155 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:28.392052 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:37:28.392155 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:28.392071 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:37:28.392155 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:28.392080 2572 projected.go:194] Error preparing data for projected volume kube-api-access-s82jd for pod openshift-network-diagnostics/network-check-target-4fnlv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:28.392155 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:28.392149 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd podName:2bfdae94-a7e1-48cd-8cad-b09cbb620c26 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:29.392118621 +0000 UTC m=+3.996272603 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s82jd" (UniqueName: "kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd") pod "network-check-target-4fnlv" (UID: "2bfdae94-a7e1-48cd-8cad-b09cbb620c26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:28.492477 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.492448 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs\") pod \"network-metrics-daemon-9cw24\" (UID: \"3445bb02-600f-40d2-95c6-d4616e5630c5\") " pod="openshift-multus/network-metrics-daemon-9cw24"
Apr 22 18:37:28.492477 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.492481 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret\") pod \"global-pull-secret-syncer-k99dx\" (UID: \"af564c1a-d59c-4ed2-ae28-bef2ffad4877\") " pod="kube-system/global-pull-secret-syncer-k99dx"
Apr 22 18:37:28.492656 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:28.492590 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:37:28.492656 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:28.492590 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:28.492656 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:28.492653 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret podName:af564c1a-d59c-4ed2-ae28-bef2ffad4877 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:29.492635508 +0000 UTC m=+4.096789489 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret") pod "global-pull-secret-syncer-k99dx" (UID: "af564c1a-d59c-4ed2-ae28-bef2ffad4877") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:37:28.492753 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:28.492671 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs podName:3445bb02-600f-40d2-95c6-d4616e5630c5 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:29.492662657 +0000 UTC m=+4.096816639 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs") pod "network-metrics-daemon-9cw24" (UID: "3445bb02-600f-40d2-95c6-d4616e5630c5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:28.622647 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:28.622618 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c12a961_b77e_409b_8712_88752c895e8c.slice/crio-7a11ffb0ecd30218cb3de4473362d82cae9eb71ab40ef542a8177d40f4778d1e WatchSource:0}: Error finding container 7a11ffb0ecd30218cb3de4473362d82cae9eb71ab40ef542a8177d40f4778d1e: Status 404 returned error can't find the container with id 7a11ffb0ecd30218cb3de4473362d82cae9eb71ab40ef542a8177d40f4778d1e
Apr 22 18:37:28.624061 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:28.624038 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5acb96e2_7093_4837_af37_db4c13f5ea38.slice/crio-a74661aec9e2c8738ebb746d41e19da5ea9da9a5da60a0a7a929391ee2f530d1 WatchSource:0}: Error finding container a74661aec9e2c8738ebb746d41e19da5ea9da9a5da60a0a7a929391ee2f530d1: Status 404 returned error can't find the container with id a74661aec9e2c8738ebb746d41e19da5ea9da9a5da60a0a7a929391ee2f530d1
Apr 22 18:37:28.625213 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:28.625143 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02a11437_8f2d_4ead_9089_d300b113a946.slice/crio-baf235866c1fcd43e1eac19264403c629ed05c7a55f478147451ff4044dd90ba WatchSource:0}: Error finding container baf235866c1fcd43e1eac19264403c629ed05c7a55f478147451ff4044dd90ba: Status 404 returned error can't find the container with id baf235866c1fcd43e1eac19264403c629ed05c7a55f478147451ff4044dd90ba
Apr 22 18:37:28.628283 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:28.628263 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod892816cf_4bb8_447c_8d94_74a892f6bd4c.slice/crio-850d2062b1cdfea574bd1091b5326bea9b378ce1202b0d7adde0b4d238b4219b WatchSource:0}: Error finding container 850d2062b1cdfea574bd1091b5326bea9b378ce1202b0d7adde0b4d238b4219b: Status 404 returned error can't find the container with id 850d2062b1cdfea574bd1091b5326bea9b378ce1202b0d7adde0b4d238b4219b
Apr 22 18:37:28.629221 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:28.629202 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecd4872f_6321_4b86_9954_d839a3dff00d.slice/crio-98d05a82c4ac9a0d7926abae304e92e3bcf59e170c7de5570b1c46be53628293 WatchSource:0}: Error finding container 98d05a82c4ac9a0d7926abae304e92e3bcf59e170c7de5570b1c46be53628293: Status 404 returned error can't find the
container with id 98d05a82c4ac9a0d7926abae304e92e3bcf59e170c7de5570b1c46be53628293 Apr 22 18:37:28.630473 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:28.630451 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod482a486e_af9c_4b33_8137_5fbd716f3fc2.slice/crio-ce7d7a94c21f7666b0869d2f51c6770abe82c2c7efe2748de032f7d7aa7b80f4 WatchSource:0}: Error finding container ce7d7a94c21f7666b0869d2f51c6770abe82c2c7efe2748de032f7d7aa7b80f4: Status 404 returned error can't find the container with id ce7d7a94c21f7666b0869d2f51c6770abe82c2c7efe2748de032f7d7aa7b80f4 Apr 22 18:37:28.630656 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:28.630632 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf31d871d_61d4_4d54_a317_0264d5e9f432.slice/crio-b3da9baf289a6ae503cf14bd419bd2d6ca448be7fafb445477867722ed648402 WatchSource:0}: Error finding container b3da9baf289a6ae503cf14bd419bd2d6ca448be7fafb445477867722ed648402: Status 404 returned error can't find the container with id b3da9baf289a6ae503cf14bd419bd2d6ca448be7fafb445477867722ed648402 Apr 22 18:37:28.631541 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:28.631464 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b646830_7c62_445d_8fa2_df72174ec0c3.slice/crio-43bb3801934623067b28d2ff01b13d674d69d88edf77078762768fa168e436ad WatchSource:0}: Error finding container 43bb3801934623067b28d2ff01b13d674d69d88edf77078762768fa168e436ad: Status 404 returned error can't find the container with id 43bb3801934623067b28d2ff01b13d674d69d88edf77078762768fa168e436ad Apr 22 18:37:28.632729 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:37:28.632634 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd748c1b5_2936_4573_b185_b6e9da61d104.slice/crio-c7b13ea77a691dbfab9ded229bdac149732e1212a79756d64f65a4ce4c000864 WatchSource:0}: Error finding container c7b13ea77a691dbfab9ded229bdac149732e1212a79756d64f65a4ce4c000864: Status 404 returned error can't find the container with id c7b13ea77a691dbfab9ded229bdac149732e1212a79756d64f65a4ce4c000864 Apr 22 18:37:28.819299 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.819144 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:32:26 +0000 UTC" deadline="2027-09-18 23:04:55.427598084 +0000 UTC" Apr 22 18:37:28.819299 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.819295 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12340h27m26.608305599s" Apr 22 18:37:28.871293 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.871262 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nx5jx" event={"ID":"d748c1b5-2936-4573-b185-b6e9da61d104","Type":"ContainerStarted","Data":"c7b13ea77a691dbfab9ded229bdac149732e1212a79756d64f65a4ce4c000864"} Apr 22 18:37:28.872198 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.872178 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vtzpc" event={"ID":"892816cf-4bb8-447c-8d94-74a892f6bd4c","Type":"ContainerStarted","Data":"850d2062b1cdfea574bd1091b5326bea9b378ce1202b0d7adde0b4d238b4219b"} Apr 22 18:37:28.873074 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.873054 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" event={"ID":"5acb96e2-7093-4837-af37-db4c13f5ea38","Type":"ContainerStarted","Data":"a74661aec9e2c8738ebb746d41e19da5ea9da9a5da60a0a7a929391ee2f530d1"} Apr 22 18:37:28.873913 ip-10-0-129-26 
kubenswrapper[2572]: I0422 18:37:28.873895 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mzkcf" event={"ID":"0c12a961-b77e-409b-8712-88752c895e8c","Type":"ContainerStarted","Data":"7a11ffb0ecd30218cb3de4473362d82cae9eb71ab40ef542a8177d40f4778d1e"} Apr 22 18:37:28.874806 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.874784 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9md8j" event={"ID":"5b646830-7c62-445d-8fa2-df72174ec0c3","Type":"ContainerStarted","Data":"43bb3801934623067b28d2ff01b13d674d69d88edf77078762768fa168e436ad"} Apr 22 18:37:28.875692 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.875672 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" event={"ID":"f31d871d-61d4-4d54-a317-0264d5e9f432","Type":"ContainerStarted","Data":"b3da9baf289a6ae503cf14bd419bd2d6ca448be7fafb445477867722ed648402"} Apr 22 18:37:28.877533 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.877513 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7svrr" event={"ID":"482a486e-af9c-4b33-8137-5fbd716f3fc2","Type":"ContainerStarted","Data":"ce7d7a94c21f7666b0869d2f51c6770abe82c2c7efe2748de032f7d7aa7b80f4"} Apr 22 18:37:28.878751 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.878734 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f8f9s" event={"ID":"ecd4872f-6321-4b86-9954-d839a3dff00d","Type":"ContainerStarted","Data":"98d05a82c4ac9a0d7926abae304e92e3bcf59e170c7de5570b1c46be53628293"} Apr 22 18:37:28.880324 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.880294 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" event={"ID":"02a11437-8f2d-4ead-9089-d300b113a946","Type":"ContainerStarted","Data":"baf235866c1fcd43e1eac19264403c629ed05c7a55f478147451ff4044dd90ba"} Apr 22 
18:37:28.884987 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.884964 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-26.ec2.internal" event={"ID":"172d194e4a77f6eb5607ce1c62ccac15","Type":"ContainerStarted","Data":"b86027ff604203d4b9dc4b1d4f2279e68170d7b62177d4e748e6d5aee3acd23c"} Apr 22 18:37:28.900724 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:28.900648 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-26.ec2.internal" podStartSLOduration=1.9006339749999999 podStartE2EDuration="1.900633975s" podCreationTimestamp="2026-04-22 18:37:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:37:28.900011496 +0000 UTC m=+3.504165501" watchObservedRunningTime="2026-04-22 18:37:28.900633975 +0000 UTC m=+3.504787985" Apr 22 18:37:29.398498 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:29.398463 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s82jd\" (UniqueName: \"kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd\") pod \"network-check-target-4fnlv\" (UID: \"2bfdae94-a7e1-48cd-8cad-b09cbb620c26\") " pod="openshift-network-diagnostics/network-check-target-4fnlv" Apr 22 18:37:29.398695 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:29.398677 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:37:29.398766 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:29.398701 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:37:29.398766 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:29.398715 2572 
projected.go:194] Error preparing data for projected volume kube-api-access-s82jd for pod openshift-network-diagnostics/network-check-target-4fnlv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:37:29.398863 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:29.398775 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd podName:2bfdae94-a7e1-48cd-8cad-b09cbb620c26 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:31.398756884 +0000 UTC m=+6.002910867 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s82jd" (UniqueName: "kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd") pod "network-check-target-4fnlv" (UID: "2bfdae94-a7e1-48cd-8cad-b09cbb620c26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:37:29.500167 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:29.500111 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs\") pod \"network-metrics-daemon-9cw24\" (UID: \"3445bb02-600f-40d2-95c6-d4616e5630c5\") " pod="openshift-multus/network-metrics-daemon-9cw24" Apr 22 18:37:29.500346 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:29.500183 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret\") pod \"global-pull-secret-syncer-k99dx\" (UID: \"af564c1a-d59c-4ed2-ae28-bef2ffad4877\") " pod="kube-system/global-pull-secret-syncer-k99dx" Apr 22 18:37:29.500408 ip-10-0-129-26 
kubenswrapper[2572]: E0422 18:37:29.500358 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:37:29.500465 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:29.500418 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret podName:af564c1a-d59c-4ed2-ae28-bef2ffad4877 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:31.50040157 +0000 UTC m=+6.104555558 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret") pod "global-pull-secret-syncer-k99dx" (UID: "af564c1a-d59c-4ed2-ae28-bef2ffad4877") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:37:29.500808 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:29.500788 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:37:29.500891 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:29.500842 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs podName:3445bb02-600f-40d2-95c6-d4616e5630c5 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:31.500827057 +0000 UTC m=+6.104981040 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs") pod "network-metrics-daemon-9cw24" (UID: "3445bb02-600f-40d2-95c6-d4616e5630c5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:37:29.866155 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:29.864436 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-k99dx" Apr 22 18:37:29.866155 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:29.864570 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k99dx" podUID="af564c1a-d59c-4ed2-ae28-bef2ffad4877" Apr 22 18:37:29.866155 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:29.864974 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4fnlv" Apr 22 18:37:29.866155 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:29.865052 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4fnlv" podUID="2bfdae94-a7e1-48cd-8cad-b09cbb620c26" Apr 22 18:37:29.866155 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:29.865165 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9cw24" Apr 22 18:37:29.866155 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:29.865250 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9cw24" podUID="3445bb02-600f-40d2-95c6-d4616e5630c5" Apr 22 18:37:29.906154 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:29.905395 2572 generic.go:358] "Generic (PLEG): container finished" podID="32c72d2e359afc7dfc63c772815179d7" containerID="77c115ac6878b1c6e73936c2b22580812b56884a116393c9393668f75e2eab5a" exitCode=0 Apr 22 18:37:29.906154 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:29.905909 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal" event={"ID":"32c72d2e359afc7dfc63c772815179d7","Type":"ContainerDied","Data":"77c115ac6878b1c6e73936c2b22580812b56884a116393c9393668f75e2eab5a"} Apr 22 18:37:30.924908 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:30.924793 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal" event={"ID":"32c72d2e359afc7dfc63c772815179d7","Type":"ContainerStarted","Data":"4477dacbb19effc8c92bf4399a9018b8380fbbce987902b5af13b0d18d727f10"} Apr 22 18:37:31.415419 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:31.415374 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s82jd\" (UniqueName: \"kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd\") pod \"network-check-target-4fnlv\" (UID: \"2bfdae94-a7e1-48cd-8cad-b09cbb620c26\") " pod="openshift-network-diagnostics/network-check-target-4fnlv" Apr 22 18:37:31.415610 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:31.415590 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:37:31.415676 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:31.415618 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:37:31.415676 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:31.415630 2572 projected.go:194] Error preparing data for projected volume kube-api-access-s82jd for pod openshift-network-diagnostics/network-check-target-4fnlv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:37:31.415792 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:31.415687 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd podName:2bfdae94-a7e1-48cd-8cad-b09cbb620c26 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:35.415668904 +0000 UTC m=+10.019822900 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s82jd" (UniqueName: "kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd") pod "network-check-target-4fnlv" (UID: "2bfdae94-a7e1-48cd-8cad-b09cbb620c26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:37:31.516708 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:31.516646 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs\") pod \"network-metrics-daemon-9cw24\" (UID: \"3445bb02-600f-40d2-95c6-d4616e5630c5\") " pod="openshift-multus/network-metrics-daemon-9cw24" Apr 22 18:37:31.516708 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:31.516691 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret\") pod 
\"global-pull-secret-syncer-k99dx\" (UID: \"af564c1a-d59c-4ed2-ae28-bef2ffad4877\") " pod="kube-system/global-pull-secret-syncer-k99dx" Apr 22 18:37:31.516906 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:31.516812 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:37:31.516906 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:31.516868 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret podName:af564c1a-d59c-4ed2-ae28-bef2ffad4877 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:35.516850587 +0000 UTC m=+10.121004575 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret") pod "global-pull-secret-syncer-k99dx" (UID: "af564c1a-d59c-4ed2-ae28-bef2ffad4877") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:37:31.517280 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:31.517261 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:37:31.517374 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:31.517318 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs podName:3445bb02-600f-40d2-95c6-d4616e5630c5 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:35.517303166 +0000 UTC m=+10.121457153 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs") pod "network-metrics-daemon-9cw24" (UID: "3445bb02-600f-40d2-95c6-d4616e5630c5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:37:31.864284 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:31.863608 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4fnlv" Apr 22 18:37:31.864284 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:31.863725 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4fnlv" podUID="2bfdae94-a7e1-48cd-8cad-b09cbb620c26" Apr 22 18:37:31.864284 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:31.864113 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9cw24" Apr 22 18:37:31.864284 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:31.864240 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9cw24" podUID="3445bb02-600f-40d2-95c6-d4616e5630c5" Apr 22 18:37:31.864831 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:31.864687 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-k99dx" Apr 22 18:37:31.864831 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:31.864792 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k99dx" podUID="af564c1a-d59c-4ed2-ae28-bef2ffad4877" Apr 22 18:37:33.865353 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:33.864597 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4fnlv" Apr 22 18:37:33.865353 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:33.864722 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4fnlv" podUID="2bfdae94-a7e1-48cd-8cad-b09cbb620c26" Apr 22 18:37:33.865353 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:33.865182 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9cw24" Apr 22 18:37:33.865353 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:33.865292 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9cw24" podUID="3445bb02-600f-40d2-95c6-d4616e5630c5" Apr 22 18:37:33.865954 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:33.865814 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k99dx" Apr 22 18:37:33.865954 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:33.865905 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k99dx" podUID="af564c1a-d59c-4ed2-ae28-bef2ffad4877" Apr 22 18:37:35.449902 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:35.449863 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s82jd\" (UniqueName: \"kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd\") pod \"network-check-target-4fnlv\" (UID: \"2bfdae94-a7e1-48cd-8cad-b09cbb620c26\") " pod="openshift-network-diagnostics/network-check-target-4fnlv" Apr 22 18:37:35.450300 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:35.450037 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:37:35.450300 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:35.450062 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:37:35.450300 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:35.450075 2572 projected.go:194] Error preparing data for projected volume kube-api-access-s82jd for pod openshift-network-diagnostics/network-check-target-4fnlv: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:37:35.450300 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:35.450153 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd podName:2bfdae94-a7e1-48cd-8cad-b09cbb620c26 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:43.450118821 +0000 UTC m=+18.054272819 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s82jd" (UniqueName: "kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd") pod "network-check-target-4fnlv" (UID: "2bfdae94-a7e1-48cd-8cad-b09cbb620c26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:37:35.550885 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:35.550169 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs\") pod \"network-metrics-daemon-9cw24\" (UID: \"3445bb02-600f-40d2-95c6-d4616e5630c5\") " pod="openshift-multus/network-metrics-daemon-9cw24" Apr 22 18:37:35.550885 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:35.550216 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret\") pod \"global-pull-secret-syncer-k99dx\" (UID: \"af564c1a-d59c-4ed2-ae28-bef2ffad4877\") " pod="kube-system/global-pull-secret-syncer-k99dx" Apr 22 18:37:35.550885 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:35.550381 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not 
registered
Apr 22 18:37:35.550885 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:35.550435 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret podName:af564c1a-d59c-4ed2-ae28-bef2ffad4877 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:43.550421885 +0000 UTC m=+18.154575867 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret") pod "global-pull-secret-syncer-k99dx" (UID: "af564c1a-d59c-4ed2-ae28-bef2ffad4877") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:37:35.550885 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:35.550799 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:35.550885 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:35.550850 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs podName:3445bb02-600f-40d2-95c6-d4616e5630c5 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:43.550835172 +0000 UTC m=+18.154989155 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs") pod "network-metrics-daemon-9cw24" (UID: "3445bb02-600f-40d2-95c6-d4616e5630c5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:35.867438 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:35.866672 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4fnlv"
Apr 22 18:37:35.867438 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:35.866779 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4fnlv" podUID="2bfdae94-a7e1-48cd-8cad-b09cbb620c26"
Apr 22 18:37:35.867438 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:35.867162 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9cw24"
Apr 22 18:37:35.867438 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:35.867261 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9cw24" podUID="3445bb02-600f-40d2-95c6-d4616e5630c5"
Apr 22 18:37:35.867438 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:35.867311 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k99dx"
Apr 22 18:37:35.867438 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:35.867377 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k99dx" podUID="af564c1a-d59c-4ed2-ae28-bef2ffad4877"
Apr 22 18:37:37.864174 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:37.864119 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k99dx"
Apr 22 18:37:37.864174 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:37.864170 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4fnlv"
Apr 22 18:37:37.864703 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:37.864272 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k99dx" podUID="af564c1a-d59c-4ed2-ae28-bef2ffad4877"
Apr 22 18:37:37.864703 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:37.864337 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4fnlv" podUID="2bfdae94-a7e1-48cd-8cad-b09cbb620c26"
Apr 22 18:37:37.864703 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:37.864403 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9cw24"
Apr 22 18:37:37.864703 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:37.864492 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9cw24" podUID="3445bb02-600f-40d2-95c6-d4616e5630c5"
Apr 22 18:37:39.863713 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:39.863677 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9cw24"
Apr 22 18:37:39.863713 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:39.863684 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k99dx"
Apr 22 18:37:39.864215 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:39.863695 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4fnlv"
Apr 22 18:37:39.864215 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:39.863808 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9cw24" podUID="3445bb02-600f-40d2-95c6-d4616e5630c5"
Apr 22 18:37:39.864215 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:39.863893 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k99dx" podUID="af564c1a-d59c-4ed2-ae28-bef2ffad4877"
Apr 22 18:37:39.864215 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:39.863975 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4fnlv" podUID="2bfdae94-a7e1-48cd-8cad-b09cbb620c26"
Apr 22 18:37:41.863540 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:41.863507 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4fnlv"
Apr 22 18:37:41.863920 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:41.863507 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9cw24"
Apr 22 18:37:41.863920 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:41.863631 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4fnlv" podUID="2bfdae94-a7e1-48cd-8cad-b09cbb620c26"
Apr 22 18:37:41.863920 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:41.863730 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9cw24" podUID="3445bb02-600f-40d2-95c6-d4616e5630c5"
Apr 22 18:37:41.863920 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:41.863515 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k99dx"
Apr 22 18:37:41.863920 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:41.863835 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k99dx" podUID="af564c1a-d59c-4ed2-ae28-bef2ffad4877"
Apr 22 18:37:43.515143 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:43.515099 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s82jd\" (UniqueName: \"kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd\") pod \"network-check-target-4fnlv\" (UID: \"2bfdae94-a7e1-48cd-8cad-b09cbb620c26\") " pod="openshift-network-diagnostics/network-check-target-4fnlv"
Apr 22 18:37:43.515560 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:43.515242 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:37:43.515560 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:43.515260 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:37:43.515560 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:43.515269 2572 projected.go:194] Error preparing data for projected volume kube-api-access-s82jd for pod openshift-network-diagnostics/network-check-target-4fnlv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:43.515560 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:43.515321 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd podName:2bfdae94-a7e1-48cd-8cad-b09cbb620c26 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:59.515305178 +0000 UTC m=+34.119459162 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s82jd" (UniqueName: "kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd") pod "network-check-target-4fnlv" (UID: "2bfdae94-a7e1-48cd-8cad-b09cbb620c26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:43.615740 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:43.615704 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret\") pod \"global-pull-secret-syncer-k99dx\" (UID: \"af564c1a-d59c-4ed2-ae28-bef2ffad4877\") " pod="kube-system/global-pull-secret-syncer-k99dx"
Apr 22 18:37:43.615903 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:43.615814 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs\") pod \"network-metrics-daemon-9cw24\" (UID: \"3445bb02-600f-40d2-95c6-d4616e5630c5\") " pod="openshift-multus/network-metrics-daemon-9cw24"
Apr 22 18:37:43.615903 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:43.615866 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:37:43.615994 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:43.615903 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:43.615994 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:43.615940 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret podName:af564c1a-d59c-4ed2-ae28-bef2ffad4877 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:59.615918522 +0000 UTC m=+34.220072509 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret") pod "global-pull-secret-syncer-k99dx" (UID: "af564c1a-d59c-4ed2-ae28-bef2ffad4877") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:37:43.615994 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:43.615962 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs podName:3445bb02-600f-40d2-95c6-d4616e5630c5 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:59.615951276 +0000 UTC m=+34.220105266 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs") pod "network-metrics-daemon-9cw24" (UID: "3445bb02-600f-40d2-95c6-d4616e5630c5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:43.864062 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:43.864028 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4fnlv"
Apr 22 18:37:43.864234 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:43.864034 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k99dx"
Apr 22 18:37:43.864234 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:43.864155 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4fnlv" podUID="2bfdae94-a7e1-48cd-8cad-b09cbb620c26"
Apr 22 18:37:43.864354 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:43.864043 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9cw24"
Apr 22 18:37:43.864354 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:43.864229 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k99dx" podUID="af564c1a-d59c-4ed2-ae28-bef2ffad4877"
Apr 22 18:37:43.864458 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:43.864345 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9cw24" podUID="3445bb02-600f-40d2-95c6-d4616e5630c5"
Apr 22 18:37:45.864922 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:45.864883 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4fnlv"
Apr 22 18:37:45.865393 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:45.864986 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4fnlv" podUID="2bfdae94-a7e1-48cd-8cad-b09cbb620c26"
Apr 22 18:37:45.865393 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:45.865075 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k99dx"
Apr 22 18:37:45.865393 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:45.865217 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9cw24"
Apr 22 18:37:45.865393 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:45.865238 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k99dx" podUID="af564c1a-d59c-4ed2-ae28-bef2ffad4877"
Apr 22 18:37:45.865393 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:45.865308 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9cw24" podUID="3445bb02-600f-40d2-95c6-d4616e5630c5"
Apr 22 18:37:46.954840 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:46.954495 2572 generic.go:358] "Generic (PLEG): container finished" podID="5b646830-7c62-445d-8fa2-df72174ec0c3" containerID="0b83d82f8f03292d688f726ff1b7489014e493ea18fb3c7845bf3b68e02f68f8" exitCode=0
Apr 22 18:37:46.955671 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:46.954543 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9md8j" event={"ID":"5b646830-7c62-445d-8fa2-df72174ec0c3","Type":"ContainerDied","Data":"0b83d82f8f03292d688f726ff1b7489014e493ea18fb3c7845bf3b68e02f68f8"}
Apr 22 18:37:46.957820 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:46.957797 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/ovn-acl-logging/0.log"
Apr 22 18:37:46.958180 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:46.958156 2572 generic.go:358] "Generic (PLEG): container finished" podID="f31d871d-61d4-4d54-a317-0264d5e9f432" containerID="e243ced57331a4677d91041c57d0eb3f2645c4263fbbc3c4a0532fd3a0ac651f" exitCode=1
Apr 22 18:37:46.958289 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:46.958221 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" event={"ID":"f31d871d-61d4-4d54-a317-0264d5e9f432","Type":"ContainerStarted","Data":"61930deb3e5619df3f0f35a413535cc5b78e6bae210811dcda0bada193a2c5c5"}
Apr 22 18:37:46.958289 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:46.958244 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" event={"ID":"f31d871d-61d4-4d54-a317-0264d5e9f432","Type":"ContainerStarted","Data":"24a70227a3da6e86b7d218ab90bdbe9162f19ff6fc77126172e5cf5ec6e612db"}
Apr 22 18:37:46.958289 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:46.958268 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" event={"ID":"f31d871d-61d4-4d54-a317-0264d5e9f432","Type":"ContainerStarted","Data":"5f41fbdc86b7028861afaae89b82448f2efcf5bf50d44c44b2d8fc658ee3a322"}
Apr 22 18:37:46.958289 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:46.958282 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" event={"ID":"f31d871d-61d4-4d54-a317-0264d5e9f432","Type":"ContainerStarted","Data":"e7b2c1d4415e79b63c6819ae42d54072c4cd318e0b355d1df349dd8e25a5515d"}
Apr 22 18:37:46.958525 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:46.958295 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" event={"ID":"f31d871d-61d4-4d54-a317-0264d5e9f432","Type":"ContainerDied","Data":"e243ced57331a4677d91041c57d0eb3f2645c4263fbbc3c4a0532fd3a0ac651f"}
Apr 22 18:37:46.958525 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:46.958309 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" event={"ID":"f31d871d-61d4-4d54-a317-0264d5e9f432","Type":"ContainerStarted","Data":"a479c4875a41459cc9c63d98232cf832a9de6e0b23c575f685f1cd46c34a025f"}
Apr 22 18:37:46.959508 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:46.959488 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7svrr" event={"ID":"482a486e-af9c-4b33-8137-5fbd716f3fc2","Type":"ContainerStarted","Data":"aa15ff92d3b56d031a9a18147ad38f910260dc16a3d9e41d462ecdd7c2c6f1f9"}
Apr 22 18:37:46.961107 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:46.961084 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f8f9s" event={"ID":"ecd4872f-6321-4b86-9954-d839a3dff00d","Type":"ContainerStarted","Data":"ae6bfa3b38844aef10447a56b3ffec9586ad9735180b9a410aa5a8ca02a8b260"}
Apr 22 18:37:46.962675 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:46.962651 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" event={"ID":"02a11437-8f2d-4ead-9089-d300b113a946","Type":"ContainerStarted","Data":"ba0949cd1f0424a69e7e4927115b31df2f649d528dc3cef22bf4ad1f83c8a4e5"}
Apr 22 18:37:46.963880 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:46.963845 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nx5jx" event={"ID":"d748c1b5-2936-4573-b185-b6e9da61d104","Type":"ContainerStarted","Data":"c49b8e55ff4dbd44066ddbc7eaea7e8681831f1d976cdd5342892cb9ba03d162"}
Apr 22 18:37:46.965220 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:46.965188 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vtzpc" event={"ID":"892816cf-4bb8-447c-8d94-74a892f6bd4c","Type":"ContainerStarted","Data":"140431eefb131a9b143b772844c92e95340bacdf30d9d8ece7c4c2e87adca3e8"}
Apr 22 18:37:46.966515 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:46.966492 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" event={"ID":"5acb96e2-7093-4837-af37-db4c13f5ea38","Type":"ContainerStarted","Data":"a8460c914a61d2f0477cd0bef69dd2d0c49360240e06f927835a25aa1f054ce5"}
Apr 22 18:37:46.977651 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:46.977611 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-26.ec2.internal" podStartSLOduration=19.977600796 podStartE2EDuration="19.977600796s" podCreationTimestamp="2026-04-22 18:37:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:37:30.942070333 +0000 UTC m=+5.546224336" watchObservedRunningTime="2026-04-22 18:37:46.977600796 +0000 UTC m=+21.581754801"
Apr 22 18:37:46.990659 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:46.990613 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7svrr" podStartSLOduration=3.594834518 podStartE2EDuration="20.990603192s" podCreationTimestamp="2026-04-22 18:37:26 +0000 UTC" firstStartedPulling="2026-04-22 18:37:28.631948261 +0000 UTC m=+3.236102247" lastFinishedPulling="2026-04-22 18:37:46.027716936 +0000 UTC m=+20.631870921" observedRunningTime="2026-04-22 18:37:46.989995603 +0000 UTC m=+21.594149607" watchObservedRunningTime="2026-04-22 18:37:46.990603192 +0000 UTC m=+21.594757319"
Apr 22 18:37:47.003915 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:47.003887 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-nx5jx" podStartSLOduration=3.614442852 podStartE2EDuration="21.003879418s" podCreationTimestamp="2026-04-22 18:37:26 +0000 UTC" firstStartedPulling="2026-04-22 18:37:28.634213633 +0000 UTC m=+3.238367632" lastFinishedPulling="2026-04-22 18:37:46.023650202 +0000 UTC m=+20.627804198" observedRunningTime="2026-04-22 18:37:47.003714705 +0000 UTC m=+21.607868708" watchObservedRunningTime="2026-04-22 18:37:47.003879418 +0000 UTC m=+21.608033422"
Apr 22 18:37:47.039316 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:47.039282 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ksw4k" podStartSLOduration=4.630595037 podStartE2EDuration="22.039272399s" podCreationTimestamp="2026-04-22 18:37:25 +0000 UTC" firstStartedPulling="2026-04-22 18:37:28.627281286 +0000 UTC m=+3.231435267" lastFinishedPulling="2026-04-22 18:37:46.035958628 +0000 UTC m=+20.640112629" observedRunningTime="2026-04-22 18:37:47.03893148 +0000 UTC m=+21.643085484" watchObservedRunningTime="2026-04-22 18:37:47.039272399 +0000 UTC m=+21.643426400"
Apr 22 18:37:47.081156 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:47.081096 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-f8f9s" podStartSLOduration=3.637866948 podStartE2EDuration="21.081086339s" podCreationTimestamp="2026-04-22 18:37:26 +0000 UTC" firstStartedPulling="2026-04-22 18:37:28.631757432 +0000 UTC m=+3.235911428" lastFinishedPulling="2026-04-22 18:37:46.07497683 +0000 UTC m=+20.679130819" observedRunningTime="2026-04-22 18:37:47.060464417 +0000 UTC m=+21.664618417" watchObservedRunningTime="2026-04-22 18:37:47.081086339 +0000 UTC m=+21.685240339"
Apr 22 18:37:47.081616 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:47.081593 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vtzpc" podStartSLOduration=3.689145668 podStartE2EDuration="21.081588573s" podCreationTimestamp="2026-04-22 18:37:26 +0000 UTC" firstStartedPulling="2026-04-22 18:37:28.630772114 +0000 UTC m=+3.234926097" lastFinishedPulling="2026-04-22 18:37:46.023215006 +0000 UTC m=+20.627369002" observedRunningTime="2026-04-22 18:37:47.081350889 +0000 UTC m=+21.685504904" watchObservedRunningTime="2026-04-22 18:37:47.081588573 +0000 UTC m=+21.685742577"
Apr 22 18:37:47.422597 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:47.422352 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 18:37:47.841979 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:47.841862 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:37:47.422366847Z","UUID":"e95bdda5-3ab9-4290-88a1-9c695b27519e","Handler":null,"Name":"","Endpoint":""}
Apr 22 18:37:47.843519 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:47.843492 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 18:37:47.843658 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:47.843538 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 18:37:47.863604 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:47.863578 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k99dx"
Apr 22 18:37:47.863604 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:47.863598 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4fnlv"
Apr 22 18:37:47.863809 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:47.863653 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9cw24"
Apr 22 18:37:47.863809 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:47.863753 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k99dx" podUID="af564c1a-d59c-4ed2-ae28-bef2ffad4877"
Apr 22 18:37:47.863913 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:47.863892 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9cw24" podUID="3445bb02-600f-40d2-95c6-d4616e5630c5"
Apr 22 18:37:47.863986 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:47.863967 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4fnlv" podUID="2bfdae94-a7e1-48cd-8cad-b09cbb620c26"
Apr 22 18:37:47.970026 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:47.969991 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mzkcf" event={"ID":"0c12a961-b77e-409b-8712-88752c895e8c","Type":"ContainerStarted","Data":"4b0c7a6f9de72e0aa568079fb006b99c1499a11e0f2879a5eff97b9281175f22"}
Apr 22 18:37:47.972038 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:47.972015 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" event={"ID":"5acb96e2-7093-4837-af37-db4c13f5ea38","Type":"ContainerStarted","Data":"189d048be45c4dc2c02e6013236f56dd2a04bb9079451a1fdc6f6954bd346b2c"}
Apr 22 18:37:48.975650 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:48.975367 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" event={"ID":"5acb96e2-7093-4837-af37-db4c13f5ea38","Type":"ContainerStarted","Data":"93ad1db501e46f4cde3972185771983e8608f31a36e1f78b3537051fec905650"}
Apr 22 18:37:48.978753 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:48.978726 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/ovn-acl-logging/0.log"
Apr 22 18:37:48.979190 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:48.979112 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" event={"ID":"f31d871d-61d4-4d54-a317-0264d5e9f432","Type":"ContainerStarted","Data":"11ca7eb4bc2739ef46b931f9826a350cdb6f42752dae4056bb69a70232390c1f"}
Apr 22 18:37:48.995947 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:48.995907 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-mzkcf" podStartSLOduration=5.596748741 podStartE2EDuration="22.995895447s" podCreationTimestamp="2026-04-22 18:37:26 +0000 UTC" firstStartedPulling="2026-04-22 18:37:28.624105372 +0000 UTC m=+3.228259357" lastFinishedPulling="2026-04-22 18:37:46.023252076 +0000 UTC m=+20.627406063" observedRunningTime="2026-04-22 18:37:47.98553908 +0000 UTC m=+22.589693085" watchObservedRunningTime="2026-04-22 18:37:48.995895447 +0000 UTC m=+23.600049450"
Apr 22 18:37:48.996085 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:48.996006 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4rmq" podStartSLOduration=3.923487255 podStartE2EDuration="23.995999718s" podCreationTimestamp="2026-04-22 18:37:25 +0000 UTC" firstStartedPulling="2026-04-22 18:37:28.626191727 +0000 UTC m=+3.230345709" lastFinishedPulling="2026-04-22 18:37:48.698704187 +0000 UTC m=+23.302858172" observedRunningTime="2026-04-22 18:37:48.995585029 +0000 UTC m=+23.599739034" watchObservedRunningTime="2026-04-22 18:37:48.995999718 +0000 UTC m=+23.600153724"
Apr 22 18:37:49.863536 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:49.863502 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4fnlv"
Apr 22 18:37:49.863536 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:49.863518 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9cw24"
Apr 22 18:37:49.863536 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:49.863534 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k99dx"
Apr 22 18:37:49.863800 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:49.863601 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4fnlv" podUID="2bfdae94-a7e1-48cd-8cad-b09cbb620c26"
Apr 22 18:37:49.863800 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:49.863689 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9cw24" podUID="3445bb02-600f-40d2-95c6-d4616e5630c5"
Apr 22 18:37:49.863800 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:49.863770 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k99dx" podUID="af564c1a-d59c-4ed2-ae28-bef2ffad4877"
Apr 22 18:37:50.708612 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:50.708584 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vtzpc"
Apr 22 18:37:50.709239 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:50.709221 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vtzpc"
Apr 22 18:37:50.983398 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:50.983323 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vtzpc"
Apr 22 18:37:50.983874 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:50.983854 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vtzpc"
Apr 22 18:37:51.864417 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:51.864253 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k99dx"
Apr 22 18:37:51.864969 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:51.864260 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9cw24"
Apr 22 18:37:51.864969 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:51.864260 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4fnlv"
Apr 22 18:37:51.864969 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:51.864628 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9cw24" podUID="3445bb02-600f-40d2-95c6-d4616e5630c5"
Apr 22 18:37:51.864969 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:51.864491 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k99dx" podUID="af564c1a-d59c-4ed2-ae28-bef2ffad4877"
Apr 22 18:37:51.864969 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:51.864706 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4fnlv" podUID="2bfdae94-a7e1-48cd-8cad-b09cbb620c26"
Apr 22 18:37:51.986330 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:51.986306 2572 generic.go:358] "Generic (PLEG): container finished" podID="5b646830-7c62-445d-8fa2-df72174ec0c3" containerID="b60ee8c80e6829aae56edae94565a5a2f836e6e306faf984bfccddb0fffaeebd" exitCode=0
Apr 22 18:37:51.986432 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:51.986381 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9md8j" event={"ID":"5b646830-7c62-445d-8fa2-df72174ec0c3","Type":"ContainerDied","Data":"b60ee8c80e6829aae56edae94565a5a2f836e6e306faf984bfccddb0fffaeebd"}
Apr 22 18:37:51.989570 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:51.989548 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/ovn-acl-logging/0.log"
Apr 22 18:37:51.989954 ip-10-0-129-26 kubenswrapper[2572]: I0422
18:37:51.989931 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" event={"ID":"f31d871d-61d4-4d54-a317-0264d5e9f432","Type":"ContainerStarted","Data":"ec5d11cad0932b3a657fe4ebe36f24eb6dae93fa8b3bc04d532145a89f6bd933"} Apr 22 18:37:51.990154 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:51.990116 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:51.990287 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:51.990272 2572 scope.go:117] "RemoveContainer" containerID="e243ced57331a4677d91041c57d0eb3f2645c4263fbbc3c4a0532fd3a0ac651f" Apr 22 18:37:52.004720 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:52.004707 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:52.700752 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:52.700725 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:52.994641 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:52.994622 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/ovn-acl-logging/0.log" Apr 22 18:37:52.994971 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:52.994945 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" event={"ID":"f31d871d-61d4-4d54-a317-0264d5e9f432","Type":"ContainerStarted","Data":"4983cce55da8a61e89c290eec474cf851eb78d6b49195f1a6b470e7325d6ab76"} Apr 22 18:37:52.995218 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:52.995198 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:53.012321 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:53.012302 2572 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:37:53.027663 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:53.027623 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" podStartSLOduration=9.568356902 podStartE2EDuration="27.027608163s" podCreationTimestamp="2026-04-22 18:37:26 +0000 UTC" firstStartedPulling="2026-04-22 18:37:28.63270116 +0000 UTC m=+3.236855147" lastFinishedPulling="2026-04-22 18:37:46.091952411 +0000 UTC m=+20.696106408" observedRunningTime="2026-04-22 18:37:53.027030492 +0000 UTC m=+27.631184503" watchObservedRunningTime="2026-04-22 18:37:53.027608163 +0000 UTC m=+27.631762169" Apr 22 18:37:53.081537 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:53.081515 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9cw24"] Apr 22 18:37:53.081628 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:53.081618 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9cw24" Apr 22 18:37:53.081724 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:53.081704 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9cw24" podUID="3445bb02-600f-40d2-95c6-d4616e5630c5" Apr 22 18:37:53.083972 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:53.083951 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-k99dx"] Apr 22 18:37:53.084065 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:53.084041 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-k99dx" Apr 22 18:37:53.084149 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:53.084110 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k99dx" podUID="af564c1a-d59c-4ed2-ae28-bef2ffad4877" Apr 22 18:37:53.096529 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:53.096505 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4fnlv"] Apr 22 18:37:53.096614 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:53.096602 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4fnlv" Apr 22 18:37:53.096689 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:53.096669 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4fnlv" podUID="2bfdae94-a7e1-48cd-8cad-b09cbb620c26" Apr 22 18:37:53.998703 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:53.998660 2572 generic.go:358] "Generic (PLEG): container finished" podID="5b646830-7c62-445d-8fa2-df72174ec0c3" containerID="dc000c4d7f9242715a42ee755c10e5bdfb692dfcbf005ec7488b0de49c61bc11" exitCode=0 Apr 22 18:37:53.999242 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:53.998743 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9md8j" event={"ID":"5b646830-7c62-445d-8fa2-df72174ec0c3","Type":"ContainerDied","Data":"dc000c4d7f9242715a42ee755c10e5bdfb692dfcbf005ec7488b0de49c61bc11"} Apr 22 18:37:54.864019 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:54.863985 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4fnlv" Apr 22 18:37:54.864019 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:54.864028 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k99dx" Apr 22 18:37:54.864188 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:54.864049 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9cw24" Apr 22 18:37:54.864188 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:54.864147 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9cw24" podUID="3445bb02-600f-40d2-95c6-d4616e5630c5" Apr 22 18:37:54.864270 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:54.864223 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4fnlv" podUID="2bfdae94-a7e1-48cd-8cad-b09cbb620c26" Apr 22 18:37:54.864325 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:54.864307 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k99dx" podUID="af564c1a-d59c-4ed2-ae28-bef2ffad4877" Apr 22 18:37:55.004344 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:55.004311 2572 generic.go:358] "Generic (PLEG): container finished" podID="5b646830-7c62-445d-8fa2-df72174ec0c3" containerID="225a3f734c8d30bec32a600775c34b169ca2e71d2149419125c1a6c0fb353feb" exitCode=0 Apr 22 18:37:55.004736 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:55.004410 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9md8j" event={"ID":"5b646830-7c62-445d-8fa2-df72174ec0c3","Type":"ContainerDied","Data":"225a3f734c8d30bec32a600775c34b169ca2e71d2149419125c1a6c0fb353feb"} Apr 22 18:37:56.864175 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:56.863983 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-k99dx" Apr 22 18:37:56.864656 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:56.864032 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9cw24" Apr 22 18:37:56.864656 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:56.864258 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k99dx" podUID="af564c1a-d59c-4ed2-ae28-bef2ffad4877" Apr 22 18:37:56.864656 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:56.864039 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4fnlv" Apr 22 18:37:56.864656 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:56.864353 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9cw24" podUID="3445bb02-600f-40d2-95c6-d4616e5630c5" Apr 22 18:37:56.864656 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:56.864416 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4fnlv" podUID="2bfdae94-a7e1-48cd-8cad-b09cbb620c26" Apr 22 18:37:58.863642 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:58.863606 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k99dx" Apr 22 18:37:58.863642 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:58.863625 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4fnlv" Apr 22 18:37:58.864058 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:58.863734 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k99dx" podUID="af564c1a-d59c-4ed2-ae28-bef2ffad4877" Apr 22 18:37:58.864058 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:58.863822 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4fnlv" podUID="2bfdae94-a7e1-48cd-8cad-b09cbb620c26" Apr 22 18:37:58.864058 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:58.863856 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9cw24" Apr 22 18:37:58.864058 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:58.863948 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9cw24" podUID="3445bb02-600f-40d2-95c6-d4616e5630c5" Apr 22 18:37:59.274343 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.274309 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-26.ec2.internal" event="NodeReady" Apr 22 18:37:59.274605 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.274466 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:37:59.308878 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.308850 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-689895f57b-dkjvc"] Apr 22 18:37:59.341456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.341428 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-689895f57b-dkjvc"] Apr 22 18:37:59.341456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.341460 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-45l64"] Apr 22 18:37:59.341662 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.341594 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-689895f57b-dkjvc" Apr 22 18:37:59.345622 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.344440 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 18:37:59.345622 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.344695 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 18:37:59.348420 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.348393 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fbmmk\"" Apr 22 18:37:59.348894 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.348875 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 18:37:59.350525 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.350504 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 18:37:59.360370 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.360350 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-srk76"] Apr 22 18:37:59.360508 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.360493 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-45l64" Apr 22 18:37:59.362865 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.362837 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-cfw2h\"" Apr 22 18:37:59.362966 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.362875 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:37:59.363027 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.363010 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:37:59.378332 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.378314 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-srk76"] Apr 22 18:37:59.378487 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.378338 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-45l64"] Apr 22 18:37:59.378487 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.378430 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-srk76" Apr 22 18:37:59.380698 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.380678 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rz76n\"" Apr 22 18:37:59.380854 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.380741 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:37:59.380986 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.380968 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:37:59.381044 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.381032 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:37:59.440771 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.440743 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-certificates\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc" Apr 22 18:37:59.440771 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.440774 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51c59af5-e091-4da9-bd46-b4c3f52ba283-installation-pull-secrets\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc" Apr 22 18:37:59.441011 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.440798 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-bound-sa-token\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc" Apr 22 18:37:59.441011 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.440859 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgj2t\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-kube-api-access-jgj2t\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc" Apr 22 18:37:59.441011 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.440912 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc" Apr 22 18:37:59.441011 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.440961 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51c59af5-e091-4da9-bd46-b4c3f52ba283-ca-trust-extracted\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc" Apr 22 18:37:59.441011 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.440987 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51c59af5-e091-4da9-bd46-b4c3f52ba283-trusted-ca\") pod 
\"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc" Apr 22 18:37:59.441290 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.441015 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/51c59af5-e091-4da9-bd46-b4c3f52ba283-image-registry-private-configuration\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc" Apr 22 18:37:59.542186 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.542092 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s82jd\" (UniqueName: \"kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd\") pod \"network-check-target-4fnlv\" (UID: \"2bfdae94-a7e1-48cd-8cad-b09cbb620c26\") " pod="openshift-network-diagnostics/network-check-target-4fnlv" Apr 22 18:37:59.542186 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.542155 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc" Apr 22 18:37:59.542186 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.542177 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51c59af5-e091-4da9-bd46-b4c3f52ba283-ca-trust-extracted\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc" Apr 22 18:37:59.542378 ip-10-0-129-26 kubenswrapper[2572]: I0422 
18:37:59.542195 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8628c438-3340-49c9-bbba-e74cdc0b5926-config-volume\") pod \"dns-default-45l64\" (UID: \"8628c438-3340-49c9-bbba-e74cdc0b5926\") " pod="openshift-dns/dns-default-45l64" Apr 22 18:37:59.542378 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.542214 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51c59af5-e091-4da9-bd46-b4c3f52ba283-trusted-ca\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc" Apr 22 18:37:59.542378 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.542239 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/51c59af5-e091-4da9-bd46-b4c3f52ba283-image-registry-private-configuration\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc" Apr 22 18:37:59.542378 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.542276 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert\") pod \"ingress-canary-srk76\" (UID: \"700dd0a8-270f-4acc-bbc0-dc7d43241d5f\") " pod="openshift-ingress-canary/ingress-canary-srk76" Apr 22 18:37:59.542378 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.542296 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-certificates\") pod \"image-registry-689895f57b-dkjvc\" (UID: 
\"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc"
Apr 22 18:37:59.542378 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.542312 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51c59af5-e091-4da9-bd46-b4c3f52ba283-installation-pull-secrets\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc"
Apr 22 18:37:59.542378 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.542328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-bound-sa-token\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc"
Apr 22 18:37:59.542378 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.542345 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2scm\" (UniqueName: \"kubernetes.io/projected/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-kube-api-access-j2scm\") pod \"ingress-canary-srk76\" (UID: \"700dd0a8-270f-4acc-bbc0-dc7d43241d5f\") " pod="openshift-ingress-canary/ingress-canary-srk76"
Apr 22 18:37:59.542378 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.542360 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls\") pod \"dns-default-45l64\" (UID: \"8628c438-3340-49c9-bbba-e74cdc0b5926\") " pod="openshift-dns/dns-default-45l64"
Apr 22 18:37:59.542378 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.542375 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4c7t\" (UniqueName: \"kubernetes.io/projected/8628c438-3340-49c9-bbba-e74cdc0b5926-kube-api-access-d4c7t\") pod \"dns-default-45l64\" (UID: \"8628c438-3340-49c9-bbba-e74cdc0b5926\") " pod="openshift-dns/dns-default-45l64"
Apr 22 18:37:59.542663 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.542397 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgj2t\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-kube-api-access-jgj2t\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc"
Apr 22 18:37:59.542663 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.542416 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8628c438-3340-49c9-bbba-e74cdc0b5926-tmp-dir\") pod \"dns-default-45l64\" (UID: \"8628c438-3340-49c9-bbba-e74cdc0b5926\") " pod="openshift-dns/dns-default-45l64"
Apr 22 18:37:59.542663 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:59.542535 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:37:59.542663 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:59.542547 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:37:59.542663 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:59.542555 2572 projected.go:194] Error preparing data for projected volume kube-api-access-s82jd for pod openshift-network-diagnostics/network-check-target-4fnlv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:59.542663 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:59.542594 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd podName:2bfdae94-a7e1-48cd-8cad-b09cbb620c26 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:31.542581621 +0000 UTC m=+66.146735603 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s82jd" (UniqueName: "kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd") pod "network-check-target-4fnlv" (UID: "2bfdae94-a7e1-48cd-8cad-b09cbb620c26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:59.542975 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:59.542947 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:37:59.542975 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:59.542970 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-689895f57b-dkjvc: secret "image-registry-tls" not found
Apr 22 18:37:59.543180 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:59.543016 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls podName:51c59af5-e091-4da9-bd46-b4c3f52ba283 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:00.04300032 +0000 UTC m=+34.647154303 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls") pod "image-registry-689895f57b-dkjvc" (UID: "51c59af5-e091-4da9-bd46-b4c3f52ba283") : secret "image-registry-tls" not found
Apr 22 18:37:59.543605 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.543578 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51c59af5-e091-4da9-bd46-b4c3f52ba283-ca-trust-extracted\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc"
Apr 22 18:37:59.543846 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.543823 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-certificates\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc"
Apr 22 18:37:59.548098 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.548077 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/51c59af5-e091-4da9-bd46-b4c3f52ba283-image-registry-private-configuration\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc"
Apr 22 18:37:59.548200 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.548103 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51c59af5-e091-4da9-bd46-b4c3f52ba283-installation-pull-secrets\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc"
Apr 22 18:37:59.554261 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.554228 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51c59af5-e091-4da9-bd46-b4c3f52ba283-trusted-ca\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc"
Apr 22 18:37:59.555059 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.555040 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-bound-sa-token\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc"
Apr 22 18:37:59.555206 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.555188 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgj2t\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-kube-api-access-jgj2t\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc"
Apr 22 18:37:59.643270 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.643240 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8628c438-3340-49c9-bbba-e74cdc0b5926-tmp-dir\") pod \"dns-default-45l64\" (UID: \"8628c438-3340-49c9-bbba-e74cdc0b5926\") " pod="openshift-dns/dns-default-45l64"
Apr 22 18:37:59.643406 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.643313 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8628c438-3340-49c9-bbba-e74cdc0b5926-config-volume\") pod \"dns-default-45l64\" (UID: \"8628c438-3340-49c9-bbba-e74cdc0b5926\") " pod="openshift-dns/dns-default-45l64"
Apr 22 18:37:59.643406 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.643376 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs\") pod \"network-metrics-daemon-9cw24\" (UID: \"3445bb02-600f-40d2-95c6-d4616e5630c5\") " pod="openshift-multus/network-metrics-daemon-9cw24"
Apr 22 18:37:59.643406 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.643399 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret\") pod \"global-pull-secret-syncer-k99dx\" (UID: \"af564c1a-d59c-4ed2-ae28-bef2ffad4877\") " pod="kube-system/global-pull-secret-syncer-k99dx"
Apr 22 18:37:59.643553 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.643414 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert\") pod \"ingress-canary-srk76\" (UID: \"700dd0a8-270f-4acc-bbc0-dc7d43241d5f\") " pod="openshift-ingress-canary/ingress-canary-srk76"
Apr 22 18:37:59.643553 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.643437 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2scm\" (UniqueName: \"kubernetes.io/projected/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-kube-api-access-j2scm\") pod \"ingress-canary-srk76\" (UID: \"700dd0a8-270f-4acc-bbc0-dc7d43241d5f\") " pod="openshift-ingress-canary/ingress-canary-srk76"
Apr 22 18:37:59.643553 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.643459 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls\") pod \"dns-default-45l64\" (UID: \"8628c438-3340-49c9-bbba-e74cdc0b5926\") " pod="openshift-dns/dns-default-45l64"
Apr 22 18:37:59.643553 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.643485 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4c7t\" (UniqueName: \"kubernetes.io/projected/8628c438-3340-49c9-bbba-e74cdc0b5926-kube-api-access-d4c7t\") pod \"dns-default-45l64\" (UID: \"8628c438-3340-49c9-bbba-e74cdc0b5926\") " pod="openshift-dns/dns-default-45l64"
Apr 22 18:37:59.643733 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:59.643549 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:37:59.643733 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.643580 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8628c438-3340-49c9-bbba-e74cdc0b5926-tmp-dir\") pod \"dns-default-45l64\" (UID: \"8628c438-3340-49c9-bbba-e74cdc0b5926\") " pod="openshift-dns/dns-default-45l64"
Apr 22 18:37:59.643733 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:59.643613 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret podName:af564c1a-d59c-4ed2-ae28-bef2ffad4877 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:31.643595643 +0000 UTC m=+66.247749650 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret") pod "global-pull-secret-syncer-k99dx" (UID: "af564c1a-d59c-4ed2-ae28-bef2ffad4877") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:37:59.643733 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:59.643666 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:59.643733 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:59.643711 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs podName:3445bb02-600f-40d2-95c6-d4616e5630c5 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:31.643696131 +0000 UTC m=+66.247850118 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs") pod "network-metrics-daemon-9cw24" (UID: "3445bb02-600f-40d2-95c6-d4616e5630c5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:59.643978 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:59.643834 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:37:59.643978 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.643842 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8628c438-3340-49c9-bbba-e74cdc0b5926-config-volume\") pod \"dns-default-45l64\" (UID: \"8628c438-3340-49c9-bbba-e74cdc0b5926\") " pod="openshift-dns/dns-default-45l64"
Apr 22 18:37:59.643978 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:59.643876 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls podName:8628c438-3340-49c9-bbba-e74cdc0b5926 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:00.143863101 +0000 UTC m=+34.748017084 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls") pod "dns-default-45l64" (UID: "8628c438-3340-49c9-bbba-e74cdc0b5926") : secret "dns-default-metrics-tls" not found
Apr 22 18:37:59.643978 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:59.643919 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:37:59.643978 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:37:59.643971 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert podName:700dd0a8-270f-4acc-bbc0-dc7d43241d5f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:00.143948171 +0000 UTC m=+34.748102157 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert") pod "ingress-canary-srk76" (UID: "700dd0a8-270f-4acc-bbc0-dc7d43241d5f") : secret "canary-serving-cert" not found
Apr 22 18:37:59.652569 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.652538 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4c7t\" (UniqueName: \"kubernetes.io/projected/8628c438-3340-49c9-bbba-e74cdc0b5926-kube-api-access-d4c7t\") pod \"dns-default-45l64\" (UID: \"8628c438-3340-49c9-bbba-e74cdc0b5926\") " pod="openshift-dns/dns-default-45l64"
Apr 22 18:37:59.652664 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:37:59.652603 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2scm\" (UniqueName: \"kubernetes.io/projected/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-kube-api-access-j2scm\") pod \"ingress-canary-srk76\" (UID: \"700dd0a8-270f-4acc-bbc0-dc7d43241d5f\") " pod="openshift-ingress-canary/ingress-canary-srk76"
Apr 22 18:38:00.045955 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:00.045913 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc"
Apr 22 18:38:00.046704 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:00.046071 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:38:00.046704 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:00.046087 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-689895f57b-dkjvc: secret "image-registry-tls" not found
Apr 22 18:38:00.046704 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:00.046169 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls podName:51c59af5-e091-4da9-bd46-b4c3f52ba283 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:01.046151079 +0000 UTC m=+35.650305062 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls") pod "image-registry-689895f57b-dkjvc" (UID: "51c59af5-e091-4da9-bd46-b4c3f52ba283") : secret "image-registry-tls" not found
Apr 22 18:38:00.147089 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:00.147042 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert\") pod \"ingress-canary-srk76\" (UID: \"700dd0a8-270f-4acc-bbc0-dc7d43241d5f\") " pod="openshift-ingress-canary/ingress-canary-srk76"
Apr 22 18:38:00.147089 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:00.147093 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls\") pod \"dns-default-45l64\" (UID: \"8628c438-3340-49c9-bbba-e74cdc0b5926\") " pod="openshift-dns/dns-default-45l64"
Apr 22 18:38:00.147323 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:00.147214 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:38:00.147323 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:00.147215 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:38:00.147323 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:00.147272 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert podName:700dd0a8-270f-4acc-bbc0-dc7d43241d5f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:01.147256711 +0000 UTC m=+35.751410696 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert") pod "ingress-canary-srk76" (UID: "700dd0a8-270f-4acc-bbc0-dc7d43241d5f") : secret "canary-serving-cert" not found
Apr 22 18:38:00.147323 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:00.147290 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls podName:8628c438-3340-49c9-bbba-e74cdc0b5926 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:01.147280361 +0000 UTC m=+35.751434347 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls") pod "dns-default-45l64" (UID: "8628c438-3340-49c9-bbba-e74cdc0b5926") : secret "dns-default-metrics-tls" not found
Apr 22 18:38:00.863815 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:00.863783 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k99dx"
Apr 22 18:38:00.863815 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:00.863813 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9cw24"
Apr 22 18:38:00.864007 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:00.863979 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4fnlv"
Apr 22 18:38:00.867596 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:00.867567 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 18:38:00.867596 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:00.867581 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 18:38:00.867784 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:00.867607 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-stzzq\""
Apr 22 18:38:00.867784 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:00.867608 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 18:38:00.867784 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:00.867674 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-djgkc\""
Apr 22 18:38:00.867902 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:00.867845 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 18:38:01.054922 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:01.054897 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc"
Apr 22 18:38:01.055230 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:01.055020 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:38:01.055230 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:01.055032 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-689895f57b-dkjvc: secret "image-registry-tls" not found
Apr 22 18:38:01.055230 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:01.055078 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls podName:51c59af5-e091-4da9-bd46-b4c3f52ba283 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:03.05506468 +0000 UTC m=+37.659218666 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls") pod "image-registry-689895f57b-dkjvc" (UID: "51c59af5-e091-4da9-bd46-b4c3f52ba283") : secret "image-registry-tls" not found
Apr 22 18:38:01.155423 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:01.155401 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert\") pod \"ingress-canary-srk76\" (UID: \"700dd0a8-270f-4acc-bbc0-dc7d43241d5f\") " pod="openshift-ingress-canary/ingress-canary-srk76"
Apr 22 18:38:01.155498 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:01.155444 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls\") pod \"dns-default-45l64\" (UID: \"8628c438-3340-49c9-bbba-e74cdc0b5926\") " pod="openshift-dns/dns-default-45l64"
Apr 22 18:38:01.155552 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:01.155540 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:38:01.155603 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:01.155580 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:38:01.155654 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:01.155603 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert podName:700dd0a8-270f-4acc-bbc0-dc7d43241d5f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:03.155584591 +0000 UTC m=+37.759738575 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert") pod "ingress-canary-srk76" (UID: "700dd0a8-270f-4acc-bbc0-dc7d43241d5f") : secret "canary-serving-cert" not found
Apr 22 18:38:01.155654 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:01.155626 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls podName:8628c438-3340-49c9-bbba-e74cdc0b5926 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:03.155611839 +0000 UTC m=+37.759765824 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls") pod "dns-default-45l64" (UID: "8628c438-3340-49c9-bbba-e74cdc0b5926") : secret "dns-default-metrics-tls" not found
Apr 22 18:38:02.019668 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:02.019504 2572 generic.go:358] "Generic (PLEG): container finished" podID="5b646830-7c62-445d-8fa2-df72174ec0c3" containerID="639f847e8026e86b1be60d9564ba1cb2c5d00209628eeb5263f3583b9cf85a00" exitCode=0
Apr 22 18:38:02.019668 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:02.019547 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9md8j" event={"ID":"5b646830-7c62-445d-8fa2-df72174ec0c3","Type":"ContainerDied","Data":"639f847e8026e86b1be60d9564ba1cb2c5d00209628eeb5263f3583b9cf85a00"}
Apr 22 18:38:03.024478 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:03.024402 2572 generic.go:358] "Generic (PLEG): container finished" podID="5b646830-7c62-445d-8fa2-df72174ec0c3" containerID="9535c38f4820d077c08476875a990ab12109e50c33e8172b8f5ad60a3732f48d" exitCode=0
Apr 22 18:38:03.024478 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:03.024468 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9md8j" event={"ID":"5b646830-7c62-445d-8fa2-df72174ec0c3","Type":"ContainerDied","Data":"9535c38f4820d077c08476875a990ab12109e50c33e8172b8f5ad60a3732f48d"}
Apr 22 18:38:03.069759 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:03.069695 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc"
Apr 22 18:38:03.069874 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:03.069815 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:38:03.069874 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:03.069833 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-689895f57b-dkjvc: secret "image-registry-tls" not found
Apr 22 18:38:03.069969 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:03.069888 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls podName:51c59af5-e091-4da9-bd46-b4c3f52ba283 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:07.069871216 +0000 UTC m=+41.674025199 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls") pod "image-registry-689895f57b-dkjvc" (UID: "51c59af5-e091-4da9-bd46-b4c3f52ba283") : secret "image-registry-tls" not found
Apr 22 18:38:03.170059 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:03.170029 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert\") pod \"ingress-canary-srk76\" (UID: \"700dd0a8-270f-4acc-bbc0-dc7d43241d5f\") " pod="openshift-ingress-canary/ingress-canary-srk76"
Apr 22 18:38:03.170188 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:03.170110 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls\") pod \"dns-default-45l64\" (UID: \"8628c438-3340-49c9-bbba-e74cdc0b5926\") " pod="openshift-dns/dns-default-45l64"
Apr 22 18:38:03.170188 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:03.170145 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:38:03.170313 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:03.170211 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert podName:700dd0a8-270f-4acc-bbc0-dc7d43241d5f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:07.170194878 +0000 UTC m=+41.774348880 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert") pod "ingress-canary-srk76" (UID: "700dd0a8-270f-4acc-bbc0-dc7d43241d5f") : secret "canary-serving-cert" not found
Apr 22 18:38:03.170313 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:03.170247 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:38:03.170422 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:03.170346 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls podName:8628c438-3340-49c9-bbba-e74cdc0b5926 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:07.1703289 +0000 UTC m=+41.774482892 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls") pod "dns-default-45l64" (UID: "8628c438-3340-49c9-bbba-e74cdc0b5926") : secret "dns-default-metrics-tls" not found
Apr 22 18:38:04.029315 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:04.029275 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9md8j" event={"ID":"5b646830-7c62-445d-8fa2-df72174ec0c3","Type":"ContainerStarted","Data":"63a7ec774807bf4007b00b4f6cade06377028006c3eb3542cb6e72f74ff81000"}
Apr 22 18:38:04.055754 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:04.055705 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9md8j" podStartSLOduration=5.565475295 podStartE2EDuration="38.055691117s" podCreationTimestamp="2026-04-22 18:37:26 +0000 UTC" firstStartedPulling="2026-04-22 18:37:28.633074017 +0000 UTC m=+3.237228000" lastFinishedPulling="2026-04-22 18:38:01.12328983 +0000 UTC m=+35.727443822" observedRunningTime="2026-04-22 18:38:04.053721444 +0000 UTC m=+38.657875448" watchObservedRunningTime="2026-04-22 18:38:04.055691117 +0000 UTC m=+38.659845121"
Apr 22 18:38:07.096289 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:07.096251 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc"
Apr 22 18:38:07.096680 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:07.096387 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:38:07.096680 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:07.096404 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-689895f57b-dkjvc: secret "image-registry-tls" not found
Apr 22 18:38:07.096680 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:07.096455 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls podName:51c59af5-e091-4da9-bd46-b4c3f52ba283 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:15.096441315 +0000 UTC m=+49.700595296 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls") pod "image-registry-689895f57b-dkjvc" (UID: "51c59af5-e091-4da9-bd46-b4c3f52ba283") : secret "image-registry-tls" not found
Apr 22 18:38:07.197054 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:07.197019 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert\") pod \"ingress-canary-srk76\" (UID: \"700dd0a8-270f-4acc-bbc0-dc7d43241d5f\") " pod="openshift-ingress-canary/ingress-canary-srk76"
Apr 22 18:38:07.197054 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:07.197055 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls\") pod \"dns-default-45l64\" (UID: \"8628c438-3340-49c9-bbba-e74cdc0b5926\") " pod="openshift-dns/dns-default-45l64"
Apr 22 18:38:07.197269 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:07.197176 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:38:07.197269 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:07.197215 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:38:07.197269 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:07.197232 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert podName:700dd0a8-270f-4acc-bbc0-dc7d43241d5f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:15.197218228 +0000 UTC m=+49.801372210 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert") pod "ingress-canary-srk76" (UID: "700dd0a8-270f-4acc-bbc0-dc7d43241d5f") : secret "canary-serving-cert" not found
Apr 22 18:38:07.197269 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:07.197252 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls podName:8628c438-3340-49c9-bbba-e74cdc0b5926 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:15.197241594 +0000 UTC m=+49.801395576 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls") pod "dns-default-45l64" (UID: "8628c438-3340-49c9-bbba-e74cdc0b5926") : secret "dns-default-metrics-tls" not found
Apr 22 18:38:15.149077 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:15.149040 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc"
Apr 22 18:38:15.149483 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:15.149195 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:38:15.149483 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:15.149213 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-689895f57b-dkjvc: secret "image-registry-tls" not found
Apr 22 18:38:15.149483 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:15.149264 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls podName:51c59af5-e091-4da9-bd46-b4c3f52ba283 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:31.149249911 +0000 UTC m=+65.753403897 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls") pod "image-registry-689895f57b-dkjvc" (UID: "51c59af5-e091-4da9-bd46-b4c3f52ba283") : secret "image-registry-tls" not found Apr 22 18:38:15.250100 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:15.250075 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls\") pod \"dns-default-45l64\" (UID: \"8628c438-3340-49c9-bbba-e74cdc0b5926\") " pod="openshift-dns/dns-default-45l64" Apr 22 18:38:15.250243 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:15.250171 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert\") pod \"ingress-canary-srk76\" (UID: \"700dd0a8-270f-4acc-bbc0-dc7d43241d5f\") " pod="openshift-ingress-canary/ingress-canary-srk76" Apr 22 18:38:15.250243 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:15.250211 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:38:15.250308 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:15.250246 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:38:15.250308 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:15.250264 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls podName:8628c438-3340-49c9-bbba-e74cdc0b5926 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:31.250251327 +0000 UTC m=+65.854405309 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls") pod "dns-default-45l64" (UID: "8628c438-3340-49c9-bbba-e74cdc0b5926") : secret "dns-default-metrics-tls" not found Apr 22 18:38:15.250308 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:15.250280 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert podName:700dd0a8-270f-4acc-bbc0-dc7d43241d5f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:31.250269716 +0000 UTC m=+65.854423703 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert") pod "ingress-canary-srk76" (UID: "700dd0a8-270f-4acc-bbc0-dc7d43241d5f") : secret "canary-serving-cert" not found Apr 22 18:38:25.015009 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:25.014981 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gwnpr" Apr 22 18:38:31.166864 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:31.166832 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc" Apr 22 18:38:31.167246 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:31.166932 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:38:31.167246 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:31.166943 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-689895f57b-dkjvc: secret "image-registry-tls" not found Apr 22 
18:38:31.167246 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:31.166993 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls podName:51c59af5-e091-4da9-bd46-b4c3f52ba283 nodeName:}" failed. No retries permitted until 2026-04-22 18:39:03.166978613 +0000 UTC m=+97.771132595 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls") pod "image-registry-689895f57b-dkjvc" (UID: "51c59af5-e091-4da9-bd46-b4c3f52ba283") : secret "image-registry-tls" not found Apr 22 18:38:31.267418 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:31.267383 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls\") pod \"dns-default-45l64\" (UID: \"8628c438-3340-49c9-bbba-e74cdc0b5926\") " pod="openshift-dns/dns-default-45l64" Apr 22 18:38:31.267538 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:31.267494 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert\") pod \"ingress-canary-srk76\" (UID: \"700dd0a8-270f-4acc-bbc0-dc7d43241d5f\") " pod="openshift-ingress-canary/ingress-canary-srk76" Apr 22 18:38:31.267538 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:31.267525 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:38:31.267606 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:31.267577 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls podName:8628c438-3340-49c9-bbba-e74cdc0b5926 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:39:03.267563039 +0000 UTC m=+97.871717021 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls") pod "dns-default-45l64" (UID: "8628c438-3340-49c9-bbba-e74cdc0b5926") : secret "dns-default-metrics-tls" not found Apr 22 18:38:31.267606 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:31.267579 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:38:31.267684 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:31.267628 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert podName:700dd0a8-270f-4acc-bbc0-dc7d43241d5f nodeName:}" failed. No retries permitted until 2026-04-22 18:39:03.267613291 +0000 UTC m=+97.871767283 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert") pod "ingress-canary-srk76" (UID: "700dd0a8-270f-4acc-bbc0-dc7d43241d5f") : secret "canary-serving-cert" not found Apr 22 18:38:31.569247 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:31.569167 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s82jd\" (UniqueName: \"kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd\") pod \"network-check-target-4fnlv\" (UID: \"2bfdae94-a7e1-48cd-8cad-b09cbb620c26\") " pod="openshift-network-diagnostics/network-check-target-4fnlv" Apr 22 18:38:31.572015 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:31.571995 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:38:31.582362 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:31.582344 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:38:31.593571 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:31.593549 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s82jd\" (UniqueName: \"kubernetes.io/projected/2bfdae94-a7e1-48cd-8cad-b09cbb620c26-kube-api-access-s82jd\") pod \"network-check-target-4fnlv\" (UID: \"2bfdae94-a7e1-48cd-8cad-b09cbb620c26\") " pod="openshift-network-diagnostics/network-check-target-4fnlv" Apr 22 18:38:31.669707 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:31.669685 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs\") pod \"network-metrics-daemon-9cw24\" (UID: \"3445bb02-600f-40d2-95c6-d4616e5630c5\") " pod="openshift-multus/network-metrics-daemon-9cw24" Apr 22 18:38:31.669820 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:31.669715 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret\") pod \"global-pull-secret-syncer-k99dx\" (UID: \"af564c1a-d59c-4ed2-ae28-bef2ffad4877\") " pod="kube-system/global-pull-secret-syncer-k99dx" Apr 22 18:38:31.672258 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:31.672243 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:38:31.672320 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:31.672287 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:38:31.680069 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:31.680052 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:38:31.680173 ip-10-0-129-26 
kubenswrapper[2572]: E0422 18:38:31.680103 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs podName:3445bb02-600f-40d2-95c6-d4616e5630c5 nodeName:}" failed. No retries permitted until 2026-04-22 18:39:35.680088995 +0000 UTC m=+130.284242977 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs") pod "network-metrics-daemon-9cw24" (UID: "3445bb02-600f-40d2-95c6-d4616e5630c5") : secret "metrics-daemon-secret" not found Apr 22 18:38:31.682155 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:31.682117 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/af564c1a-d59c-4ed2-ae28-bef2ffad4877-original-pull-secret\") pod \"global-pull-secret-syncer-k99dx\" (UID: \"af564c1a-d59c-4ed2-ae28-bef2ffad4877\") " pod="kube-system/global-pull-secret-syncer-k99dx" Apr 22 18:38:31.776927 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:31.776904 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k99dx" Apr 22 18:38:31.789394 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:31.789376 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-djgkc\"" Apr 22 18:38:31.797526 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:31.797499 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4fnlv" Apr 22 18:38:31.966032 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:31.966003 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-k99dx"] Apr 22 18:38:31.968545 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:31.968511 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4fnlv"] Apr 22 18:38:31.969531 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:38:31.969508 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf564c1a_d59c_4ed2_ae28_bef2ffad4877.slice/crio-d1d5bf9324b1bbe01b6596478a185f1e784237db38a9983f188222ec36233a8e WatchSource:0}: Error finding container d1d5bf9324b1bbe01b6596478a185f1e784237db38a9983f188222ec36233a8e: Status 404 returned error can't find the container with id d1d5bf9324b1bbe01b6596478a185f1e784237db38a9983f188222ec36233a8e Apr 22 18:38:31.972332 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:38:31.972303 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bfdae94_a7e1_48cd_8cad_b09cbb620c26.slice/crio-27038b8bf65ed4815e26a48f00681f53533305ad94a1c63d92478dd5ef3178d0 WatchSource:0}: Error finding container 27038b8bf65ed4815e26a48f00681f53533305ad94a1c63d92478dd5ef3178d0: Status 404 returned error can't find the container with id 27038b8bf65ed4815e26a48f00681f53533305ad94a1c63d92478dd5ef3178d0 Apr 22 18:38:32.080465 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:32.080437 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4fnlv" event={"ID":"2bfdae94-a7e1-48cd-8cad-b09cbb620c26","Type":"ContainerStarted","Data":"27038b8bf65ed4815e26a48f00681f53533305ad94a1c63d92478dd5ef3178d0"} Apr 22 18:38:32.081357 ip-10-0-129-26 kubenswrapper[2572]: 
I0422 18:38:32.081334 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-k99dx" event={"ID":"af564c1a-d59c-4ed2-ae28-bef2ffad4877","Type":"ContainerStarted","Data":"d1d5bf9324b1bbe01b6596478a185f1e784237db38a9983f188222ec36233a8e"} Apr 22 18:38:37.093745 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:37.093707 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4fnlv" event={"ID":"2bfdae94-a7e1-48cd-8cad-b09cbb620c26","Type":"ContainerStarted","Data":"42bb67228fc574c74aac11a1084e696eddc164a2a1ab6c6eb1328f78db585166"} Apr 22 18:38:37.094195 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:37.093926 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-4fnlv" Apr 22 18:38:37.098923 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:37.098890 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-k99dx" event={"ID":"af564c1a-d59c-4ed2-ae28-bef2ffad4877","Type":"ContainerStarted","Data":"1a87a56929264957d584dbd7774229669adc862115faf6595664d175f6294ff6"} Apr 22 18:38:37.113656 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:37.113606 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-4fnlv" podStartSLOduration=66.174544959 podStartE2EDuration="1m11.113590856s" podCreationTimestamp="2026-04-22 18:37:26 +0000 UTC" firstStartedPulling="2026-04-22 18:38:31.974865032 +0000 UTC m=+66.579019018" lastFinishedPulling="2026-04-22 18:38:36.91391093 +0000 UTC m=+71.518064915" observedRunningTime="2026-04-22 18:38:37.112645713 +0000 UTC m=+71.716799708" watchObservedRunningTime="2026-04-22 18:38:37.113590856 +0000 UTC m=+71.717744860" Apr 22 18:38:48.425859 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.425725 2572 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kube-system/global-pull-secret-syncer-k99dx" podStartSLOduration=77.484472772 podStartE2EDuration="1m22.425710474s" podCreationTimestamp="2026-04-22 18:37:26 +0000 UTC" firstStartedPulling="2026-04-22 18:38:31.971283441 +0000 UTC m=+66.575437423" lastFinishedPulling="2026-04-22 18:38:36.912521143 +0000 UTC m=+71.516675125" observedRunningTime="2026-04-22 18:38:37.128023006 +0000 UTC m=+71.732177059" watchObservedRunningTime="2026-04-22 18:38:48.425710474 +0000 UTC m=+83.029864530" Apr 22 18:38:48.426281 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.426033 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-459cl"] Apr 22 18:38:48.429197 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.429177 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-6f6sj"] Apr 22 18:38:48.429332 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.429318 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-459cl" Apr 22 18:38:48.431800 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.431780 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" Apr 22 18:38:48.431900 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.431801 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 18:38:48.431960 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.431894 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-g7zzm\"" Apr 22 18:38:48.432385 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.432359 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:38:48.432850 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.432826 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 18:38:48.433080 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.433062 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 18:38:48.435029 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.435004 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:38:48.435117 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.435032 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 18:38:48.435284 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.435267 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 18:38:48.435365 ip-10-0-129-26 
kubenswrapper[2572]: I0422 18:38:48.435341 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 18:38:48.435752 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.435730 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-jwqqb\"" Apr 22 18:38:48.440704 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.440686 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-6f6sj"] Apr 22 18:38:48.441451 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.441432 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-459cl"] Apr 22 18:38:48.441739 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.441723 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 18:38:48.493819 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.493800 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2508d1bd-b2cc-4edf-8bd2-34590e71e3e0-serving-cert\") pod \"console-operator-9d4b6777b-6f6sj\" (UID: \"2508d1bd-b2cc-4edf-8bd2-34590e71e3e0\") " pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" Apr 22 18:38:48.493913 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.493825 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2508d1bd-b2cc-4edf-8bd2-34590e71e3e0-trusted-ca\") pod \"console-operator-9d4b6777b-6f6sj\" (UID: \"2508d1bd-b2cc-4edf-8bd2-34590e71e3e0\") " pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" Apr 22 18:38:48.493913 ip-10-0-129-26 
kubenswrapper[2572]: I0422 18:38:48.493842 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf798\" (UniqueName: \"kubernetes.io/projected/2508d1bd-b2cc-4edf-8bd2-34590e71e3e0-kube-api-access-hf798\") pod \"console-operator-9d4b6777b-6f6sj\" (UID: \"2508d1bd-b2cc-4edf-8bd2-34590e71e3e0\") " pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" Apr 22 18:38:48.493983 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.493929 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/549f6ace-6f8f-4b1a-ab20-7550de5af2f4-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-459cl\" (UID: \"549f6ace-6f8f-4b1a-ab20-7550de5af2f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-459cl" Apr 22 18:38:48.493983 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.493953 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2508d1bd-b2cc-4edf-8bd2-34590e71e3e0-config\") pod \"console-operator-9d4b6777b-6f6sj\" (UID: \"2508d1bd-b2cc-4edf-8bd2-34590e71e3e0\") " pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" Apr 22 18:38:48.494047 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.493994 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k68kx\" (UniqueName: \"kubernetes.io/projected/549f6ace-6f8f-4b1a-ab20-7550de5af2f4-kube-api-access-k68kx\") pod \"kube-storage-version-migrator-operator-6769c5d45-459cl\" (UID: \"549f6ace-6f8f-4b1a-ab20-7550de5af2f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-459cl" Apr 22 18:38:48.494047 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.494015 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/549f6ace-6f8f-4b1a-ab20-7550de5af2f4-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-459cl\" (UID: \"549f6ace-6f8f-4b1a-ab20-7550de5af2f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-459cl" Apr 22 18:38:48.594576 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.594550 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2508d1bd-b2cc-4edf-8bd2-34590e71e3e0-serving-cert\") pod \"console-operator-9d4b6777b-6f6sj\" (UID: \"2508d1bd-b2cc-4edf-8bd2-34590e71e3e0\") " pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" Apr 22 18:38:48.594667 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.594578 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2508d1bd-b2cc-4edf-8bd2-34590e71e3e0-trusted-ca\") pod \"console-operator-9d4b6777b-6f6sj\" (UID: \"2508d1bd-b2cc-4edf-8bd2-34590e71e3e0\") " pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" Apr 22 18:38:48.594667 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.594595 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hf798\" (UniqueName: \"kubernetes.io/projected/2508d1bd-b2cc-4edf-8bd2-34590e71e3e0-kube-api-access-hf798\") pod \"console-operator-9d4b6777b-6f6sj\" (UID: \"2508d1bd-b2cc-4edf-8bd2-34590e71e3e0\") " pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" Apr 22 18:38:48.594667 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.594637 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/549f6ace-6f8f-4b1a-ab20-7550de5af2f4-config\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-459cl\" (UID: \"549f6ace-6f8f-4b1a-ab20-7550de5af2f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-459cl" Apr 22 18:38:48.594787 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.594667 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2508d1bd-b2cc-4edf-8bd2-34590e71e3e0-config\") pod \"console-operator-9d4b6777b-6f6sj\" (UID: \"2508d1bd-b2cc-4edf-8bd2-34590e71e3e0\") " pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" Apr 22 18:38:48.594787 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.594689 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k68kx\" (UniqueName: \"kubernetes.io/projected/549f6ace-6f8f-4b1a-ab20-7550de5af2f4-kube-api-access-k68kx\") pod \"kube-storage-version-migrator-operator-6769c5d45-459cl\" (UID: \"549f6ace-6f8f-4b1a-ab20-7550de5af2f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-459cl" Apr 22 18:38:48.594787 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.594710 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/549f6ace-6f8f-4b1a-ab20-7550de5af2f4-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-459cl\" (UID: \"549f6ace-6f8f-4b1a-ab20-7550de5af2f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-459cl" Apr 22 18:38:48.595523 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.595456 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2508d1bd-b2cc-4edf-8bd2-34590e71e3e0-config\") pod \"console-operator-9d4b6777b-6f6sj\" (UID: \"2508d1bd-b2cc-4edf-8bd2-34590e71e3e0\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" Apr 22 18:38:48.595523 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.595572 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2508d1bd-b2cc-4edf-8bd2-34590e71e3e0-trusted-ca\") pod \"console-operator-9d4b6777b-6f6sj\" (UID: \"2508d1bd-b2cc-4edf-8bd2-34590e71e3e0\") " pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" Apr 22 18:38:48.595956 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.595928 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/549f6ace-6f8f-4b1a-ab20-7550de5af2f4-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-459cl\" (UID: \"549f6ace-6f8f-4b1a-ab20-7550de5af2f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-459cl" Apr 22 18:38:48.597251 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.597229 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/549f6ace-6f8f-4b1a-ab20-7550de5af2f4-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-459cl\" (UID: \"549f6ace-6f8f-4b1a-ab20-7550de5af2f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-459cl" Apr 22 18:38:48.597356 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.597275 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2508d1bd-b2cc-4edf-8bd2-34590e71e3e0-serving-cert\") pod \"console-operator-9d4b6777b-6f6sj\" (UID: \"2508d1bd-b2cc-4edf-8bd2-34590e71e3e0\") " pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" Apr 22 18:38:48.604249 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.604223 2572 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-k68kx\" (UniqueName: \"kubernetes.io/projected/549f6ace-6f8f-4b1a-ab20-7550de5af2f4-kube-api-access-k68kx\") pod \"kube-storage-version-migrator-operator-6769c5d45-459cl\" (UID: \"549f6ace-6f8f-4b1a-ab20-7550de5af2f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-459cl" Apr 22 18:38:48.604900 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.604878 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf798\" (UniqueName: \"kubernetes.io/projected/2508d1bd-b2cc-4edf-8bd2-34590e71e3e0-kube-api-access-hf798\") pod \"console-operator-9d4b6777b-6f6sj\" (UID: \"2508d1bd-b2cc-4edf-8bd2-34590e71e3e0\") " pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" Apr 22 18:38:48.743483 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.743433 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-459cl" Apr 22 18:38:48.748110 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.748088 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" Apr 22 18:38:48.868212 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.868070 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-459cl"] Apr 22 18:38:48.870551 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:38:48.870525 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod549f6ace_6f8f_4b1a_ab20_7550de5af2f4.slice/crio-97469012ec83be655acfeb1f19c930ead2ed7e46d87194fcc4d914ea19b896e8 WatchSource:0}: Error finding container 97469012ec83be655acfeb1f19c930ead2ed7e46d87194fcc4d914ea19b896e8: Status 404 returned error can't find the container with id 97469012ec83be655acfeb1f19c930ead2ed7e46d87194fcc4d914ea19b896e8 Apr 22 18:38:48.879864 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:48.879836 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-6f6sj"] Apr 22 18:38:48.882880 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:38:48.882857 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2508d1bd_b2cc_4edf_8bd2_34590e71e3e0.slice/crio-067ac95a5d31e8d9f9aac3d5ad806363c95a061081879d7419342f7aca11d419 WatchSource:0}: Error finding container 067ac95a5d31e8d9f9aac3d5ad806363c95a061081879d7419342f7aca11d419: Status 404 returned error can't find the container with id 067ac95a5d31e8d9f9aac3d5ad806363c95a061081879d7419342f7aca11d419 Apr 22 18:38:49.120021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:49.118881 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-459cl" 
event={"ID":"549f6ace-6f8f-4b1a-ab20-7550de5af2f4","Type":"ContainerStarted","Data":"97469012ec83be655acfeb1f19c930ead2ed7e46d87194fcc4d914ea19b896e8"} Apr 22 18:38:49.121183 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:49.121155 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" event={"ID":"2508d1bd-b2cc-4edf-8bd2-34590e71e3e0","Type":"ContainerStarted","Data":"067ac95a5d31e8d9f9aac3d5ad806363c95a061081879d7419342f7aca11d419"} Apr 22 18:38:51.126937 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:51.126911 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6f6sj_2508d1bd-b2cc-4edf-8bd2-34590e71e3e0/console-operator/0.log" Apr 22 18:38:51.127286 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:51.126951 2572 generic.go:358] "Generic (PLEG): container finished" podID="2508d1bd-b2cc-4edf-8bd2-34590e71e3e0" containerID="a605635d9545a8c79c34edcc7bcac7c567c799cfa9ab37916c4c6a9b766265bc" exitCode=255 Apr 22 18:38:51.127286 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:51.126988 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" event={"ID":"2508d1bd-b2cc-4edf-8bd2-34590e71e3e0","Type":"ContainerDied","Data":"a605635d9545a8c79c34edcc7bcac7c567c799cfa9ab37916c4c6a9b766265bc"} Apr 22 18:38:51.127286 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:51.127207 2572 scope.go:117] "RemoveContainer" containerID="a605635d9545a8c79c34edcc7bcac7c567c799cfa9ab37916c4c6a9b766265bc" Apr 22 18:38:52.130951 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:52.130916 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-459cl" event={"ID":"549f6ace-6f8f-4b1a-ab20-7550de5af2f4","Type":"ContainerStarted","Data":"88321f7f649b9446959e6eb8b0876d9ff39878d9831514cd2814c34989de7e97"} Apr 
22 18:38:52.132404 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:52.132384 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6f6sj_2508d1bd-b2cc-4edf-8bd2-34590e71e3e0/console-operator/1.log" Apr 22 18:38:52.132757 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:52.132739 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6f6sj_2508d1bd-b2cc-4edf-8bd2-34590e71e3e0/console-operator/0.log" Apr 22 18:38:52.132841 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:52.132780 2572 generic.go:358] "Generic (PLEG): container finished" podID="2508d1bd-b2cc-4edf-8bd2-34590e71e3e0" containerID="835af04bcd245003c9a7c9d98a4470ac3d59a55c3f70831a80c4626329080af3" exitCode=255 Apr 22 18:38:52.132841 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:52.132809 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" event={"ID":"2508d1bd-b2cc-4edf-8bd2-34590e71e3e0","Type":"ContainerDied","Data":"835af04bcd245003c9a7c9d98a4470ac3d59a55c3f70831a80c4626329080af3"} Apr 22 18:38:52.132841 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:52.132837 2572 scope.go:117] "RemoveContainer" containerID="a605635d9545a8c79c34edcc7bcac7c567c799cfa9ab37916c4c6a9b766265bc" Apr 22 18:38:52.133065 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:52.133051 2572 scope.go:117] "RemoveContainer" containerID="835af04bcd245003c9a7c9d98a4470ac3d59a55c3f70831a80c4626329080af3" Apr 22 18:38:52.133255 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:52.133237 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-6f6sj_openshift-console-operator(2508d1bd-b2cc-4edf-8bd2-34590e71e3e0)\"" pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" 
podUID="2508d1bd-b2cc-4edf-8bd2-34590e71e3e0" Apr 22 18:38:52.147052 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:52.147014 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-459cl" podStartSLOduration=1.037289622 podStartE2EDuration="4.147003571s" podCreationTimestamp="2026-04-22 18:38:48 +0000 UTC" firstStartedPulling="2026-04-22 18:38:48.872360559 +0000 UTC m=+83.476514541" lastFinishedPulling="2026-04-22 18:38:51.982074504 +0000 UTC m=+86.586228490" observedRunningTime="2026-04-22 18:38:52.145944412 +0000 UTC m=+86.750098416" watchObservedRunningTime="2026-04-22 18:38:52.147003571 +0000 UTC m=+86.751157575" Apr 22 18:38:53.135507 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:53.135475 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6f6sj_2508d1bd-b2cc-4edf-8bd2-34590e71e3e0/console-operator/1.log" Apr 22 18:38:53.135972 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:53.135953 2572 scope.go:117] "RemoveContainer" containerID="835af04bcd245003c9a7c9d98a4470ac3d59a55c3f70831a80c4626329080af3" Apr 22 18:38:53.136152 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:53.136111 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-6f6sj_openshift-console-operator(2508d1bd-b2cc-4edf-8bd2-34590e71e3e0)\"" pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" podUID="2508d1bd-b2cc-4edf-8bd2-34590e71e3e0" Apr 22 18:38:54.729569 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:54.729543 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7svrr_482a486e-af9c-4b33-8137-5fbd716f3fc2/dns-node-resolver/0.log" Apr 22 18:38:55.334591 ip-10-0-129-26 
kubenswrapper[2572]: I0422 18:38:55.334561 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jmv8s"] Apr 22 18:38:55.338468 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:55.338453 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-jmv8s" Apr 22 18:38:55.341237 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:55.341218 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 18:38:55.341349 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:55.341289 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-w9kdb\"" Apr 22 18:38:55.341521 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:55.341507 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 18:38:55.342064 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:55.342052 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 18:38:55.342114 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:55.342101 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 18:38:55.347147 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:55.347105 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jmv8s"] Apr 22 18:38:55.449281 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:55.449252 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e93259aa-b7b9-4ba3-b713-2a7d80a35046-signing-cabundle\") pod \"service-ca-865cb79987-jmv8s\" (UID: \"e93259aa-b7b9-4ba3-b713-2a7d80a35046\") " 
pod="openshift-service-ca/service-ca-865cb79987-jmv8s" Apr 22 18:38:55.449379 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:55.449303 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvwjv\" (UniqueName: \"kubernetes.io/projected/e93259aa-b7b9-4ba3-b713-2a7d80a35046-kube-api-access-bvwjv\") pod \"service-ca-865cb79987-jmv8s\" (UID: \"e93259aa-b7b9-4ba3-b713-2a7d80a35046\") " pod="openshift-service-ca/service-ca-865cb79987-jmv8s" Apr 22 18:38:55.449454 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:55.449408 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e93259aa-b7b9-4ba3-b713-2a7d80a35046-signing-key\") pod \"service-ca-865cb79987-jmv8s\" (UID: \"e93259aa-b7b9-4ba3-b713-2a7d80a35046\") " pod="openshift-service-ca/service-ca-865cb79987-jmv8s" Apr 22 18:38:55.550516 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:55.550495 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e93259aa-b7b9-4ba3-b713-2a7d80a35046-signing-cabundle\") pod \"service-ca-865cb79987-jmv8s\" (UID: \"e93259aa-b7b9-4ba3-b713-2a7d80a35046\") " pod="openshift-service-ca/service-ca-865cb79987-jmv8s" Apr 22 18:38:55.550573 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:55.550534 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvwjv\" (UniqueName: \"kubernetes.io/projected/e93259aa-b7b9-4ba3-b713-2a7d80a35046-kube-api-access-bvwjv\") pod \"service-ca-865cb79987-jmv8s\" (UID: \"e93259aa-b7b9-4ba3-b713-2a7d80a35046\") " pod="openshift-service-ca/service-ca-865cb79987-jmv8s" Apr 22 18:38:55.550607 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:55.550586 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/e93259aa-b7b9-4ba3-b713-2a7d80a35046-signing-key\") pod \"service-ca-865cb79987-jmv8s\" (UID: \"e93259aa-b7b9-4ba3-b713-2a7d80a35046\") " pod="openshift-service-ca/service-ca-865cb79987-jmv8s" Apr 22 18:38:55.551140 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:55.551104 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e93259aa-b7b9-4ba3-b713-2a7d80a35046-signing-cabundle\") pod \"service-ca-865cb79987-jmv8s\" (UID: \"e93259aa-b7b9-4ba3-b713-2a7d80a35046\") " pod="openshift-service-ca/service-ca-865cb79987-jmv8s" Apr 22 18:38:55.552901 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:55.552885 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e93259aa-b7b9-4ba3-b713-2a7d80a35046-signing-key\") pod \"service-ca-865cb79987-jmv8s\" (UID: \"e93259aa-b7b9-4ba3-b713-2a7d80a35046\") " pod="openshift-service-ca/service-ca-865cb79987-jmv8s" Apr 22 18:38:55.559745 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:55.559727 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvwjv\" (UniqueName: \"kubernetes.io/projected/e93259aa-b7b9-4ba3-b713-2a7d80a35046-kube-api-access-bvwjv\") pod \"service-ca-865cb79987-jmv8s\" (UID: \"e93259aa-b7b9-4ba3-b713-2a7d80a35046\") " pod="openshift-service-ca/service-ca-865cb79987-jmv8s" Apr 22 18:38:55.647328 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:55.647281 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-jmv8s" Apr 22 18:38:55.759657 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:55.759628 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jmv8s"] Apr 22 18:38:55.762290 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:38:55.762257 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode93259aa_b7b9_4ba3_b713_2a7d80a35046.slice/crio-36b0ac5951ce9b2dc256b093be8411193e1de8bd0dd792a16d4280e8b928402d WatchSource:0}: Error finding container 36b0ac5951ce9b2dc256b093be8411193e1de8bd0dd792a16d4280e8b928402d: Status 404 returned error can't find the container with id 36b0ac5951ce9b2dc256b093be8411193e1de8bd0dd792a16d4280e8b928402d Apr 22 18:38:55.928390 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:55.928319 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nx5jx_d748c1b5-2936-4573-b185-b6e9da61d104/node-ca/0.log" Apr 22 18:38:56.142226 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:56.142193 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-jmv8s" event={"ID":"e93259aa-b7b9-4ba3-b713-2a7d80a35046","Type":"ContainerStarted","Data":"36b0ac5951ce9b2dc256b093be8411193e1de8bd0dd792a16d4280e8b928402d"} Apr 22 18:38:58.148352 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:58.148320 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-jmv8s" event={"ID":"e93259aa-b7b9-4ba3-b713-2a7d80a35046","Type":"ContainerStarted","Data":"1e67b86e313449a2891cb0df2893145189741777bd5c4cfb008439e14b78bbce"} Apr 22 18:38:58.164078 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:58.164028 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-jmv8s" podStartSLOduration=1.3640663960000001 
podStartE2EDuration="3.164014729s" podCreationTimestamp="2026-04-22 18:38:55 +0000 UTC" firstStartedPulling="2026-04-22 18:38:55.764034687 +0000 UTC m=+90.368188669" lastFinishedPulling="2026-04-22 18:38:57.563983007 +0000 UTC m=+92.168137002" observedRunningTime="2026-04-22 18:38:58.163804662 +0000 UTC m=+92.767958666" watchObservedRunningTime="2026-04-22 18:38:58.164014729 +0000 UTC m=+92.768168733" Apr 22 18:38:58.749214 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:58.749176 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" Apr 22 18:38:58.749402 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:58.749228 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" Apr 22 18:38:58.749672 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:38:58.749659 2572 scope.go:117] "RemoveContainer" containerID="835af04bcd245003c9a7c9d98a4470ac3d59a55c3f70831a80c4626329080af3" Apr 22 18:38:58.749889 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:38:58.749870 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-6f6sj_openshift-console-operator(2508d1bd-b2cc-4edf-8bd2-34590e71e3e0)\"" pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" podUID="2508d1bd-b2cc-4edf-8bd2-34590e71e3e0" Apr 22 18:39:03.210072 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:03.210041 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc" Apr 22 18:39:03.212451 
ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:03.212431 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls\") pod \"image-registry-689895f57b-dkjvc\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") " pod="openshift-image-registry/image-registry-689895f57b-dkjvc" Apr 22 18:39:03.260039 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:03.260013 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fbmmk\"" Apr 22 18:39:03.267969 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:03.267951 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-689895f57b-dkjvc" Apr 22 18:39:03.310691 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:03.310658 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert\") pod \"ingress-canary-srk76\" (UID: \"700dd0a8-270f-4acc-bbc0-dc7d43241d5f\") " pod="openshift-ingress-canary/ingress-canary-srk76" Apr 22 18:39:03.310856 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:03.310711 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls\") pod \"dns-default-45l64\" (UID: \"8628c438-3340-49c9-bbba-e74cdc0b5926\") " pod="openshift-dns/dns-default-45l64" Apr 22 18:39:03.313953 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:03.313889 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8628c438-3340-49c9-bbba-e74cdc0b5926-metrics-tls\") pod \"dns-default-45l64\" (UID: \"8628c438-3340-49c9-bbba-e74cdc0b5926\") " pod="openshift-dns/dns-default-45l64" Apr 22 18:39:03.314076 
ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:03.313965 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/700dd0a8-270f-4acc-bbc0-dc7d43241d5f-cert\") pod \"ingress-canary-srk76\" (UID: \"700dd0a8-270f-4acc-bbc0-dc7d43241d5f\") " pod="openshift-ingress-canary/ingress-canary-srk76" Apr 22 18:39:03.403189 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:03.403158 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-689895f57b-dkjvc"] Apr 22 18:39:03.407716 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:39:03.407691 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51c59af5_e091_4da9_bd46_b4c3f52ba283.slice/crio-baecf60fb0c25c8bf63a2731a719c9e4808c9d1ef46a623959fef2787c60ae98 WatchSource:0}: Error finding container baecf60fb0c25c8bf63a2731a719c9e4808c9d1ef46a623959fef2787c60ae98: Status 404 returned error can't find the container with id baecf60fb0c25c8bf63a2731a719c9e4808c9d1ef46a623959fef2787c60ae98 Apr 22 18:39:03.572755 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:03.572724 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-cfw2h\"" Apr 22 18:39:03.580970 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:03.580948 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-45l64" Apr 22 18:39:03.589866 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:03.589841 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rz76n\"" Apr 22 18:39:03.598777 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:03.598755 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-srk76" Apr 22 18:39:03.733304 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:03.733238 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-45l64"] Apr 22 18:39:03.738306 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:39:03.738259 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8628c438_3340_49c9_bbba_e74cdc0b5926.slice/crio-fc2639df54c248e2a71af8c7b4c58033a8c6e9988490403ac665db0067561fa9 WatchSource:0}: Error finding container fc2639df54c248e2a71af8c7b4c58033a8c6e9988490403ac665db0067561fa9: Status 404 returned error can't find the container with id fc2639df54c248e2a71af8c7b4c58033a8c6e9988490403ac665db0067561fa9 Apr 22 18:39:03.748365 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:03.747936 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-srk76"] Apr 22 18:39:03.752045 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:39:03.752018 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod700dd0a8_270f_4acc_bbc0_dc7d43241d5f.slice/crio-8dbd4e51fcd065396854c31ab425dfb79a16aeecb94229a9dc2e8b0731c6f770 WatchSource:0}: Error finding container 8dbd4e51fcd065396854c31ab425dfb79a16aeecb94229a9dc2e8b0731c6f770: Status 404 returned error can't find the container with id 8dbd4e51fcd065396854c31ab425dfb79a16aeecb94229a9dc2e8b0731c6f770 Apr 22 18:39:04.164163 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:04.164112 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-srk76" event={"ID":"700dd0a8-270f-4acc-bbc0-dc7d43241d5f","Type":"ContainerStarted","Data":"8dbd4e51fcd065396854c31ab425dfb79a16aeecb94229a9dc2e8b0731c6f770"} Apr 22 18:39:04.165086 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:04.165065 2572 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-45l64" event={"ID":"8628c438-3340-49c9-bbba-e74cdc0b5926","Type":"ContainerStarted","Data":"fc2639df54c248e2a71af8c7b4c58033a8c6e9988490403ac665db0067561fa9"} Apr 22 18:39:04.166252 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:04.166233 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-689895f57b-dkjvc" event={"ID":"51c59af5-e091-4da9-bd46-b4c3f52ba283","Type":"ContainerStarted","Data":"a7ddc3c19667ffbe7bfad7fa1abcc9a3e6511a710b60cdfab748366851afa3f2"} Apr 22 18:39:04.166343 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:04.166259 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-689895f57b-dkjvc" event={"ID":"51c59af5-e091-4da9-bd46-b4c3f52ba283","Type":"ContainerStarted","Data":"baecf60fb0c25c8bf63a2731a719c9e4808c9d1ef46a623959fef2787c60ae98"} Apr 22 18:39:04.166392 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:04.166376 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-689895f57b-dkjvc" Apr 22 18:39:04.187904 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:04.187866 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-689895f57b-dkjvc" podStartSLOduration=98.18785586 podStartE2EDuration="1m38.18785586s" podCreationTimestamp="2026-04-22 18:37:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:39:04.187304674 +0000 UTC m=+98.791458678" watchObservedRunningTime="2026-04-22 18:39:04.18785586 +0000 UTC m=+98.792009891" Apr 22 18:39:07.176997 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:07.176953 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-45l64" 
event={"ID":"8628c438-3340-49c9-bbba-e74cdc0b5926","Type":"ContainerStarted","Data":"7eeebc452f8db4288310c004f338ee8efbe2ce3aba1ecd7e31e5a5200dceea5b"} Apr 22 18:39:07.176997 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:07.176995 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-45l64" event={"ID":"8628c438-3340-49c9-bbba-e74cdc0b5926","Type":"ContainerStarted","Data":"ab8e5e9b97a75afa63b35d604cff9ecf634b733ecebdbb8b42c096a48f9129c9"} Apr 22 18:39:07.177549 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:07.177143 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-45l64" Apr 22 18:39:07.178230 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:07.178213 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-srk76" event={"ID":"700dd0a8-270f-4acc-bbc0-dc7d43241d5f","Type":"ContainerStarted","Data":"e612fed926ec823c3ccfb7a3e74f75f166e68c8e065d7bcca89fe8748bead723"} Apr 22 18:39:07.198337 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:07.198293 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-45l64" podStartSLOduration=65.822780853 podStartE2EDuration="1m8.198281492s" podCreationTimestamp="2026-04-22 18:37:59 +0000 UTC" firstStartedPulling="2026-04-22 18:39:03.740230313 +0000 UTC m=+98.344384297" lastFinishedPulling="2026-04-22 18:39:06.115730939 +0000 UTC m=+100.719884936" observedRunningTime="2026-04-22 18:39:07.197427204 +0000 UTC m=+101.801581216" watchObservedRunningTime="2026-04-22 18:39:07.198281492 +0000 UTC m=+101.802435495" Apr 22 18:39:07.242418 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:07.242375 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-srk76" podStartSLOduration=65.881740382 podStartE2EDuration="1m8.242360435s" podCreationTimestamp="2026-04-22 18:37:59 +0000 UTC" 
firstStartedPulling="2026-04-22 18:39:03.754016524 +0000 UTC m=+98.358170519" lastFinishedPulling="2026-04-22 18:39:06.114636569 +0000 UTC m=+100.718790572" observedRunningTime="2026-04-22 18:39:07.24170981 +0000 UTC m=+101.845863836" watchObservedRunningTime="2026-04-22 18:39:07.242360435 +0000 UTC m=+101.846514440" Apr 22 18:39:08.103460 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:08.103436 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-4fnlv" Apr 22 18:39:08.864540 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:08.864512 2572 scope.go:117] "RemoveContainer" containerID="835af04bcd245003c9a7c9d98a4470ac3d59a55c3f70831a80c4626329080af3" Apr 22 18:39:09.184402 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:09.184319 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6f6sj_2508d1bd-b2cc-4edf-8bd2-34590e71e3e0/console-operator/1.log" Apr 22 18:39:09.184402 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:09.184373 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" event={"ID":"2508d1bd-b2cc-4edf-8bd2-34590e71e3e0","Type":"ContainerStarted","Data":"a1f899bf64ff5a5eceaba30fa75448e4d62546148cd4c9fef298a99d889cd346"} Apr 22 18:39:09.184690 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:09.184665 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" Apr 22 18:39:09.201876 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:09.201840 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj" podStartSLOduration=19.444779667 podStartE2EDuration="21.201827987s" podCreationTimestamp="2026-04-22 18:38:48 +0000 UTC" firstStartedPulling="2026-04-22 18:38:48.884512425 +0000 UTC m=+83.488666407" 
lastFinishedPulling="2026-04-22 18:38:50.641560745 +0000 UTC m=+85.245714727" observedRunningTime="2026-04-22 18:39:09.201670626 +0000 UTC m=+103.805824645" watchObservedRunningTime="2026-04-22 18:39:09.201827987 +0000 UTC m=+103.805982030"
Apr 22 18:39:09.505291 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:09.505211 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-6f6sj"
Apr 22 18:39:16.574377 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.574342 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-qn2jw"]
Apr 22 18:39:16.577386 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.577367 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qn2jw"
Apr 22 18:39:16.582987 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.582956 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 18:39:16.583465 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.583396 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 18:39:16.584017 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.583994 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-x7wl2\""
Apr 22 18:39:16.584160 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.584020 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 18:39:16.584160 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.584038 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 18:39:16.597004 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.596975 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qn2jw"]
Apr 22 18:39:16.712321 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.712284 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/355b46cc-3542-4cb1-a685-7b6485a4c874-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qn2jw\" (UID: \"355b46cc-3542-4cb1-a685-7b6485a4c874\") " pod="openshift-insights/insights-runtime-extractor-qn2jw"
Apr 22 18:39:16.712321 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.712324 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/355b46cc-3542-4cb1-a685-7b6485a4c874-data-volume\") pod \"insights-runtime-extractor-qn2jw\" (UID: \"355b46cc-3542-4cb1-a685-7b6485a4c874\") " pod="openshift-insights/insights-runtime-extractor-qn2jw"
Apr 22 18:39:16.712538 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.712354 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/355b46cc-3542-4cb1-a685-7b6485a4c874-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qn2jw\" (UID: \"355b46cc-3542-4cb1-a685-7b6485a4c874\") " pod="openshift-insights/insights-runtime-extractor-qn2jw"
Apr 22 18:39:16.712538 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.712455 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/355b46cc-3542-4cb1-a685-7b6485a4c874-crio-socket\") pod \"insights-runtime-extractor-qn2jw\" (UID: \"355b46cc-3542-4cb1-a685-7b6485a4c874\") " pod="openshift-insights/insights-runtime-extractor-qn2jw"
Apr 22 18:39:16.712538 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.712482 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z72r7\" (UniqueName: \"kubernetes.io/projected/355b46cc-3542-4cb1-a685-7b6485a4c874-kube-api-access-z72r7\") pod \"insights-runtime-extractor-qn2jw\" (UID: \"355b46cc-3542-4cb1-a685-7b6485a4c874\") " pod="openshift-insights/insights-runtime-extractor-qn2jw"
Apr 22 18:39:16.813091 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.813063 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/355b46cc-3542-4cb1-a685-7b6485a4c874-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qn2jw\" (UID: \"355b46cc-3542-4cb1-a685-7b6485a4c874\") " pod="openshift-insights/insights-runtime-extractor-qn2jw"
Apr 22 18:39:16.813091 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.813093 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/355b46cc-3542-4cb1-a685-7b6485a4c874-data-volume\") pod \"insights-runtime-extractor-qn2jw\" (UID: \"355b46cc-3542-4cb1-a685-7b6485a4c874\") " pod="openshift-insights/insights-runtime-extractor-qn2jw"
Apr 22 18:39:16.813325 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.813114 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/355b46cc-3542-4cb1-a685-7b6485a4c874-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qn2jw\" (UID: \"355b46cc-3542-4cb1-a685-7b6485a4c874\") " pod="openshift-insights/insights-runtime-extractor-qn2jw"
Apr 22 18:39:16.813325 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.813196 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/355b46cc-3542-4cb1-a685-7b6485a4c874-crio-socket\") pod \"insights-runtime-extractor-qn2jw\" (UID: \"355b46cc-3542-4cb1-a685-7b6485a4c874\") " pod="openshift-insights/insights-runtime-extractor-qn2jw"
Apr 22 18:39:16.813325 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.813236 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z72r7\" (UniqueName: \"kubernetes.io/projected/355b46cc-3542-4cb1-a685-7b6485a4c874-kube-api-access-z72r7\") pod \"insights-runtime-extractor-qn2jw\" (UID: \"355b46cc-3542-4cb1-a685-7b6485a4c874\") " pod="openshift-insights/insights-runtime-extractor-qn2jw"
Apr 22 18:39:16.813481 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.813332 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/355b46cc-3542-4cb1-a685-7b6485a4c874-crio-socket\") pod \"insights-runtime-extractor-qn2jw\" (UID: \"355b46cc-3542-4cb1-a685-7b6485a4c874\") " pod="openshift-insights/insights-runtime-extractor-qn2jw"
Apr 22 18:39:16.813531 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.813484 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/355b46cc-3542-4cb1-a685-7b6485a4c874-data-volume\") pod \"insights-runtime-extractor-qn2jw\" (UID: \"355b46cc-3542-4cb1-a685-7b6485a4c874\") " pod="openshift-insights/insights-runtime-extractor-qn2jw"
Apr 22 18:39:16.813692 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.813667 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/355b46cc-3542-4cb1-a685-7b6485a4c874-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qn2jw\" (UID: \"355b46cc-3542-4cb1-a685-7b6485a4c874\") " pod="openshift-insights/insights-runtime-extractor-qn2jw"
Apr 22 18:39:16.815608 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.815593 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/355b46cc-3542-4cb1-a685-7b6485a4c874-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qn2jw\" (UID: \"355b46cc-3542-4cb1-a685-7b6485a4c874\") " pod="openshift-insights/insights-runtime-extractor-qn2jw"
Apr 22 18:39:16.824086 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.824061 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z72r7\" (UniqueName: \"kubernetes.io/projected/355b46cc-3542-4cb1-a685-7b6485a4c874-kube-api-access-z72r7\") pod \"insights-runtime-extractor-qn2jw\" (UID: \"355b46cc-3542-4cb1-a685-7b6485a4c874\") " pod="openshift-insights/insights-runtime-extractor-qn2jw"
Apr 22 18:39:16.886022 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:16.885997 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qn2jw"
Apr 22 18:39:17.016216 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:17.016194 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qn2jw"]
Apr 22 18:39:17.017741 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:39:17.017709 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod355b46cc_3542_4cb1_a685_7b6485a4c874.slice/crio-f429bf71e42d6181a61067a3a03ba91681699cbbcd2ec599f6370908ec2a3c38 WatchSource:0}: Error finding container f429bf71e42d6181a61067a3a03ba91681699cbbcd2ec599f6370908ec2a3c38: Status 404 returned error can't find the container with id f429bf71e42d6181a61067a3a03ba91681699cbbcd2ec599f6370908ec2a3c38
Apr 22 18:39:17.183117 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:17.183058 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-45l64"
Apr 22 18:39:17.208535 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:17.208506 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qn2jw" event={"ID":"355b46cc-3542-4cb1-a685-7b6485a4c874","Type":"ContainerStarted","Data":"f7ef04df7c3f12620fe47bd87893a7a8c8a62e146a7504eef044959a0c0d873c"}
Apr 22 18:39:17.208535 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:17.208532 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qn2jw" event={"ID":"355b46cc-3542-4cb1-a685-7b6485a4c874","Type":"ContainerStarted","Data":"f429bf71e42d6181a61067a3a03ba91681699cbbcd2ec599f6370908ec2a3c38"}
Apr 22 18:39:18.212766 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:18.212736 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qn2jw" event={"ID":"355b46cc-3542-4cb1-a685-7b6485a4c874","Type":"ContainerStarted","Data":"51960034d860e3a5fe2fd1d00762949e661833c8e6a2df499c7e3e7640c5abe4"}
Apr 22 18:39:20.219865 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:20.219823 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qn2jw" event={"ID":"355b46cc-3542-4cb1-a685-7b6485a4c874","Type":"ContainerStarted","Data":"24f2c82444e31bf28fd30738312b9b50ad92bfd2b5c5cfc26f117463a693aab6"}
Apr 22 18:39:20.240366 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:20.240321 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-qn2jw" podStartSLOduration=2.08023497 podStartE2EDuration="4.240307596s" podCreationTimestamp="2026-04-22 18:39:16 +0000 UTC" firstStartedPulling="2026-04-22 18:39:17.068502157 +0000 UTC m=+111.672656139" lastFinishedPulling="2026-04-22 18:39:19.228574766 +0000 UTC m=+113.832728765" observedRunningTime="2026-04-22 18:39:20.238860331 +0000 UTC m=+114.843014365" watchObservedRunningTime="2026-04-22 18:39:20.240307596 +0000 UTC m=+114.844461600"
Apr 22 18:39:22.689565 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:22.689515 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ghg5j"]
Apr 22 18:39:22.729854 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:22.729822 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ghg5j"]
Apr 22 18:39:22.729980 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:22.729946 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ghg5j"
Apr 22 18:39:22.732453 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:22.732433 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 22 18:39:22.732578 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:22.732442 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-g95hg\""
Apr 22 18:39:22.750374 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:22.750347 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/13f6631c-c5a4-46fe-bb58-94d757f98e9a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-ghg5j\" (UID: \"13f6631c-c5a4-46fe-bb58-94d757f98e9a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ghg5j"
Apr 22 18:39:22.850696 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:22.850676 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/13f6631c-c5a4-46fe-bb58-94d757f98e9a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-ghg5j\" (UID: \"13f6631c-c5a4-46fe-bb58-94d757f98e9a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ghg5j"
Apr 22 18:39:22.850787 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:39:22.850774 2572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 22 18:39:22.850833 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:39:22.850825 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13f6631c-c5a4-46fe-bb58-94d757f98e9a-tls-certificates podName:13f6631c-c5a4-46fe-bb58-94d757f98e9a nodeName:}" failed. No retries permitted until 2026-04-22 18:39:23.350810639 +0000 UTC m=+117.954964624 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/13f6631c-c5a4-46fe-bb58-94d757f98e9a-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-ghg5j" (UID: "13f6631c-c5a4-46fe-bb58-94d757f98e9a") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 22 18:39:23.271942 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:23.271905 2572 patch_prober.go:28] interesting pod/image-registry-689895f57b-dkjvc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 18:39:23.272107 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:23.271962 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-689895f57b-dkjvc" podUID="51c59af5-e091-4da9-bd46-b4c3f52ba283" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:39:23.353486 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:23.353459 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/13f6631c-c5a4-46fe-bb58-94d757f98e9a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-ghg5j\" (UID: \"13f6631c-c5a4-46fe-bb58-94d757f98e9a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ghg5j"
Apr 22 18:39:23.355998 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:23.355976 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/13f6631c-c5a4-46fe-bb58-94d757f98e9a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-ghg5j\" (UID: \"13f6631c-c5a4-46fe-bb58-94d757f98e9a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ghg5j"
Apr 22 18:39:23.638501 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:23.638469 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ghg5j"
Apr 22 18:39:23.755062 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:23.755035 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ghg5j"]
Apr 22 18:39:23.758273 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:39:23.758247 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13f6631c_c5a4_46fe_bb58_94d757f98e9a.slice/crio-2015bfc0309aafff639e6844c0d582ba7d3c320e532679bd9948a1f52eabc65c WatchSource:0}: Error finding container 2015bfc0309aafff639e6844c0d582ba7d3c320e532679bd9948a1f52eabc65c: Status 404 returned error can't find the container with id 2015bfc0309aafff639e6844c0d582ba7d3c320e532679bd9948a1f52eabc65c
Apr 22 18:39:24.234466 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:24.234434 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ghg5j" event={"ID":"13f6631c-c5a4-46fe-bb58-94d757f98e9a","Type":"ContainerStarted","Data":"2015bfc0309aafff639e6844c0d582ba7d3c320e532679bd9948a1f52eabc65c"}
Apr 22 18:39:25.173916 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.173888 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-689895f57b-dkjvc"
Apr 22 18:39:25.238464 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.238437 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ghg5j" event={"ID":"13f6631c-c5a4-46fe-bb58-94d757f98e9a","Type":"ContainerStarted","Data":"b93f6b6f6e85e116a6890b4d5bde944918e7d2cea1963844bcdc13c1ab318efb"}
Apr 22 18:39:25.238668 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.238653 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ghg5j"
Apr 22 18:39:25.243867 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.243835 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ghg5j"
Apr 22 18:39:25.253360 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.253325 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ghg5j" podStartSLOduration=1.9683760719999999 podStartE2EDuration="3.253312653s" podCreationTimestamp="2026-04-22 18:39:22 +0000 UTC" firstStartedPulling="2026-04-22 18:39:23.760648329 +0000 UTC m=+118.364802316" lastFinishedPulling="2026-04-22 18:39:25.045584916 +0000 UTC m=+119.649738897" observedRunningTime="2026-04-22 18:39:25.252494076 +0000 UTC m=+119.856648094" watchObservedRunningTime="2026-04-22 18:39:25.253312653 +0000 UTC m=+119.857466657"
Apr 22 18:39:25.756972 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.756944 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-bz2fw"]
Apr 22 18:39:25.760477 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.760446 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-bz2fw"
Apr 22 18:39:25.762646 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.762607 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 22 18:39:25.762772 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.762692 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 18:39:25.763750 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.763734 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 22 18:39:25.763841 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.763795 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 18:39:25.763841 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.763834 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 18:39:25.763931 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.763840 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-gjzd9\""
Apr 22 18:39:25.767993 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.767797 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-bz2fw"]
Apr 22 18:39:25.767993 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.767910 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/950d0214-25c1-486e-ad90-4c388b4430d2-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-bz2fw\" (UID: \"950d0214-25c1-486e-ad90-4c388b4430d2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bz2fw"
Apr 22 18:39:25.767993 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.767934 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/950d0214-25c1-486e-ad90-4c388b4430d2-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-bz2fw\" (UID: \"950d0214-25c1-486e-ad90-4c388b4430d2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bz2fw"
Apr 22 18:39:25.767993 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.767972 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvtdc\" (UniqueName: \"kubernetes.io/projected/950d0214-25c1-486e-ad90-4c388b4430d2-kube-api-access-fvtdc\") pod \"prometheus-operator-5676c8c784-bz2fw\" (UID: \"950d0214-25c1-486e-ad90-4c388b4430d2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bz2fw"
Apr 22 18:39:25.768190 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.768014 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/950d0214-25c1-486e-ad90-4c388b4430d2-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-bz2fw\" (UID: \"950d0214-25c1-486e-ad90-4c388b4430d2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bz2fw"
Apr 22 18:39:25.868878 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.868853 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/950d0214-25c1-486e-ad90-4c388b4430d2-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-bz2fw\" (UID: \"950d0214-25c1-486e-ad90-4c388b4430d2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bz2fw"
Apr 22 18:39:25.869013 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.868939 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/950d0214-25c1-486e-ad90-4c388b4430d2-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-bz2fw\" (UID: \"950d0214-25c1-486e-ad90-4c388b4430d2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bz2fw"
Apr 22 18:39:25.869013 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.868969 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/950d0214-25c1-486e-ad90-4c388b4430d2-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-bz2fw\" (UID: \"950d0214-25c1-486e-ad90-4c388b4430d2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bz2fw"
Apr 22 18:39:25.869013 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.868999 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvtdc\" (UniqueName: \"kubernetes.io/projected/950d0214-25c1-486e-ad90-4c388b4430d2-kube-api-access-fvtdc\") pod \"prometheus-operator-5676c8c784-bz2fw\" (UID: \"950d0214-25c1-486e-ad90-4c388b4430d2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bz2fw"
Apr 22 18:39:25.869204 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:39:25.869085 2572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 22 18:39:25.869204 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:39:25.869178 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/950d0214-25c1-486e-ad90-4c388b4430d2-prometheus-operator-tls podName:950d0214-25c1-486e-ad90-4c388b4430d2 nodeName:}" failed. No retries permitted until 2026-04-22 18:39:26.369159088 +0000 UTC m=+120.973313070 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/950d0214-25c1-486e-ad90-4c388b4430d2-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-bz2fw" (UID: "950d0214-25c1-486e-ad90-4c388b4430d2") : secret "prometheus-operator-tls" not found
Apr 22 18:39:25.869729 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.869708 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/950d0214-25c1-486e-ad90-4c388b4430d2-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-bz2fw\" (UID: \"950d0214-25c1-486e-ad90-4c388b4430d2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bz2fw"
Apr 22 18:39:25.871098 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.871079 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/950d0214-25c1-486e-ad90-4c388b4430d2-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-bz2fw\" (UID: \"950d0214-25c1-486e-ad90-4c388b4430d2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bz2fw"
Apr 22 18:39:25.878299 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:25.878270 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvtdc\" (UniqueName: \"kubernetes.io/projected/950d0214-25c1-486e-ad90-4c388b4430d2-kube-api-access-fvtdc\") pod \"prometheus-operator-5676c8c784-bz2fw\" (UID: \"950d0214-25c1-486e-ad90-4c388b4430d2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bz2fw"
Apr 22 18:39:26.372433 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:26.372388 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/950d0214-25c1-486e-ad90-4c388b4430d2-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-bz2fw\" (UID: \"950d0214-25c1-486e-ad90-4c388b4430d2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bz2fw"
Apr 22 18:39:26.374912 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:26.374886 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/950d0214-25c1-486e-ad90-4c388b4430d2-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-bz2fw\" (UID: \"950d0214-25c1-486e-ad90-4c388b4430d2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bz2fw"
Apr 22 18:39:26.670900 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:26.670823 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-bz2fw"
Apr 22 18:39:26.790747 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:26.790720 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-bz2fw"]
Apr 22 18:39:26.795899 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:39:26.795866 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod950d0214_25c1_486e_ad90_4c388b4430d2.slice/crio-9fbded4d1efeff46c6fb556f935243ef1e07b5203092245a4efdcc394f9f96fe WatchSource:0}: Error finding container 9fbded4d1efeff46c6fb556f935243ef1e07b5203092245a4efdcc394f9f96fe: Status 404 returned error can't find the container with id 9fbded4d1efeff46c6fb556f935243ef1e07b5203092245a4efdcc394f9f96fe
Apr 22 18:39:27.244777 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:27.244733 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-bz2fw" event={"ID":"950d0214-25c1-486e-ad90-4c388b4430d2","Type":"ContainerStarted","Data":"9fbded4d1efeff46c6fb556f935243ef1e07b5203092245a4efdcc394f9f96fe"}
Apr 22 18:39:28.248940 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:28.248861 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-bz2fw" event={"ID":"950d0214-25c1-486e-ad90-4c388b4430d2","Type":"ContainerStarted","Data":"f4c47927095405cd4418f70874bfcf6d745f1326bf3111a431fda25af65b64a9"}
Apr 22 18:39:28.248940 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:28.248899 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-bz2fw" event={"ID":"950d0214-25c1-486e-ad90-4c388b4430d2","Type":"ContainerStarted","Data":"33cb24ead1024a91cac196ea79a82b1ec2ccf784e57f7d285b698d7b30c9ead5"}
Apr 22 18:39:28.266032 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:28.265992 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-bz2fw" podStartSLOduration=2.1885688070000002 podStartE2EDuration="3.265979341s" podCreationTimestamp="2026-04-22 18:39:25 +0000 UTC" firstStartedPulling="2026-04-22 18:39:26.79809762 +0000 UTC m=+121.402251602" lastFinishedPulling="2026-04-22 18:39:27.875508152 +0000 UTC m=+122.479662136" observedRunningTime="2026-04-22 18:39:28.264690022 +0000 UTC m=+122.868844036" watchObservedRunningTime="2026-04-22 18:39:28.265979341 +0000 UTC m=+122.870133344"
Apr 22 18:39:30.150728 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.150693 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-c9jfr"]
Apr 22 18:39:30.157424 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.157401 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.160757 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.160738 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mt2n7\""
Apr 22 18:39:30.160909 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.160761 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 18:39:30.160986 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.160835 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 18:39:30.160986 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.160838 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 18:39:30.200241 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.200211 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0088099a-293c-492a-a62d-667aec16a6a4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.200339 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.200256 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0088099a-293c-492a-a62d-667aec16a6a4-sys\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.200339 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.200285 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0088099a-293c-492a-a62d-667aec16a6a4-root\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.200455 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.200360 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0088099a-293c-492a-a62d-667aec16a6a4-node-exporter-accelerators-collector-config\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.200455 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.200400 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0088099a-293c-492a-a62d-667aec16a6a4-node-exporter-textfile\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.200455 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.200426 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0088099a-293c-492a-a62d-667aec16a6a4-node-exporter-tls\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.200455 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.200448 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0088099a-293c-492a-a62d-667aec16a6a4-node-exporter-wtmp\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.200652 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.200563 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0088099a-293c-492a-a62d-667aec16a6a4-metrics-client-ca\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.200652 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.200629 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86sw4\" (UniqueName: \"kubernetes.io/projected/0088099a-293c-492a-a62d-667aec16a6a4-kube-api-access-86sw4\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.301072 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.301048 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0088099a-293c-492a-a62d-667aec16a6a4-sys\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.301191 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.301077 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0088099a-293c-492a-a62d-667aec16a6a4-root\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.301191 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.301095 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0088099a-293c-492a-a62d-667aec16a6a4-node-exporter-accelerators-collector-config\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.301191 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.301114 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0088099a-293c-492a-a62d-667aec16a6a4-node-exporter-textfile\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.301191 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.301182 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0088099a-293c-492a-a62d-667aec16a6a4-sys\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.301456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.301218 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0088099a-293c-492a-a62d-667aec16a6a4-node-exporter-tls\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.301456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.301242 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0088099a-293c-492a-a62d-667aec16a6a4-node-exporter-wtmp\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.301456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.301246 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0088099a-293c-492a-a62d-667aec16a6a4-root\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.301456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.301299 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0088099a-293c-492a-a62d-667aec16a6a4-metrics-client-ca\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.301456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.301344 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86sw4\" (UniqueName: \"kubernetes.io/projected/0088099a-293c-492a-a62d-667aec16a6a4-kube-api-access-86sw4\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.301456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.301369 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0088099a-293c-492a-a62d-667aec16a6a4-node-exporter-wtmp\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.301456 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.301387 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0088099a-293c-492a-a62d-667aec16a6a4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr"
Apr 22 18:39:30.301456 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:39:30.301454 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 18:39:30.301824 ip-10-0-129-26
kubenswrapper[2572]: E0422 18:39:30.301504 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0088099a-293c-492a-a62d-667aec16a6a4-node-exporter-tls podName:0088099a-293c-492a-a62d-667aec16a6a4 nodeName:}" failed. No retries permitted until 2026-04-22 18:39:30.80148896 +0000 UTC m=+125.405642962 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/0088099a-293c-492a-a62d-667aec16a6a4-node-exporter-tls") pod "node-exporter-c9jfr" (UID: "0088099a-293c-492a-a62d-667aec16a6a4") : secret "node-exporter-tls" not found Apr 22 18:39:30.301824 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.301693 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0088099a-293c-492a-a62d-667aec16a6a4-node-exporter-accelerators-collector-config\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr" Apr 22 18:39:30.302153 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.302100 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0088099a-293c-492a-a62d-667aec16a6a4-node-exporter-textfile\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr" Apr 22 18:39:30.302229 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.302165 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0088099a-293c-492a-a62d-667aec16a6a4-metrics-client-ca\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr" Apr 22 18:39:30.304155 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.304113 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0088099a-293c-492a-a62d-667aec16a6a4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr" Apr 22 18:39:30.312014 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.311992 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86sw4\" (UniqueName: \"kubernetes.io/projected/0088099a-293c-492a-a62d-667aec16a6a4-kube-api-access-86sw4\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr" Apr 22 18:39:30.804468 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.804438 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0088099a-293c-492a-a62d-667aec16a6a4-node-exporter-tls\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr" Apr 22 18:39:30.806716 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:30.806690 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0088099a-293c-492a-a62d-667aec16a6a4-node-exporter-tls\") pod \"node-exporter-c9jfr\" (UID: \"0088099a-293c-492a-a62d-667aec16a6a4\") " pod="openshift-monitoring/node-exporter-c9jfr" Apr 22 18:39:31.067759 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:31.067679 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-c9jfr" Apr 22 18:39:31.076944 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:39:31.076916 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0088099a_293c_492a_a62d_667aec16a6a4.slice/crio-3f99902d24ab80f091fa9266afa09ddea5d1f70bda92d23d2347016419ff4b75 WatchSource:0}: Error finding container 3f99902d24ab80f091fa9266afa09ddea5d1f70bda92d23d2347016419ff4b75: Status 404 returned error can't find the container with id 3f99902d24ab80f091fa9266afa09ddea5d1f70bda92d23d2347016419ff4b75 Apr 22 18:39:31.260234 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:31.260193 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c9jfr" event={"ID":"0088099a-293c-492a-a62d-667aec16a6a4","Type":"ContainerStarted","Data":"3f99902d24ab80f091fa9266afa09ddea5d1f70bda92d23d2347016419ff4b75"} Apr 22 18:39:32.264471 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:32.264431 2572 generic.go:358] "Generic (PLEG): container finished" podID="0088099a-293c-492a-a62d-667aec16a6a4" containerID="7f688aa42a48f333ce1dfc602f0d1a45c5784b70cdeaf4a7e4f624615b8774b7" exitCode=0 Apr 22 18:39:32.264837 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:32.264518 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c9jfr" event={"ID":"0088099a-293c-492a-a62d-667aec16a6a4","Type":"ContainerDied","Data":"7f688aa42a48f333ce1dfc602f0d1a45c5784b70cdeaf4a7e4f624615b8774b7"} Apr 22 18:39:33.269545 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:33.269510 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c9jfr" event={"ID":"0088099a-293c-492a-a62d-667aec16a6a4","Type":"ContainerStarted","Data":"68d896d126708da8fb001e399ee0aa901cb3069a4a0f4023454b84e769f52f35"} Apr 22 18:39:33.269545 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:33.269545 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c9jfr" event={"ID":"0088099a-293c-492a-a62d-667aec16a6a4","Type":"ContainerStarted","Data":"76e813ac8118222a60821f17f363058d67e5e6bc3b3d16d78305878443fb9a88"} Apr 22 18:39:33.289871 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:33.289834 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-c9jfr" podStartSLOduration=2.662519659 podStartE2EDuration="3.289821352s" podCreationTimestamp="2026-04-22 18:39:30 +0000 UTC" firstStartedPulling="2026-04-22 18:39:31.079400783 +0000 UTC m=+125.683554771" lastFinishedPulling="2026-04-22 18:39:31.706702478 +0000 UTC m=+126.310856464" observedRunningTime="2026-04-22 18:39:33.288236611 +0000 UTC m=+127.892390615" watchObservedRunningTime="2026-04-22 18:39:33.289821352 +0000 UTC m=+127.893975383" Apr 22 18:39:34.895837 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:34.895809 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-6xkjz"] Apr 22 18:39:34.899105 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:34.899085 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6xkjz" Apr 22 18:39:34.901168 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:34.901147 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 18:39:34.901262 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:34.901217 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-vzj5j\"" Apr 22 18:39:34.908573 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:34.908553 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-6xkjz"] Apr 22 18:39:34.941276 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:34.941208 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e91dd8ff-b33c-47eb-ac7c-336a3bb6f08a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6xkjz\" (UID: \"e91dd8ff-b33c-47eb-ac7c-336a3bb6f08a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6xkjz" Apr 22 18:39:35.042021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:35.041991 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e91dd8ff-b33c-47eb-ac7c-336a3bb6f08a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6xkjz\" (UID: \"e91dd8ff-b33c-47eb-ac7c-336a3bb6f08a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6xkjz" Apr 22 18:39:35.044440 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:35.044410 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e91dd8ff-b33c-47eb-ac7c-336a3bb6f08a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6xkjz\" (UID: \"e91dd8ff-b33c-47eb-ac7c-336a3bb6f08a\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6xkjz" Apr 22 18:39:35.209184 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:35.209094 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6xkjz" Apr 22 18:39:35.323090 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:35.323059 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-6xkjz"] Apr 22 18:39:35.329364 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:39:35.329333 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode91dd8ff_b33c_47eb_ac7c_336a3bb6f08a.slice/crio-42e0b0429ce8c063a6f6e6a2cce62cb06624fbc487777be1aa4b00fac4063ba7 WatchSource:0}: Error finding container 42e0b0429ce8c063a6f6e6a2cce62cb06624fbc487777be1aa4b00fac4063ba7: Status 404 returned error can't find the container with id 42e0b0429ce8c063a6f6e6a2cce62cb06624fbc487777be1aa4b00fac4063ba7 Apr 22 18:39:35.748033 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:35.748003 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs\") pod \"network-metrics-daemon-9cw24\" (UID: \"3445bb02-600f-40d2-95c6-d4616e5630c5\") " pod="openshift-multus/network-metrics-daemon-9cw24" Apr 22 18:39:35.750612 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:35.750589 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3445bb02-600f-40d2-95c6-d4616e5630c5-metrics-certs\") pod \"network-metrics-daemon-9cw24\" (UID: \"3445bb02-600f-40d2-95c6-d4616e5630c5\") " pod="openshift-multus/network-metrics-daemon-9cw24" Apr 22 18:39:35.984439 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:35.984407 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-stzzq\"" Apr 22 18:39:35.992475 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:35.992445 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9cw24" Apr 22 18:39:36.136027 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.135343 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9cw24"] Apr 22 18:39:36.139083 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:39:36.139054 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3445bb02_600f_40d2_95c6_d4616e5630c5.slice/crio-fd107e0c205d55136fd5db06081bc5502d7d71a45bf2ab932b42d1443726c3a8 WatchSource:0}: Error finding container fd107e0c205d55136fd5db06081bc5502d7d71a45bf2ab932b42d1443726c3a8: Status 404 returned error can't find the container with id fd107e0c205d55136fd5db06081bc5502d7d71a45bf2ab932b42d1443726c3a8 Apr 22 18:39:36.280178 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.280067 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9cw24" event={"ID":"3445bb02-600f-40d2-95c6-d4616e5630c5","Type":"ContainerStarted","Data":"fd107e0c205d55136fd5db06081bc5502d7d71a45bf2ab932b42d1443726c3a8"} Apr 22 18:39:36.281210 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.281182 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6xkjz" event={"ID":"e91dd8ff-b33c-47eb-ac7c-336a3bb6f08a","Type":"ContainerStarted","Data":"42e0b0429ce8c063a6f6e6a2cce62cb06624fbc487777be1aa4b00fac4063ba7"} Apr 22 18:39:36.404930 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.404893 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:39:36.410741 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.410712 2572 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:36.412941 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.412913 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 18:39:36.413155 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.413112 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-qcj2x\"" Apr 22 18:39:36.413591 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.413264 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 18:39:36.413684 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.413645 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 18:39:36.413868 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.413749 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 18:39:36.413868 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.413840 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 18:39:36.414165 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.414147 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 18:39:36.414458 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.414435 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 18:39:36.414559 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.414537 2572 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 18:39:36.414714 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.414693 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 18:39:36.414911 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.414895 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-7im6e1mtu70a9\"" Apr 22 18:39:36.415192 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.415174 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 18:39:36.415349 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.414895 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 18:39:36.417707 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.417643 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 18:39:36.420341 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.420321 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 18:39:36.423201 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.423177 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:39:36.454088 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.454044 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:36.454088 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.454092 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:36.454334 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.454150 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f77d63bf-773a-4489-9ecc-27090d27f818-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:36.454334 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.454169 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f77d63bf-773a-4489-9ecc-27090d27f818-config-out\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:36.454334 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.454249 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:36.454334 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.454293 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:36.454334 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.454324 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:36.454588 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.454350 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:36.454588 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.454408 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:36.454588 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.454458 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrml7\" (UniqueName: \"kubernetes.io/projected/f77d63bf-773a-4489-9ecc-27090d27f818-kube-api-access-qrml7\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:36.454588 
ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.454548 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-config\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:36.454793 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.454616 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:36.454793 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.454654 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:36.454793 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.454720 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:36.454793 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.454753 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-web-config\") pod \"prometheus-k8s-0\" (UID: 
\"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:36.454975 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.454781 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:36.454975 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.454830 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f77d63bf-773a-4489-9ecc-27090d27f818-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:36.454975 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.454882 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:36.555891 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.555822 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f77d63bf-773a-4489-9ecc-27090d27f818-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:36.555891 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.555855 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/f77d63bf-773a-4489-9ecc-27090d27f818-config-out\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.555891 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.555879 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.556176 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.555901 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.556176 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.555927 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.556176 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.555951 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.556176 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.555976 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.556176 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.555999 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrml7\" (UniqueName: \"kubernetes.io/projected/f77d63bf-773a-4489-9ecc-27090d27f818-kube-api-access-qrml7\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.556176 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.556027 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-config\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.556176 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.556072 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.556176 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.556092 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.556176 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.556154 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.556757 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.556186 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-web-config\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.556757 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.556217 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.556757 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.556243 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f77d63bf-773a-4489-9ecc-27090d27f818-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.556757 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.556297 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.556757 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.556327 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.556757 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.556353 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.557545 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.557515 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f77d63bf-773a-4489-9ecc-27090d27f818-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.559938 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.559915 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.560292 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.560272 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.560806 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.560782 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.561332 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.561073 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.562692 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.562667 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.563101 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.563080 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.564296 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.564255 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.566252 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.566228 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.566512 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.566488 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.566590 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.566558 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f77d63bf-773a-4489-9ecc-27090d27f818-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.566780 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.566762 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f77d63bf-773a-4489-9ecc-27090d27f818-config-out\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.567032 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.567012 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-web-config\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.567096 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.567020 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-config\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.567263 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.567241 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.567977 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.567953 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.568587 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.568539 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.570160 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.570140 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrml7\" (UniqueName: \"kubernetes.io/projected/f77d63bf-773a-4489-9ecc-27090d27f818-kube-api-access-qrml7\") pod \"prometheus-k8s-0\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.732741 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.732715 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:39:36.901786 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:36.901733 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 18:39:36.911234 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:39:36.911203 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf77d63bf_773a_4489_9ecc_27090d27f818.slice/crio-9fc5994d434b6e83b536af6408f59ed6e791aef5be5f17ffe9c52353c7de7be5 WatchSource:0}: Error finding container 9fc5994d434b6e83b536af6408f59ed6e791aef5be5f17ffe9c52353c7de7be5: Status 404 returned error can't find the container with id 9fc5994d434b6e83b536af6408f59ed6e791aef5be5f17ffe9c52353c7de7be5
Apr 22 18:39:37.287300 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:37.287258 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6xkjz" event={"ID":"e91dd8ff-b33c-47eb-ac7c-336a3bb6f08a","Type":"ContainerStarted","Data":"f6d5e28684913e46ff96011d69b9131d6d1d52f3e46bc86ad1b48a8f7f722a57"}
Apr 22 18:39:37.287753 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:37.287698 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6xkjz"
Apr 22 18:39:37.288656 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:37.288625 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f77d63bf-773a-4489-9ecc-27090d27f818","Type":"ContainerStarted","Data":"9fc5994d434b6e83b536af6408f59ed6e791aef5be5f17ffe9c52353c7de7be5"}
Apr 22 18:39:37.294233 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:37.294212 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6xkjz"
Apr 22 18:39:37.303669 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:37.303618 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6xkjz" podStartSLOduration=1.913062839 podStartE2EDuration="3.303605825s" podCreationTimestamp="2026-04-22 18:39:34 +0000 UTC" firstStartedPulling="2026-04-22 18:39:35.331261821 +0000 UTC m=+129.935415803" lastFinishedPulling="2026-04-22 18:39:36.721804805 +0000 UTC m=+131.325958789" observedRunningTime="2026-04-22 18:39:37.303576792 +0000 UTC m=+131.907730797" watchObservedRunningTime="2026-04-22 18:39:37.303605825 +0000 UTC m=+131.907759830"
Apr 22 18:39:38.292337 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:38.292308 2572 generic.go:358] "Generic (PLEG): container finished" podID="f77d63bf-773a-4489-9ecc-27090d27f818" containerID="5860eaedd864c387e4c24bf411199aa018fb48e16d2051c23a70264cb3c4bfe8" exitCode=0
Apr 22 18:39:38.292663 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:38.292387 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f77d63bf-773a-4489-9ecc-27090d27f818","Type":"ContainerDied","Data":"5860eaedd864c387e4c24bf411199aa018fb48e16d2051c23a70264cb3c4bfe8"}
Apr 22 18:39:38.293993 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:38.293888 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9cw24" event={"ID":"3445bb02-600f-40d2-95c6-d4616e5630c5","Type":"ContainerStarted","Data":"c47dad27d349215e554a7b84f68bd3a51a9491b766f754ed6256fb7116347f27"}
Apr 22 18:39:38.293993 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:38.293911 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9cw24" event={"ID":"3445bb02-600f-40d2-95c6-d4616e5630c5","Type":"ContainerStarted","Data":"92a8bab9b163a941a7f517de0478de5636e46186f3d5f7611ba9456c052766d9"}
Apr 22 18:39:38.339711 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:38.339667 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9cw24" podStartSLOduration=130.97297372 podStartE2EDuration="2m12.339653444s" podCreationTimestamp="2026-04-22 18:37:26 +0000 UTC" firstStartedPulling="2026-04-22 18:39:36.141137246 +0000 UTC m=+130.745291244" lastFinishedPulling="2026-04-22 18:39:37.507816982 +0000 UTC m=+132.111970968" observedRunningTime="2026-04-22 18:39:38.338063436 +0000 UTC m=+132.942217440" watchObservedRunningTime="2026-04-22 18:39:38.339653444 +0000 UTC m=+132.943807448"
Apr 22 18:39:38.344549 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:38.344527 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-689895f57b-dkjvc"]
Apr 22 18:39:41.309329 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:41.309250 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f77d63bf-773a-4489-9ecc-27090d27f818","Type":"ContainerStarted","Data":"24f601fc3203f02adab6480e74162490508984b12b17ec524e5520b903d4389b"}
Apr 22 18:39:41.309329 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:41.309296 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f77d63bf-773a-4489-9ecc-27090d27f818","Type":"ContainerStarted","Data":"a7a0a1ddab059ab72d39bc8533321b399a7954369d98eb8f6633bd157bf46800"}
Apr 22 18:39:43.318729 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:43.318703 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f77d63bf-773a-4489-9ecc-27090d27f818","Type":"ContainerStarted","Data":"f6c55f3654455343a65079eebdba5b6483d7d2866465d0cefb3f4dbed6136218"}
Apr 22 18:39:43.319047 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:43.318739 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f77d63bf-773a-4489-9ecc-27090d27f818","Type":"ContainerStarted","Data":"e0dba9e529729a6c225cf8f4246b736198814810eab2465002c4d612e9e9e49c"}
Apr 22 18:39:43.319047 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:43.318754 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f77d63bf-773a-4489-9ecc-27090d27f818","Type":"ContainerStarted","Data":"cf5955d480f9a31485cc1377851248fcdb8b21ee4675db89b2163d7abcd22ef2"}
Apr 22 18:39:44.326037 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:44.325991 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f77d63bf-773a-4489-9ecc-27090d27f818","Type":"ContainerStarted","Data":"18c0df7d0fd52cc22e53ba9ce3f302110421a52adfeb3966473b3bcb8d9d02c8"}
Apr 22 18:39:44.357741 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:44.357679 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.135689471 podStartE2EDuration="8.357663149s" podCreationTimestamp="2026-04-22 18:39:36 +0000 UTC" firstStartedPulling="2026-04-22 18:39:36.914014772 +0000 UTC m=+131.518168755" lastFinishedPulling="2026-04-22 18:39:43.135988447 +0000 UTC m=+137.740142433" observedRunningTime="2026-04-22 18:39:44.355793028 +0000 UTC m=+138.959947032" watchObservedRunningTime="2026-04-22 18:39:44.357663149 +0000 UTC m=+138.961817152"
Apr 22 18:39:46.733829 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:39:46.733799 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:03.363378 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.363312 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-689895f57b-dkjvc" podUID="51c59af5-e091-4da9-bd46-b4c3f52ba283" containerName="registry" containerID="cri-o://a7ddc3c19667ffbe7bfad7fa1abcc9a3e6511a710b60cdfab748366851afa3f2" gracePeriod=30
Apr 22 18:40:03.593599 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.593579 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-689895f57b-dkjvc"
Apr 22 18:40:03.769212 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.769143 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls\") pod \"51c59af5-e091-4da9-bd46-b4c3f52ba283\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") "
Apr 22 18:40:03.769212 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.769203 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51c59af5-e091-4da9-bd46-b4c3f52ba283-trusted-ca\") pod \"51c59af5-e091-4da9-bd46-b4c3f52ba283\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") "
Apr 22 18:40:03.769418 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.769359 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgj2t\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-kube-api-access-jgj2t\") pod \"51c59af5-e091-4da9-bd46-b4c3f52ba283\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") "
Apr 22 18:40:03.769418 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.769407 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-bound-sa-token\") pod \"51c59af5-e091-4da9-bd46-b4c3f52ba283\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") "
Apr 22 18:40:03.769532 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.769445 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51c59af5-e091-4da9-bd46-b4c3f52ba283-installation-pull-secrets\") pod \"51c59af5-e091-4da9-bd46-b4c3f52ba283\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") "
Apr 22 18:40:03.769532 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.769462 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/51c59af5-e091-4da9-bd46-b4c3f52ba283-image-registry-private-configuration\") pod \"51c59af5-e091-4da9-bd46-b4c3f52ba283\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") "
Apr 22 18:40:03.769532 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.769504 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51c59af5-e091-4da9-bd46-b4c3f52ba283-ca-trust-extracted\") pod \"51c59af5-e091-4da9-bd46-b4c3f52ba283\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") "
Apr 22 18:40:03.769682 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.769538 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51c59af5-e091-4da9-bd46-b4c3f52ba283-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "51c59af5-e091-4da9-bd46-b4c3f52ba283" (UID: "51c59af5-e091-4da9-bd46-b4c3f52ba283"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:40:03.769682 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.769601 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-certificates\") pod \"51c59af5-e091-4da9-bd46-b4c3f52ba283\" (UID: \"51c59af5-e091-4da9-bd46-b4c3f52ba283\") "
Apr 22 18:40:03.769948 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.769922 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51c59af5-e091-4da9-bd46-b4c3f52ba283-trusted-ca\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\""
Apr 22 18:40:03.770271 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.770073 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "51c59af5-e091-4da9-bd46-b4c3f52ba283" (UID: "51c59af5-e091-4da9-bd46-b4c3f52ba283"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:40:03.772072 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.772035 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51c59af5-e091-4da9-bd46-b4c3f52ba283-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "51c59af5-e091-4da9-bd46-b4c3f52ba283" (UID: "51c59af5-e091-4da9-bd46-b4c3f52ba283"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:40:03.772197 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.772080 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "51c59af5-e091-4da9-bd46-b4c3f52ba283" (UID: "51c59af5-e091-4da9-bd46-b4c3f52ba283"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:40:03.772197 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.772091 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-kube-api-access-jgj2t" (OuterVolumeSpecName: "kube-api-access-jgj2t") pod "51c59af5-e091-4da9-bd46-b4c3f52ba283" (UID: "51c59af5-e091-4da9-bd46-b4c3f52ba283"). InnerVolumeSpecName "kube-api-access-jgj2t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:40:03.772197 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.772169 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "51c59af5-e091-4da9-bd46-b4c3f52ba283" (UID: "51c59af5-e091-4da9-bd46-b4c3f52ba283"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:40:03.772418 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.772393 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51c59af5-e091-4da9-bd46-b4c3f52ba283-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "51c59af5-e091-4da9-bd46-b4c3f52ba283" (UID: "51c59af5-e091-4da9-bd46-b4c3f52ba283"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:40:03.778498 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.778475 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51c59af5-e091-4da9-bd46-b4c3f52ba283-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "51c59af5-e091-4da9-bd46-b4c3f52ba283" (UID: "51c59af5-e091-4da9-bd46-b4c3f52ba283"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:40:03.870563 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.870545 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51c59af5-e091-4da9-bd46-b4c3f52ba283-ca-trust-extracted\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\""
Apr 22 18:40:03.870563 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.870562 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-certificates\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\""
Apr 22 18:40:03.870679 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.870574 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-registry-tls\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\""
Apr 22 18:40:03.870679 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.870584 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jgj2t\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-kube-api-access-jgj2t\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\""
Apr 22 18:40:03.870679 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.870593 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51c59af5-e091-4da9-bd46-b4c3f52ba283-bound-sa-token\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\""
Apr 22 18:40:03.870679 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.870602 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51c59af5-e091-4da9-bd46-b4c3f52ba283-installation-pull-secrets\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\""
Apr 22 18:40:03.870679 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:03.870612 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/51c59af5-e091-4da9-bd46-b4c3f52ba283-image-registry-private-configuration\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\""
Apr 22 18:40:04.380568 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:04.380539 2572 generic.go:358] "Generic (PLEG): container finished" podID="51c59af5-e091-4da9-bd46-b4c3f52ba283" containerID="a7ddc3c19667ffbe7bfad7fa1abcc9a3e6511a710b60cdfab748366851afa3f2" exitCode=0
Apr 22 18:40:04.380928 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:04.380575 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-689895f57b-dkjvc" event={"ID":"51c59af5-e091-4da9-bd46-b4c3f52ba283","Type":"ContainerDied","Data":"a7ddc3c19667ffbe7bfad7fa1abcc9a3e6511a710b60cdfab748366851afa3f2"}
Apr 22 18:40:04.380928 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:04.380596 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-689895f57b-dkjvc" event={"ID":"51c59af5-e091-4da9-bd46-b4c3f52ba283","Type":"ContainerDied","Data":"baecf60fb0c25c8bf63a2731a719c9e4808c9d1ef46a623959fef2787c60ae98"}
Apr 22 18:40:04.380928 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:04.380597 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-689895f57b-dkjvc"
Apr 22 18:40:04.380928 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:04.380610 2572 scope.go:117] "RemoveContainer" containerID="a7ddc3c19667ffbe7bfad7fa1abcc9a3e6511a710b60cdfab748366851afa3f2"
Apr 22 18:40:04.388571 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:04.388554 2572 scope.go:117] "RemoveContainer" containerID="a7ddc3c19667ffbe7bfad7fa1abcc9a3e6511a710b60cdfab748366851afa3f2"
Apr 22 18:40:04.388864 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:40:04.388819 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7ddc3c19667ffbe7bfad7fa1abcc9a3e6511a710b60cdfab748366851afa3f2\": container with ID starting with a7ddc3c19667ffbe7bfad7fa1abcc9a3e6511a710b60cdfab748366851afa3f2 not found: ID does not exist" containerID="a7ddc3c19667ffbe7bfad7fa1abcc9a3e6511a710b60cdfab748366851afa3f2"
Apr 22 18:40:04.388917 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:04.388874 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7ddc3c19667ffbe7bfad7fa1abcc9a3e6511a710b60cdfab748366851afa3f2"} err="failed to get container status \"a7ddc3c19667ffbe7bfad7fa1abcc9a3e6511a710b60cdfab748366851afa3f2\": rpc error: code = NotFound desc = could not find container \"a7ddc3c19667ffbe7bfad7fa1abcc9a3e6511a710b60cdfab748366851afa3f2\": container with ID starting with a7ddc3c19667ffbe7bfad7fa1abcc9a3e6511a710b60cdfab748366851afa3f2 not found: ID does not exist"
Apr 22 18:40:04.399982 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:04.399959 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-689895f57b-dkjvc"]
Apr 22 18:40:04.405888 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:04.405869 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-689895f57b-dkjvc"]
Apr 22 18:40:05.867633 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:05.867598 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51c59af5-e091-4da9-bd46-b4c3f52ba283" path="/var/lib/kubelet/pods/51c59af5-e091-4da9-bd46-b4c3f52ba283/volumes"
Apr 22 18:40:23.435272 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:23.435194 2572 generic.go:358] "Generic (PLEG): container finished" podID="549f6ace-6f8f-4b1a-ab20-7550de5af2f4" containerID="88321f7f649b9446959e6eb8b0876d9ff39878d9831514cd2814c34989de7e97" exitCode=0
Apr 22 18:40:23.435734 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:23.435268 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-459cl" event={"ID":"549f6ace-6f8f-4b1a-ab20-7550de5af2f4","Type":"ContainerDied","Data":"88321f7f649b9446959e6eb8b0876d9ff39878d9831514cd2814c34989de7e97"}
Apr 22 18:40:23.435734 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:23.435568 2572 scope.go:117] "RemoveContainer" containerID="88321f7f649b9446959e6eb8b0876d9ff39878d9831514cd2814c34989de7e97"
Apr 22 18:40:24.439602 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:24.439565 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-459cl" event={"ID":"549f6ace-6f8f-4b1a-ab20-7550de5af2f4","Type":"ContainerStarted","Data":"be6b6f6aa094ea0cf56ea4cb6018f5714817fd596ddc6d0a03c0fe0278dd2032"}
Apr 22 18:40:36.733501 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:36.733462 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:36.751878 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:36.751855 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:37.492585 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:37.492560 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:54.763914 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:54.763883 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 18:40:54.764541 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:54.764504 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="kube-rbac-proxy-thanos" containerID="cri-o://18c0df7d0fd52cc22e53ba9ce3f302110421a52adfeb3966473b3bcb8d9d02c8" gracePeriod=600
Apr 22 18:40:54.764619 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:54.764536 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="kube-rbac-proxy-web" containerID="cri-o://e0dba9e529729a6c225cf8f4246b736198814810eab2465002c4d612e9e9e49c" gracePeriod=600
Apr 22 18:40:54.764619 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:54.764552 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="thanos-sidecar" containerID="cri-o://cf5955d480f9a31485cc1377851248fcdb8b21ee4675db89b2163d7abcd22ef2" gracePeriod=600
Apr 22 18:40:54.764619 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:54.764586 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="kube-rbac-proxy" containerID="cri-o://f6c55f3654455343a65079eebdba5b6483d7d2866465d0cefb3f4dbed6136218" gracePeriod=600
Apr 22 18:40:54.764619 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:54.764548 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="config-reloader" containerID="cri-o://24f601fc3203f02adab6480e74162490508984b12b17ec524e5520b903d4389b" gracePeriod=600
Apr 22 18:40:54.764792 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:54.764502 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="prometheus" containerID="cri-o://a7a0a1ddab059ab72d39bc8533321b399a7954369d98eb8f6633bd157bf46800" gracePeriod=600
Apr 22 18:40:55.528274 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:55.528237 2572 generic.go:358] "Generic (PLEG): container finished" podID="f77d63bf-773a-4489-9ecc-27090d27f818" containerID="18c0df7d0fd52cc22e53ba9ce3f302110421a52adfeb3966473b3bcb8d9d02c8" exitCode=0
Apr 22 18:40:55.528274 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:55.528267 2572 generic.go:358] "Generic (PLEG): container finished" podID="f77d63bf-773a-4489-9ecc-27090d27f818" containerID="f6c55f3654455343a65079eebdba5b6483d7d2866465d0cefb3f4dbed6136218" exitCode=0
Apr 22 18:40:55.528274 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:55.528277 2572 generic.go:358] "Generic (PLEG): container finished" podID="f77d63bf-773a-4489-9ecc-27090d27f818" containerID="cf5955d480f9a31485cc1377851248fcdb8b21ee4675db89b2163d7abcd22ef2" exitCode=0
Apr 22 18:40:55.528274 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:55.528284 2572 generic.go:358] "Generic (PLEG): container finished" podID="f77d63bf-773a-4489-9ecc-27090d27f818" containerID="24f601fc3203f02adab6480e74162490508984b12b17ec524e5520b903d4389b" exitCode=0
Apr 22 18:40:55.528553 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:55.528291 2572 generic.go:358] "Generic (PLEG): container finished" podID="f77d63bf-773a-4489-9ecc-27090d27f818" containerID="a7a0a1ddab059ab72d39bc8533321b399a7954369d98eb8f6633bd157bf46800" exitCode=0
Apr 22 18:40:55.528553
ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:55.528299 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f77d63bf-773a-4489-9ecc-27090d27f818","Type":"ContainerDied","Data":"18c0df7d0fd52cc22e53ba9ce3f302110421a52adfeb3966473b3bcb8d9d02c8"} Apr 22 18:40:55.528553 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:55.528335 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f77d63bf-773a-4489-9ecc-27090d27f818","Type":"ContainerDied","Data":"f6c55f3654455343a65079eebdba5b6483d7d2866465d0cefb3f4dbed6136218"} Apr 22 18:40:55.528553 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:55.528345 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f77d63bf-773a-4489-9ecc-27090d27f818","Type":"ContainerDied","Data":"cf5955d480f9a31485cc1377851248fcdb8b21ee4675db89b2163d7abcd22ef2"} Apr 22 18:40:55.528553 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:55.528355 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f77d63bf-773a-4489-9ecc-27090d27f818","Type":"ContainerDied","Data":"24f601fc3203f02adab6480e74162490508984b12b17ec524e5520b903d4389b"} Apr 22 18:40:55.528553 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:55.528363 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f77d63bf-773a-4489-9ecc-27090d27f818","Type":"ContainerDied","Data":"a7a0a1ddab059ab72d39bc8533321b399a7954369d98eb8f6633bd157bf46800"} Apr 22 18:40:56.008366 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.008346 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:56.044000 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.043976 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrml7\" (UniqueName: \"kubernetes.io/projected/f77d63bf-773a-4489-9ecc-27090d27f818-kube-api-access-qrml7\") pod \"f77d63bf-773a-4489-9ecc-27090d27f818\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " Apr 22 18:40:56.044111 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.044011 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-configmap-serving-certs-ca-bundle\") pod \"f77d63bf-773a-4489-9ecc-27090d27f818\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " Apr 22 18:40:56.044111 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.044034 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f77d63bf-773a-4489-9ecc-27090d27f818-prometheus-k8s-db\") pod \"f77d63bf-773a-4489-9ecc-27090d27f818\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " Apr 22 18:40:56.044111 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.044066 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f77d63bf-773a-4489-9ecc-27090d27f818-config-out\") pod \"f77d63bf-773a-4489-9ecc-27090d27f818\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " Apr 22 18:40:56.044111 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.044097 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f77d63bf-773a-4489-9ecc-27090d27f818-tls-assets\") pod \"f77d63bf-773a-4489-9ecc-27090d27f818\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " Apr 22 18:40:56.045321 
ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.044719 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-thanos-prometheus-http-client-file\") pod \"f77d63bf-773a-4489-9ecc-27090d27f818\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " Apr 22 18:40:56.045321 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.044758 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-prometheus-trusted-ca-bundle\") pod \"f77d63bf-773a-4489-9ecc-27090d27f818\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " Apr 22 18:40:56.045321 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.044791 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-grpc-tls\") pod \"f77d63bf-773a-4489-9ecc-27090d27f818\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " Apr 22 18:40:56.045321 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.044813 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-config\") pod \"f77d63bf-773a-4489-9ecc-27090d27f818\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " Apr 22 18:40:56.045321 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.044856 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-prometheus-k8s-rulefiles-0\") pod \"f77d63bf-773a-4489-9ecc-27090d27f818\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " Apr 22 18:40:56.045321 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.044893 2572 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-kube-rbac-proxy\") pod \"f77d63bf-773a-4489-9ecc-27090d27f818\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " Apr 22 18:40:56.045321 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.044916 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-web-config\") pod \"f77d63bf-773a-4489-9ecc-27090d27f818\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " Apr 22 18:40:56.045321 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.044938 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "f77d63bf-773a-4489-9ecc-27090d27f818" (UID: "f77d63bf-773a-4489-9ecc-27090d27f818"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:56.045321 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.044950 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-metrics-client-certs\") pod \"f77d63bf-773a-4489-9ecc-27090d27f818\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " Apr 22 18:40:56.045321 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.045004 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-prometheus-k8s-tls\") pod \"f77d63bf-773a-4489-9ecc-27090d27f818\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " Apr 22 18:40:56.045321 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.045038 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"f77d63bf-773a-4489-9ecc-27090d27f818\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " Apr 22 18:40:56.046012 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.045557 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "f77d63bf-773a-4489-9ecc-27090d27f818" (UID: "f77d63bf-773a-4489-9ecc-27090d27f818"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:56.046012 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.045609 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-configmap-kubelet-serving-ca-bundle\") pod \"f77d63bf-773a-4489-9ecc-27090d27f818\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " Apr 22 18:40:56.046012 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.045646 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-configmap-metrics-client-ca\") pod \"f77d63bf-773a-4489-9ecc-27090d27f818\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " Apr 22 18:40:56.046012 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.045682 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"f77d63bf-773a-4489-9ecc-27090d27f818\" (UID: \"f77d63bf-773a-4489-9ecc-27090d27f818\") " Apr 22 18:40:56.046012 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.045896 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:40:56.046012 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.045914 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-prometheus-trusted-ca-bundle\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:40:56.046499 
ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.046348 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f77d63bf-773a-4489-9ecc-27090d27f818-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "f77d63bf-773a-4489-9ecc-27090d27f818" (UID: "f77d63bf-773a-4489-9ecc-27090d27f818"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:40:56.052590 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.048282 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "f77d63bf-773a-4489-9ecc-27090d27f818" (UID: "f77d63bf-773a-4489-9ecc-27090d27f818"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:56.052590 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.048808 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "f77d63bf-773a-4489-9ecc-27090d27f818" (UID: "f77d63bf-773a-4489-9ecc-27090d27f818"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:56.052590 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.050255 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f77d63bf-773a-4489-9ecc-27090d27f818-kube-api-access-qrml7" (OuterVolumeSpecName: "kube-api-access-qrml7") pod "f77d63bf-773a-4489-9ecc-27090d27f818" (UID: "f77d63bf-773a-4489-9ecc-27090d27f818"). InnerVolumeSpecName "kube-api-access-qrml7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:40:56.052590 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.050309 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f77d63bf-773a-4489-9ecc-27090d27f818-config-out" (OuterVolumeSpecName: "config-out") pod "f77d63bf-773a-4489-9ecc-27090d27f818" (UID: "f77d63bf-773a-4489-9ecc-27090d27f818"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:40:56.052590 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.050358 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "f77d63bf-773a-4489-9ecc-27090d27f818" (UID: "f77d63bf-773a-4489-9ecc-27090d27f818"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:56.052590 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.050749 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "f77d63bf-773a-4489-9ecc-27090d27f818" (UID: "f77d63bf-773a-4489-9ecc-27090d27f818"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:56.052590 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.051584 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f77d63bf-773a-4489-9ecc-27090d27f818" (UID: "f77d63bf-773a-4489-9ecc-27090d27f818"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:56.052590 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.051690 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-config" (OuterVolumeSpecName: "config") pod "f77d63bf-773a-4489-9ecc-27090d27f818" (UID: "f77d63bf-773a-4489-9ecc-27090d27f818"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:56.054216 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.053958 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "f77d63bf-773a-4489-9ecc-27090d27f818" (UID: "f77d63bf-773a-4489-9ecc-27090d27f818"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:56.054216 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.053983 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "f77d63bf-773a-4489-9ecc-27090d27f818" (UID: "f77d63bf-773a-4489-9ecc-27090d27f818"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:56.054389 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.054359 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "f77d63bf-773a-4489-9ecc-27090d27f818" (UID: "f77d63bf-773a-4489-9ecc-27090d27f818"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:56.054678 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.054392 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "f77d63bf-773a-4489-9ecc-27090d27f818" (UID: "f77d63bf-773a-4489-9ecc-27090d27f818"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:56.054678 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.054477 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f77d63bf-773a-4489-9ecc-27090d27f818-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f77d63bf-773a-4489-9ecc-27090d27f818" (UID: "f77d63bf-773a-4489-9ecc-27090d27f818"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:40:56.055669 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.055631 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "f77d63bf-773a-4489-9ecc-27090d27f818" (UID: "f77d63bf-773a-4489-9ecc-27090d27f818"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:56.066930 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.066909 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-web-config" (OuterVolumeSpecName: "web-config") pod "f77d63bf-773a-4489-9ecc-27090d27f818" (UID: "f77d63bf-773a-4489-9ecc-27090d27f818"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:56.147155 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.147079 2572 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-grpc-tls\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:40:56.147155 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.147111 2572 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-config\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:40:56.147155 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.147140 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:40:56.147155 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.147150 2572 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-kube-rbac-proxy\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:40:56.147155 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.147159 2572 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-web-config\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:40:56.147356 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.147168 2572 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-metrics-client-certs\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:40:56.147356 ip-10-0-129-26 
kubenswrapper[2572]: I0422 18:40:56.147176 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-prometheus-k8s-tls\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:40:56.147356 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.147186 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:40:56.147356 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.147197 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:40:56.147356 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.147207 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f77d63bf-773a-4489-9ecc-27090d27f818-configmap-metrics-client-ca\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:40:56.147356 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.147216 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:40:56.147356 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.147224 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qrml7\" (UniqueName: \"kubernetes.io/projected/f77d63bf-773a-4489-9ecc-27090d27f818-kube-api-access-qrml7\") on node 
\"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:40:56.147356 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.147232 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f77d63bf-773a-4489-9ecc-27090d27f818-prometheus-k8s-db\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:40:56.147356 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.147240 2572 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f77d63bf-773a-4489-9ecc-27090d27f818-config-out\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:40:56.147356 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.147248 2572 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f77d63bf-773a-4489-9ecc-27090d27f818-tls-assets\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:40:56.147356 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.147255 2572 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f77d63bf-773a-4489-9ecc-27090d27f818-thanos-prometheus-http-client-file\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:40:56.533232 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.533157 2572 generic.go:358] "Generic (PLEG): container finished" podID="f77d63bf-773a-4489-9ecc-27090d27f818" containerID="e0dba9e529729a6c225cf8f4246b736198814810eab2465002c4d612e9e9e49c" exitCode=0 Apr 22 18:40:56.533355 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.533241 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f77d63bf-773a-4489-9ecc-27090d27f818","Type":"ContainerDied","Data":"e0dba9e529729a6c225cf8f4246b736198814810eab2465002c4d612e9e9e49c"} Apr 22 18:40:56.533355 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.533270 
2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f77d63bf-773a-4489-9ecc-27090d27f818","Type":"ContainerDied","Data":"9fc5994d434b6e83b536af6408f59ed6e791aef5be5f17ffe9c52353c7de7be5"} Apr 22 18:40:56.533355 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.533290 2572 scope.go:117] "RemoveContainer" containerID="18c0df7d0fd52cc22e53ba9ce3f302110421a52adfeb3966473b3bcb8d9d02c8" Apr 22 18:40:56.533500 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.533363 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:56.540863 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.540836 2572 scope.go:117] "RemoveContainer" containerID="f6c55f3654455343a65079eebdba5b6483d7d2866465d0cefb3f4dbed6136218" Apr 22 18:40:56.547539 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.547522 2572 scope.go:117] "RemoveContainer" containerID="e0dba9e529729a6c225cf8f4246b736198814810eab2465002c4d612e9e9e49c" Apr 22 18:40:56.553858 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.553840 2572 scope.go:117] "RemoveContainer" containerID="cf5955d480f9a31485cc1377851248fcdb8b21ee4675db89b2163d7abcd22ef2" Apr 22 18:40:56.557920 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.557901 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:40:56.561873 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.561220 2572 scope.go:117] "RemoveContainer" containerID="24f601fc3203f02adab6480e74162490508984b12b17ec524e5520b903d4389b" Apr 22 18:40:56.562658 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.562630 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:40:56.567254 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.567238 2572 scope.go:117] "RemoveContainer" containerID="a7a0a1ddab059ab72d39bc8533321b399a7954369d98eb8f6633bd157bf46800" 
Apr 22 18:40:56.573636 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.573620 2572 scope.go:117] "RemoveContainer" containerID="5860eaedd864c387e4c24bf411199aa018fb48e16d2051c23a70264cb3c4bfe8"
Apr 22 18:40:56.579292 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.579278 2572 scope.go:117] "RemoveContainer" containerID="18c0df7d0fd52cc22e53ba9ce3f302110421a52adfeb3966473b3bcb8d9d02c8"
Apr 22 18:40:56.579540 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:40:56.579520 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18c0df7d0fd52cc22e53ba9ce3f302110421a52adfeb3966473b3bcb8d9d02c8\": container with ID starting with 18c0df7d0fd52cc22e53ba9ce3f302110421a52adfeb3966473b3bcb8d9d02c8 not found: ID does not exist" containerID="18c0df7d0fd52cc22e53ba9ce3f302110421a52adfeb3966473b3bcb8d9d02c8"
Apr 22 18:40:56.579584 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.579548 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18c0df7d0fd52cc22e53ba9ce3f302110421a52adfeb3966473b3bcb8d9d02c8"} err="failed to get container status \"18c0df7d0fd52cc22e53ba9ce3f302110421a52adfeb3966473b3bcb8d9d02c8\": rpc error: code = NotFound desc = could not find container \"18c0df7d0fd52cc22e53ba9ce3f302110421a52adfeb3966473b3bcb8d9d02c8\": container with ID starting with 18c0df7d0fd52cc22e53ba9ce3f302110421a52adfeb3966473b3bcb8d9d02c8 not found: ID does not exist"
Apr 22 18:40:56.579584 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.579565 2572 scope.go:117] "RemoveContainer" containerID="f6c55f3654455343a65079eebdba5b6483d7d2866465d0cefb3f4dbed6136218"
Apr 22 18:40:56.579830 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:40:56.579800 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6c55f3654455343a65079eebdba5b6483d7d2866465d0cefb3f4dbed6136218\": container with ID starting with f6c55f3654455343a65079eebdba5b6483d7d2866465d0cefb3f4dbed6136218 not found: ID does not exist" containerID="f6c55f3654455343a65079eebdba5b6483d7d2866465d0cefb3f4dbed6136218"
Apr 22 18:40:56.579932 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.579830 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6c55f3654455343a65079eebdba5b6483d7d2866465d0cefb3f4dbed6136218"} err="failed to get container status \"f6c55f3654455343a65079eebdba5b6483d7d2866465d0cefb3f4dbed6136218\": rpc error: code = NotFound desc = could not find container \"f6c55f3654455343a65079eebdba5b6483d7d2866465d0cefb3f4dbed6136218\": container with ID starting with f6c55f3654455343a65079eebdba5b6483d7d2866465d0cefb3f4dbed6136218 not found: ID does not exist"
Apr 22 18:40:56.579932 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.579851 2572 scope.go:117] "RemoveContainer" containerID="e0dba9e529729a6c225cf8f4246b736198814810eab2465002c4d612e9e9e49c"
Apr 22 18:40:56.580093 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:40:56.580078 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0dba9e529729a6c225cf8f4246b736198814810eab2465002c4d612e9e9e49c\": container with ID starting with e0dba9e529729a6c225cf8f4246b736198814810eab2465002c4d612e9e9e49c not found: ID does not exist" containerID="e0dba9e529729a6c225cf8f4246b736198814810eab2465002c4d612e9e9e49c"
Apr 22 18:40:56.580162 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.580099 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0dba9e529729a6c225cf8f4246b736198814810eab2465002c4d612e9e9e49c"} err="failed to get container status \"e0dba9e529729a6c225cf8f4246b736198814810eab2465002c4d612e9e9e49c\": rpc error: code = NotFound desc = could not find container \"e0dba9e529729a6c225cf8f4246b736198814810eab2465002c4d612e9e9e49c\": container with ID starting with e0dba9e529729a6c225cf8f4246b736198814810eab2465002c4d612e9e9e49c not found: ID does not exist"
Apr 22 18:40:56.580162 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.580137 2572 scope.go:117] "RemoveContainer" containerID="cf5955d480f9a31485cc1377851248fcdb8b21ee4675db89b2163d7abcd22ef2"
Apr 22 18:40:56.580419 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:40:56.580394 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf5955d480f9a31485cc1377851248fcdb8b21ee4675db89b2163d7abcd22ef2\": container with ID starting with cf5955d480f9a31485cc1377851248fcdb8b21ee4675db89b2163d7abcd22ef2 not found: ID does not exist" containerID="cf5955d480f9a31485cc1377851248fcdb8b21ee4675db89b2163d7abcd22ef2"
Apr 22 18:40:56.580512 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.580425 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf5955d480f9a31485cc1377851248fcdb8b21ee4675db89b2163d7abcd22ef2"} err="failed to get container status \"cf5955d480f9a31485cc1377851248fcdb8b21ee4675db89b2163d7abcd22ef2\": rpc error: code = NotFound desc = could not find container \"cf5955d480f9a31485cc1377851248fcdb8b21ee4675db89b2163d7abcd22ef2\": container with ID starting with cf5955d480f9a31485cc1377851248fcdb8b21ee4675db89b2163d7abcd22ef2 not found: ID does not exist"
Apr 22 18:40:56.580512 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.580440 2572 scope.go:117] "RemoveContainer" containerID="24f601fc3203f02adab6480e74162490508984b12b17ec524e5520b903d4389b"
Apr 22 18:40:56.580670 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:40:56.580654 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24f601fc3203f02adab6480e74162490508984b12b17ec524e5520b903d4389b\": container with ID starting with 24f601fc3203f02adab6480e74162490508984b12b17ec524e5520b903d4389b not found: ID does not exist" containerID="24f601fc3203f02adab6480e74162490508984b12b17ec524e5520b903d4389b"
Apr 22 18:40:56.580706 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.580674 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24f601fc3203f02adab6480e74162490508984b12b17ec524e5520b903d4389b"} err="failed to get container status \"24f601fc3203f02adab6480e74162490508984b12b17ec524e5520b903d4389b\": rpc error: code = NotFound desc = could not find container \"24f601fc3203f02adab6480e74162490508984b12b17ec524e5520b903d4389b\": container with ID starting with 24f601fc3203f02adab6480e74162490508984b12b17ec524e5520b903d4389b not found: ID does not exist"
Apr 22 18:40:56.580706 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.580686 2572 scope.go:117] "RemoveContainer" containerID="a7a0a1ddab059ab72d39bc8533321b399a7954369d98eb8f6633bd157bf46800"
Apr 22 18:40:56.580917 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:40:56.580901 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7a0a1ddab059ab72d39bc8533321b399a7954369d98eb8f6633bd157bf46800\": container with ID starting with a7a0a1ddab059ab72d39bc8533321b399a7954369d98eb8f6633bd157bf46800 not found: ID does not exist" containerID="a7a0a1ddab059ab72d39bc8533321b399a7954369d98eb8f6633bd157bf46800"
Apr 22 18:40:56.580955 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.580923 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7a0a1ddab059ab72d39bc8533321b399a7954369d98eb8f6633bd157bf46800"} err="failed to get container status \"a7a0a1ddab059ab72d39bc8533321b399a7954369d98eb8f6633bd157bf46800\": rpc error: code = NotFound desc = could not find container \"a7a0a1ddab059ab72d39bc8533321b399a7954369d98eb8f6633bd157bf46800\": container with ID starting with a7a0a1ddab059ab72d39bc8533321b399a7954369d98eb8f6633bd157bf46800 not found: ID does not exist"
Apr 22 18:40:56.580955 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.580938 2572 scope.go:117] "RemoveContainer" containerID="5860eaedd864c387e4c24bf411199aa018fb48e16d2051c23a70264cb3c4bfe8"
Apr 22 18:40:56.581173 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:40:56.581157 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5860eaedd864c387e4c24bf411199aa018fb48e16d2051c23a70264cb3c4bfe8\": container with ID starting with 5860eaedd864c387e4c24bf411199aa018fb48e16d2051c23a70264cb3c4bfe8 not found: ID does not exist" containerID="5860eaedd864c387e4c24bf411199aa018fb48e16d2051c23a70264cb3c4bfe8"
Apr 22 18:40:56.581221 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.581178 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5860eaedd864c387e4c24bf411199aa018fb48e16d2051c23a70264cb3c4bfe8"} err="failed to get container status \"5860eaedd864c387e4c24bf411199aa018fb48e16d2051c23a70264cb3c4bfe8\": rpc error: code = NotFound desc = could not find container \"5860eaedd864c387e4c24bf411199aa018fb48e16d2051c23a70264cb3c4bfe8\": container with ID starting with 5860eaedd864c387e4c24bf411199aa018fb48e16d2051c23a70264cb3c4bfe8 not found: ID does not exist"
Apr 22 18:40:56.589981 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.589961 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 18:40:56.590262 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590249 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="prometheus"
Apr 22 18:40:56.590311 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590264 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="prometheus"
Apr 22 18:40:56.590311 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590272 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="config-reloader"
Apr 22 18:40:56.590311 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590278 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="config-reloader"
Apr 22 18:40:56.590311 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590286 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="kube-rbac-proxy-web"
Apr 22 18:40:56.590311 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590292 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="kube-rbac-proxy-web"
Apr 22 18:40:56.590311 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590304 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="kube-rbac-proxy-thanos"
Apr 22 18:40:56.590311 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590311 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="kube-rbac-proxy-thanos"
Apr 22 18:40:56.590510 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590326 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="init-config-reloader"
Apr 22 18:40:56.590510 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590334 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="init-config-reloader"
Apr 22 18:40:56.590510 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590345 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51c59af5-e091-4da9-bd46-b4c3f52ba283" containerName="registry"
Apr 22 18:40:56.590510 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590350 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="51c59af5-e091-4da9-bd46-b4c3f52ba283" containerName="registry"
Apr 22 18:40:56.590510 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590355 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="kube-rbac-proxy"
Apr 22 18:40:56.590510 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590361 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="kube-rbac-proxy"
Apr 22 18:40:56.590510 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590367 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="thanos-sidecar"
Apr 22 18:40:56.590510 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590372 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="thanos-sidecar"
Apr 22 18:40:56.590510 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590418 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="51c59af5-e091-4da9-bd46-b4c3f52ba283" containerName="registry"
Apr 22 18:40:56.590510 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590428 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="config-reloader"
Apr 22 18:40:56.590510 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590434 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="kube-rbac-proxy-web"
Apr 22 18:40:56.590510 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590440 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="kube-rbac-proxy-thanos"
Apr 22 18:40:56.590510 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590446 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="prometheus"
Apr 22 18:40:56.590510 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590453 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="thanos-sidecar"
Apr 22 18:40:56.590510 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.590458 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" containerName="kube-rbac-proxy"
Apr 22 18:40:56.595682 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.595666 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.598611 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.598582 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 22 18:40:56.598764 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.598633 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 22 18:40:56.598764 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.598645 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 22 18:40:56.598764 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.598697 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 22 18:40:56.598764 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.598725 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 22 18:40:56.599007 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.598789 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 22 18:40:56.599007 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.598961 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 22 18:40:56.599108 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.599031 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 22 18:40:56.599207 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.599190 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 22 18:40:56.599570 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.599550 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-qcj2x\""
Apr 22 18:40:56.599681 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.599665 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 22 18:40:56.599742 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.599729 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 22 18:40:56.601991 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.601974 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-7im6e1mtu70a9\""
Apr 22 18:40:56.604080 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.604062 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 22 18:40:56.606328 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.606313 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 22 18:40:56.609477 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.609452 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 18:40:56.650386 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.650361 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq64w\" (UniqueName: \"kubernetes.io/projected/f26585cf-02f2-4a4c-81f8-febebc4df42e-kube-api-access-fq64w\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.650471 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.650395 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-config\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.650471 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.650413 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f26585cf-02f2-4a4c-81f8-febebc4df42e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.650471 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.650429 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f26585cf-02f2-4a4c-81f8-febebc4df42e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.650471 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.650462 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f26585cf-02f2-4a4c-81f8-febebc4df42e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.650628 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.650499 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f26585cf-02f2-4a4c-81f8-febebc4df42e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.650628 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.650512 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f26585cf-02f2-4a4c-81f8-febebc4df42e-config-out\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.650628 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.650528 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.650628 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.650563 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.650628 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.650587 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f26585cf-02f2-4a4c-81f8-febebc4df42e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.650628 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.650614 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.650817 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.650630 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.650817 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.650648 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.650817 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.650690 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-web-config\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.650817 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.650734 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.650817 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.650752 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f26585cf-02f2-4a4c-81f8-febebc4df42e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.650817 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.650774 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.650817 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.650798 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f26585cf-02f2-4a4c-81f8-febebc4df42e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.751773 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.751740 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fq64w\" (UniqueName: \"kubernetes.io/projected/f26585cf-02f2-4a4c-81f8-febebc4df42e-kube-api-access-fq64w\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.751869 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.751782 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-config\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.751869 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.751799 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f26585cf-02f2-4a4c-81f8-febebc4df42e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.751869 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.751815 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f26585cf-02f2-4a4c-81f8-febebc4df42e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.751869 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.751831 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f26585cf-02f2-4a4c-81f8-febebc4df42e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.751869 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.751859 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f26585cf-02f2-4a4c-81f8-febebc4df42e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.752091 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.751986 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f26585cf-02f2-4a4c-81f8-febebc4df42e-config-out\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.752091 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.752033 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.752091 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.752068 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.752619 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.752588 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f26585cf-02f2-4a4c-81f8-febebc4df42e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.752711 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.752667 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f26585cf-02f2-4a4c-81f8-febebc4df42e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.752775 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.752719 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.752775 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.752744 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.753040 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.752773 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.753040 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.752801 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-web-config\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.753040 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.752824 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f26585cf-02f2-4a4c-81f8-febebc4df42e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.753040 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.752858 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.753040 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.752888 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f26585cf-02f2-4a4c-81f8-febebc4df42e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.753040 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.752917 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.753040 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.752941 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f26585cf-02f2-4a4c-81f8-febebc4df42e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.754070 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.753774 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f26585cf-02f2-4a4c-81f8-febebc4df42e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.755089 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.755057 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f26585cf-02f2-4a4c-81f8-febebc4df42e-config-out\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.755210 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.755196 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f26585cf-02f2-4a4c-81f8-febebc4df42e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.755275 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.755231 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.755429 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.755406 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f26585cf-02f2-4a4c-81f8-febebc4df42e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.755508 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.755443 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-config\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.755711 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.755693 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.755791 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.755771 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-web-config\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.756378 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.756259 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f26585cf-02f2-4a4c-81f8-febebc4df42e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.756495 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.756299 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f26585cf-02f2-4a4c-81f8-febebc4df42e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.756569 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.756495 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.757430 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.757410 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.757840 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.757825 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:40:56.758382 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.758360 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName:
\"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:56.758561 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.758545 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f26585cf-02f2-4a4c-81f8-febebc4df42e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:56.761669 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.761654 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq64w\" (UniqueName: \"kubernetes.io/projected/f26585cf-02f2-4a4c-81f8-febebc4df42e-kube-api-access-fq64w\") pod \"prometheus-k8s-0\" (UID: \"f26585cf-02f2-4a4c-81f8-febebc4df42e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:56.906736 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:56.906719 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:57.040637 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:57.040607 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:40:57.044537 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:40:57.044510 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf26585cf_02f2_4a4c_81f8_febebc4df42e.slice/crio-04321ecd2875f5c7e8172325fd93f46050b01a3a4a1799245fb5e9f5382ce9cc WatchSource:0}: Error finding container 04321ecd2875f5c7e8172325fd93f46050b01a3a4a1799245fb5e9f5382ce9cc: Status 404 returned error can't find the container with id 04321ecd2875f5c7e8172325fd93f46050b01a3a4a1799245fb5e9f5382ce9cc Apr 22 18:40:57.538833 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:57.538761 2572 generic.go:358] "Generic (PLEG): container finished" podID="f26585cf-02f2-4a4c-81f8-febebc4df42e" containerID="4d0840161e723e1ef2808147ef20fa3daf5a06f027a0399d84800405d13e364f" exitCode=0 Apr 22 18:40:57.538964 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:57.538844 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f26585cf-02f2-4a4c-81f8-febebc4df42e","Type":"ContainerDied","Data":"4d0840161e723e1ef2808147ef20fa3daf5a06f027a0399d84800405d13e364f"} Apr 22 18:40:57.538964 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:57.538878 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f26585cf-02f2-4a4c-81f8-febebc4df42e","Type":"ContainerStarted","Data":"04321ecd2875f5c7e8172325fd93f46050b01a3a4a1799245fb5e9f5382ce9cc"} Apr 22 18:40:57.871676 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:57.871630 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f77d63bf-773a-4489-9ecc-27090d27f818" 
path="/var/lib/kubelet/pods/f77d63bf-773a-4489-9ecc-27090d27f818/volumes" Apr 22 18:40:58.545200 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:58.545166 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f26585cf-02f2-4a4c-81f8-febebc4df42e","Type":"ContainerStarted","Data":"c9a326db7543747e4921044e54516212aa380acc3be2b6427aaab603cb9cbd78"} Apr 22 18:40:58.545200 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:58.545200 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f26585cf-02f2-4a4c-81f8-febebc4df42e","Type":"ContainerStarted","Data":"3ff30bb3e74def88cb345ef7082b421115334839d265cee055391b91cb354d3c"} Apr 22 18:40:58.545598 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:58.545210 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f26585cf-02f2-4a4c-81f8-febebc4df42e","Type":"ContainerStarted","Data":"d9cb9b5b3ec06325aa807159e434479e6ba24b8cad64ca592fe540e9a8a42601"} Apr 22 18:40:58.545598 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:58.545219 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f26585cf-02f2-4a4c-81f8-febebc4df42e","Type":"ContainerStarted","Data":"5f3ef41a7c75740437ba4e2c4a09c061e01fedceedbb7a9b1b2cc1c2bf78914a"} Apr 22 18:40:58.545598 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:58.545229 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f26585cf-02f2-4a4c-81f8-febebc4df42e","Type":"ContainerStarted","Data":"fcfb253db139dfc48c9f9e401fe4a2df181335565d411053168e731190acc84a"} Apr 22 18:40:58.545598 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:58.545237 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"f26585cf-02f2-4a4c-81f8-febebc4df42e","Type":"ContainerStarted","Data":"a41852b62cb5c8a21c67e99aac9ceb2c8bce443202f6677ebf304e9c365fe071"} Apr 22 18:40:58.574025 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:40:58.573974 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.573961608 podStartE2EDuration="2.573961608s" podCreationTimestamp="2026-04-22 18:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:40:58.572563346 +0000 UTC m=+213.176717350" watchObservedRunningTime="2026-04-22 18:40:58.573961608 +0000 UTC m=+213.178115674" Apr 22 18:41:01.907738 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:01.907710 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:41:50.293670 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.293590 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78"] Apr 22 18:41:50.296639 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.296621 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" Apr 22 18:41:50.299205 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.299184 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 18:41:50.299334 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.299184 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 18:41:50.299433 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.299418 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 18:41:50.299519 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.299504 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 18:41:50.300514 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.300498 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 18:41:50.300614 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.300539 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 18:41:50.300614 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.300548 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 18:41:50.308819 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.308799 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78"] Apr 22 18:41:50.441038 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.441011 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/ab7e4cb4-8164-400e-90ed-d1c9f1c2a933-ca\") pod \"cluster-proxy-proxy-agent-84c66d7898-jmw78\" (UID: \"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" Apr 22 18:41:50.441175 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.441045 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/ab7e4cb4-8164-400e-90ed-d1c9f1c2a933-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-84c66d7898-jmw78\" (UID: \"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" Apr 22 18:41:50.441175 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.441140 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ab7e4cb4-8164-400e-90ed-d1c9f1c2a933-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-84c66d7898-jmw78\" (UID: \"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" Apr 22 18:41:50.441274 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.441176 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/ab7e4cb4-8164-400e-90ed-d1c9f1c2a933-hub\") pod \"cluster-proxy-proxy-agent-84c66d7898-jmw78\" (UID: \"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" Apr 22 18:41:50.441274 ip-10-0-129-26 
kubenswrapper[2572]: I0422 18:41:50.441198 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4c55\" (UniqueName: \"kubernetes.io/projected/ab7e4cb4-8164-400e-90ed-d1c9f1c2a933-kube-api-access-k4c55\") pod \"cluster-proxy-proxy-agent-84c66d7898-jmw78\" (UID: \"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" Apr 22 18:41:50.441274 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.441241 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/ab7e4cb4-8164-400e-90ed-d1c9f1c2a933-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-84c66d7898-jmw78\" (UID: \"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" Apr 22 18:41:50.541686 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.541656 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/ab7e4cb4-8164-400e-90ed-d1c9f1c2a933-hub\") pod \"cluster-proxy-proxy-agent-84c66d7898-jmw78\" (UID: \"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" Apr 22 18:41:50.541790 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.541696 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4c55\" (UniqueName: \"kubernetes.io/projected/ab7e4cb4-8164-400e-90ed-d1c9f1c2a933-kube-api-access-k4c55\") pod \"cluster-proxy-proxy-agent-84c66d7898-jmw78\" (UID: \"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" Apr 22 18:41:50.541790 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.541722 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/ab7e4cb4-8164-400e-90ed-d1c9f1c2a933-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-84c66d7898-jmw78\" (UID: \"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" Apr 22 18:41:50.541790 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.541781 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/ab7e4cb4-8164-400e-90ed-d1c9f1c2a933-ca\") pod \"cluster-proxy-proxy-agent-84c66d7898-jmw78\" (UID: \"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" Apr 22 18:41:50.541955 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.541804 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/ab7e4cb4-8164-400e-90ed-d1c9f1c2a933-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-84c66d7898-jmw78\" (UID: \"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" Apr 22 18:41:50.541955 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.541859 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ab7e4cb4-8164-400e-90ed-d1c9f1c2a933-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-84c66d7898-jmw78\" (UID: \"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" Apr 22 18:41:50.542554 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.542533 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/ab7e4cb4-8164-400e-90ed-d1c9f1c2a933-ocpservice-ca\") pod 
\"cluster-proxy-proxy-agent-84c66d7898-jmw78\" (UID: \"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" Apr 22 18:41:50.544496 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.544436 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/ab7e4cb4-8164-400e-90ed-d1c9f1c2a933-ca\") pod \"cluster-proxy-proxy-agent-84c66d7898-jmw78\" (UID: \"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" Apr 22 18:41:50.544496 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.544443 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/ab7e4cb4-8164-400e-90ed-d1c9f1c2a933-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-84c66d7898-jmw78\" (UID: \"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" Apr 22 18:41:50.544603 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.544544 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/ab7e4cb4-8164-400e-90ed-d1c9f1c2a933-hub\") pod \"cluster-proxy-proxy-agent-84c66d7898-jmw78\" (UID: \"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" Apr 22 18:41:50.544666 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.544651 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ab7e4cb4-8164-400e-90ed-d1c9f1c2a933-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-84c66d7898-jmw78\" (UID: \"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" Apr 22 18:41:50.550627 ip-10-0-129-26 
kubenswrapper[2572]: I0422 18:41:50.550609 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4c55\" (UniqueName: \"kubernetes.io/projected/ab7e4cb4-8164-400e-90ed-d1c9f1c2a933-kube-api-access-k4c55\") pod \"cluster-proxy-proxy-agent-84c66d7898-jmw78\" (UID: \"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" Apr 22 18:41:50.621662 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.621637 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" Apr 22 18:41:50.742783 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:50.742751 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78"] Apr 22 18:41:50.745939 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:41:50.745906 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab7e4cb4_8164_400e_90ed_d1c9f1c2a933.slice/crio-6204ab8f6d32475c00d9b08c311d9a4f5c3973b384dc57f78547bf537f08102f WatchSource:0}: Error finding container 6204ab8f6d32475c00d9b08c311d9a4f5c3973b384dc57f78547bf537f08102f: Status 404 returned error can't find the container with id 6204ab8f6d32475c00d9b08c311d9a4f5c3973b384dc57f78547bf537f08102f Apr 22 18:41:51.701465 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:51.701403 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" event={"ID":"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933","Type":"ContainerStarted","Data":"6204ab8f6d32475c00d9b08c311d9a4f5c3973b384dc57f78547bf537f08102f"} Apr 22 18:41:54.710338 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:54.710300 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" event={"ID":"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933","Type":"ContainerStarted","Data":"3b8a45dcdb18385c8cefabf2ec7e97e9ac78845251d74108fb2ac45ab363c143"} Apr 22 18:41:56.907463 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:56.907422 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:41:56.926432 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:56.926396 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:41:57.720583 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:57.720540 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" event={"ID":"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933","Type":"ContainerStarted","Data":"e5c53a0f7683d036a8cdcb33ace17fbe54c16666772fa2803a62484558f1ef04"} Apr 22 18:41:57.720583 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:57.720585 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" event={"ID":"ab7e4cb4-8164-400e-90ed-d1c9f1c2a933","Type":"ContainerStarted","Data":"9365d21db2bc60a9392c318d17eeb38cd77f6b32647de721200c178c6d24bed8"} Apr 22 18:41:57.736465 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:57.736441 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:41:57.742223 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:41:57.742174 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c66d7898-jmw78" podStartSLOduration=1.490854257 podStartE2EDuration="7.742160823s" podCreationTimestamp="2026-04-22 18:41:50 +0000 UTC" firstStartedPulling="2026-04-22 18:41:50.747675243 +0000 
UTC m=+265.351829228" lastFinishedPulling="2026-04-22 18:41:56.998981813 +0000 UTC m=+271.603135794" observedRunningTime="2026-04-22 18:41:57.740640907 +0000 UTC m=+272.344794924" watchObservedRunningTime="2026-04-22 18:41:57.742160823 +0000 UTC m=+272.346314826" Apr 22 18:42:25.781105 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:42:25.781080 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6f6sj_2508d1bd-b2cc-4edf-8bd2-34590e71e3e0/console-operator/1.log" Apr 22 18:42:25.781525 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:42:25.781089 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6f6sj_2508d1bd-b2cc-4edf-8bd2-34590e71e3e0/console-operator/1.log" Apr 22 18:42:25.785783 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:42:25.785762 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/ovn-acl-logging/0.log" Apr 22 18:42:25.786095 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:42:25.786079 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/ovn-acl-logging/0.log" Apr 22 18:42:25.790579 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:42:25.790559 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:43:51.577519 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:43:51.577442 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-h72t6"] Apr 22 18:43:51.579472 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:43:51.579457 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h72t6" Apr 22 18:43:51.581786 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:43:51.581761 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:43:51.581901 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:43:51.581797 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 22 18:43:51.582641 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:43:51.582626 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-bbfpb\"" Apr 22 18:43:51.590468 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:43:51.590448 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-h72t6"] Apr 22 18:43:51.676169 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:43:51.676138 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/86957f8e-9ab0-4e50-b65d-3b12fff12553-tmp\") pod \"openshift-lws-operator-bfc7f696d-h72t6\" (UID: \"86957f8e-9ab0-4e50-b65d-3b12fff12553\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h72t6" Apr 22 18:43:51.676169 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:43:51.676173 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fbrm\" (UniqueName: \"kubernetes.io/projected/86957f8e-9ab0-4e50-b65d-3b12fff12553-kube-api-access-8fbrm\") pod \"openshift-lws-operator-bfc7f696d-h72t6\" (UID: \"86957f8e-9ab0-4e50-b65d-3b12fff12553\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h72t6" Apr 22 18:43:51.776568 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:43:51.776546 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/86957f8e-9ab0-4e50-b65d-3b12fff12553-tmp\") pod \"openshift-lws-operator-bfc7f696d-h72t6\" (UID: \"86957f8e-9ab0-4e50-b65d-3b12fff12553\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h72t6" Apr 22 18:43:51.776666 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:43:51.776572 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fbrm\" (UniqueName: \"kubernetes.io/projected/86957f8e-9ab0-4e50-b65d-3b12fff12553-kube-api-access-8fbrm\") pod \"openshift-lws-operator-bfc7f696d-h72t6\" (UID: \"86957f8e-9ab0-4e50-b65d-3b12fff12553\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h72t6" Apr 22 18:43:51.776900 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:43:51.776881 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/86957f8e-9ab0-4e50-b65d-3b12fff12553-tmp\") pod \"openshift-lws-operator-bfc7f696d-h72t6\" (UID: \"86957f8e-9ab0-4e50-b65d-3b12fff12553\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h72t6" Apr 22 18:43:51.786360 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:43:51.786340 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fbrm\" (UniqueName: \"kubernetes.io/projected/86957f8e-9ab0-4e50-b65d-3b12fff12553-kube-api-access-8fbrm\") pod \"openshift-lws-operator-bfc7f696d-h72t6\" (UID: \"86957f8e-9ab0-4e50-b65d-3b12fff12553\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h72t6" Apr 22 18:43:51.888706 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:43:51.888653 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h72t6" Apr 22 18:43:52.004894 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:43:52.004828 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-h72t6"] Apr 22 18:43:52.007888 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:43:52.007862 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86957f8e_9ab0_4e50_b65d_3b12fff12553.slice/crio-feeb4dcf0a08bc191b8e667c86a2067865054ff660b2c243ddca655b308717d5 WatchSource:0}: Error finding container feeb4dcf0a08bc191b8e667c86a2067865054ff660b2c243ddca655b308717d5: Status 404 returned error can't find the container with id feeb4dcf0a08bc191b8e667c86a2067865054ff660b2c243ddca655b308717d5 Apr 22 18:43:52.009235 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:43:52.009220 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:43:52.040077 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:43:52.040050 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h72t6" event={"ID":"86957f8e-9ab0-4e50-b65d-3b12fff12553","Type":"ContainerStarted","Data":"feeb4dcf0a08bc191b8e667c86a2067865054ff660b2c243ddca655b308717d5"} Apr 22 18:43:55.050775 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:43:55.050703 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h72t6" event={"ID":"86957f8e-9ab0-4e50-b65d-3b12fff12553","Type":"ContainerStarted","Data":"cbb83718aa35602fb5a7e9a919c81b19caa3a748247ff88c2d70ae32fe693f74"} Apr 22 18:43:55.069731 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:43:55.069686 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h72t6" 
podStartSLOduration=1.402640918 podStartE2EDuration="4.069673292s" podCreationTimestamp="2026-04-22 18:43:51 +0000 UTC" firstStartedPulling="2026-04-22 18:43:52.009339952 +0000 UTC m=+386.613493933" lastFinishedPulling="2026-04-22 18:43:54.676372325 +0000 UTC m=+389.280526307" observedRunningTime="2026-04-22 18:43:55.068489243 +0000 UTC m=+389.672643251" watchObservedRunningTime="2026-04-22 18:43:55.069673292 +0000 UTC m=+389.673827296" Apr 22 18:44:18.737202 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:18.737171 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd"] Apr 22 18:44:18.739481 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:18.739459 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd" Apr 22 18:44:18.744268 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:18.744248 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 22 18:44:18.744361 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:18.744253 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 22 18:44:18.744571 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:18.744556 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 22 18:44:18.744638 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:18.744575 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-cpc6t\"" Apr 22 18:44:18.757184 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:18.757164 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd"] Apr 22 18:44:18.768034 ip-10-0-129-26 kubenswrapper[2572]: 
I0422 18:44:18.768011 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/631657e2-8aea-48cd-b04c-05ad688d76da-cert\") pod \"lws-controller-manager-59b94d4c58-5nzpd\" (UID: \"631657e2-8aea-48cd-b04c-05ad688d76da\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd" Apr 22 18:44:18.768150 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:18.768050 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/631657e2-8aea-48cd-b04c-05ad688d76da-manager-config\") pod \"lws-controller-manager-59b94d4c58-5nzpd\" (UID: \"631657e2-8aea-48cd-b04c-05ad688d76da\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd" Apr 22 18:44:18.768150 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:18.768081 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/631657e2-8aea-48cd-b04c-05ad688d76da-metrics-cert\") pod \"lws-controller-manager-59b94d4c58-5nzpd\" (UID: \"631657e2-8aea-48cd-b04c-05ad688d76da\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd" Apr 22 18:44:18.768150 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:18.768139 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bjp9\" (UniqueName: \"kubernetes.io/projected/631657e2-8aea-48cd-b04c-05ad688d76da-kube-api-access-5bjp9\") pod \"lws-controller-manager-59b94d4c58-5nzpd\" (UID: \"631657e2-8aea-48cd-b04c-05ad688d76da\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd" Apr 22 18:44:18.869518 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:18.869487 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/631657e2-8aea-48cd-b04c-05ad688d76da-cert\") pod \"lws-controller-manager-59b94d4c58-5nzpd\" (UID: \"631657e2-8aea-48cd-b04c-05ad688d76da\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd" Apr 22 18:44:18.869518 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:18.869522 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/631657e2-8aea-48cd-b04c-05ad688d76da-manager-config\") pod \"lws-controller-manager-59b94d4c58-5nzpd\" (UID: \"631657e2-8aea-48cd-b04c-05ad688d76da\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd" Apr 22 18:44:18.869746 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:18.869542 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/631657e2-8aea-48cd-b04c-05ad688d76da-metrics-cert\") pod \"lws-controller-manager-59b94d4c58-5nzpd\" (UID: \"631657e2-8aea-48cd-b04c-05ad688d76da\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd" Apr 22 18:44:18.869746 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:18.869655 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bjp9\" (UniqueName: \"kubernetes.io/projected/631657e2-8aea-48cd-b04c-05ad688d76da-kube-api-access-5bjp9\") pod \"lws-controller-manager-59b94d4c58-5nzpd\" (UID: \"631657e2-8aea-48cd-b04c-05ad688d76da\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd" Apr 22 18:44:18.870161 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:18.870118 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/631657e2-8aea-48cd-b04c-05ad688d76da-manager-config\") pod \"lws-controller-manager-59b94d4c58-5nzpd\" (UID: \"631657e2-8aea-48cd-b04c-05ad688d76da\") " 
pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd" Apr 22 18:44:18.872023 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:18.872001 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/631657e2-8aea-48cd-b04c-05ad688d76da-cert\") pod \"lws-controller-manager-59b94d4c58-5nzpd\" (UID: \"631657e2-8aea-48cd-b04c-05ad688d76da\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd" Apr 22 18:44:18.872215 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:18.872198 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/631657e2-8aea-48cd-b04c-05ad688d76da-metrics-cert\") pod \"lws-controller-manager-59b94d4c58-5nzpd\" (UID: \"631657e2-8aea-48cd-b04c-05ad688d76da\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd" Apr 22 18:44:18.884193 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:18.884167 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bjp9\" (UniqueName: \"kubernetes.io/projected/631657e2-8aea-48cd-b04c-05ad688d76da-kube-api-access-5bjp9\") pod \"lws-controller-manager-59b94d4c58-5nzpd\" (UID: \"631657e2-8aea-48cd-b04c-05ad688d76da\") " pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd" Apr 22 18:44:19.048098 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:19.048026 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd" Apr 22 18:44:19.180621 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:19.180565 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd"] Apr 22 18:44:19.183075 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:44:19.183051 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod631657e2_8aea_48cd_b04c_05ad688d76da.slice/crio-b0a3f433c5095fd93539c4f7f01271094f50f1f3c1b19a68368d175f7c3974bf WatchSource:0}: Error finding container b0a3f433c5095fd93539c4f7f01271094f50f1f3c1b19a68368d175f7c3974bf: Status 404 returned error can't find the container with id b0a3f433c5095fd93539c4f7f01271094f50f1f3c1b19a68368d175f7c3974bf Apr 22 18:44:20.121433 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:20.121396 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd" event={"ID":"631657e2-8aea-48cd-b04c-05ad688d76da","Type":"ContainerStarted","Data":"b0a3f433c5095fd93539c4f7f01271094f50f1f3c1b19a68368d175f7c3974bf"} Apr 22 18:44:21.125701 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:21.125667 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd" event={"ID":"631657e2-8aea-48cd-b04c-05ad688d76da","Type":"ContainerStarted","Data":"e00bdce0d1b4a3d6047f9d97c71afe853dcfa55707bdb575f95ef5ac221bf35f"} Apr 22 18:44:21.126036 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:21.125751 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd" Apr 22 18:44:21.148327 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:21.148287 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd" podStartSLOduration=1.372129766 podStartE2EDuration="3.148274917s" podCreationTimestamp="2026-04-22 18:44:18 +0000 UTC" firstStartedPulling="2026-04-22 18:44:19.184671726 +0000 UTC m=+413.788825711" lastFinishedPulling="2026-04-22 18:44:20.960816873 +0000 UTC m=+415.564970862" observedRunningTime="2026-04-22 18:44:21.14685779 +0000 UTC m=+415.751011795" watchObservedRunningTime="2026-04-22 18:44:21.148274917 +0000 UTC m=+415.752428920" Apr 22 18:44:32.131476 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:44:32.131448 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-59b94d4c58-5nzpd" Apr 22 18:45:20.215341 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:20.215303 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-wwkv9"] Apr 22 18:45:20.218383 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:20.218366 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wwkv9" Apr 22 18:45:20.220773 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:20.220744 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-wq5cq\"" Apr 22 18:45:20.221019 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:20.220992 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 18:45:20.221159 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:20.221062 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 18:45:20.221159 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:20.221062 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 22 18:45:20.231849 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:20.231823 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-wwkv9"] Apr 22 18:45:20.318089 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:20.318061 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkntm\" (UniqueName: \"kubernetes.io/projected/63cc8843-8d69-4cab-87bc-be1697b84e55-kube-api-access-zkntm\") pod \"dns-operator-controller-manager-844548ff4c-wwkv9\" (UID: \"63cc8843-8d69-4cab-87bc-be1697b84e55\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wwkv9" Apr 22 18:45:20.418873 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:20.418843 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkntm\" (UniqueName: \"kubernetes.io/projected/63cc8843-8d69-4cab-87bc-be1697b84e55-kube-api-access-zkntm\") pod \"dns-operator-controller-manager-844548ff4c-wwkv9\" 
(UID: \"63cc8843-8d69-4cab-87bc-be1697b84e55\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wwkv9" Apr 22 18:45:20.427344 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:20.427322 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkntm\" (UniqueName: \"kubernetes.io/projected/63cc8843-8d69-4cab-87bc-be1697b84e55-kube-api-access-zkntm\") pod \"dns-operator-controller-manager-844548ff4c-wwkv9\" (UID: \"63cc8843-8d69-4cab-87bc-be1697b84e55\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wwkv9" Apr 22 18:45:20.528469 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:20.528412 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wwkv9" Apr 22 18:45:20.654049 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:20.653996 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-wwkv9"] Apr 22 18:45:20.656916 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:45:20.656887 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63cc8843_8d69_4cab_87bc_be1697b84e55.slice/crio-f372dc0cb25fe1b1fb4021cce258e082a43ae66f96422e1d57e82c0144cb94de WatchSource:0}: Error finding container f372dc0cb25fe1b1fb4021cce258e082a43ae66f96422e1d57e82c0144cb94de: Status 404 returned error can't find the container with id f372dc0cb25fe1b1fb4021cce258e082a43ae66f96422e1d57e82c0144cb94de Apr 22 18:45:21.290583 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:21.290553 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wwkv9" event={"ID":"63cc8843-8d69-4cab-87bc-be1697b84e55","Type":"ContainerStarted","Data":"f372dc0cb25fe1b1fb4021cce258e082a43ae66f96422e1d57e82c0144cb94de"} Apr 22 18:45:23.685690 ip-10-0-129-26 kubenswrapper[2572]: I0422 
18:45:23.685660 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-6dzjk"] Apr 22 18:45:23.690393 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:23.690369 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-6dzjk" Apr 22 18:45:23.692611 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:23.692593 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-6h7rc\"" Apr 22 18:45:23.700442 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:23.700417 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-6dzjk"] Apr 22 18:45:23.844816 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:23.844778 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e4b0f1ed-3a1b-48c7-8d62-dc137f3dcdb3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-6dzjk\" (UID: \"e4b0f1ed-3a1b-48c7-8d62-dc137f3dcdb3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-6dzjk" Apr 22 18:45:23.844968 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:23.844834 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nj7j\" (UniqueName: \"kubernetes.io/projected/e4b0f1ed-3a1b-48c7-8d62-dc137f3dcdb3-kube-api-access-9nj7j\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-6dzjk\" (UID: \"e4b0f1ed-3a1b-48c7-8d62-dc137f3dcdb3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-6dzjk" Apr 22 18:45:23.945855 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:23.945762 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e4b0f1ed-3a1b-48c7-8d62-dc137f3dcdb3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-6dzjk\" (UID: \"e4b0f1ed-3a1b-48c7-8d62-dc137f3dcdb3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-6dzjk" Apr 22 18:45:23.945855 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:23.945836 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nj7j\" (UniqueName: \"kubernetes.io/projected/e4b0f1ed-3a1b-48c7-8d62-dc137f3dcdb3-kube-api-access-9nj7j\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-6dzjk\" (UID: \"e4b0f1ed-3a1b-48c7-8d62-dc137f3dcdb3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-6dzjk" Apr 22 18:45:23.946261 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:23.946209 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e4b0f1ed-3a1b-48c7-8d62-dc137f3dcdb3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-6dzjk\" (UID: \"e4b0f1ed-3a1b-48c7-8d62-dc137f3dcdb3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-6dzjk" Apr 22 18:45:23.964298 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:23.964243 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nj7j\" (UniqueName: \"kubernetes.io/projected/e4b0f1ed-3a1b-48c7-8d62-dc137f3dcdb3-kube-api-access-9nj7j\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-6dzjk\" (UID: \"e4b0f1ed-3a1b-48c7-8d62-dc137f3dcdb3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-6dzjk" Apr 22 18:45:24.001937 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:24.001885 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-6dzjk" Apr 22 18:45:24.163429 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:24.163388 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-6dzjk"] Apr 22 18:45:24.167450 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:45:24.167420 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4b0f1ed_3a1b_48c7_8d62_dc137f3dcdb3.slice/crio-37fdee1d02b50cb33e2cb3f301463663b73fc31920deb553231c1cb09adab8de WatchSource:0}: Error finding container 37fdee1d02b50cb33e2cb3f301463663b73fc31920deb553231c1cb09adab8de: Status 404 returned error can't find the container with id 37fdee1d02b50cb33e2cb3f301463663b73fc31920deb553231c1cb09adab8de Apr 22 18:45:24.301741 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:24.301669 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-6dzjk" event={"ID":"e4b0f1ed-3a1b-48c7-8d62-dc137f3dcdb3","Type":"ContainerStarted","Data":"37fdee1d02b50cb33e2cb3f301463663b73fc31920deb553231c1cb09adab8de"} Apr 22 18:45:26.312061 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:26.312026 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wwkv9" event={"ID":"63cc8843-8d69-4cab-87bc-be1697b84e55","Type":"ContainerStarted","Data":"89a0e3d9ad12240621203129ecea8077fba4e94b02d729699004f7784a123b1e"} Apr 22 18:45:26.312534 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:26.312279 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wwkv9" Apr 22 18:45:26.339305 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:26.339261 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wwkv9" podStartSLOduration=1.383352852 podStartE2EDuration="6.339248749s" podCreationTimestamp="2026-04-22 18:45:20 +0000 UTC" firstStartedPulling="2026-04-22 18:45:20.659480908 +0000 UTC m=+475.263634890" lastFinishedPulling="2026-04-22 18:45:25.615376796 +0000 UTC m=+480.219530787" observedRunningTime="2026-04-22 18:45:26.337141389 +0000 UTC m=+480.941295387" watchObservedRunningTime="2026-04-22 18:45:26.339248749 +0000 UTC m=+480.943402753" Apr 22 18:45:29.323248 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:29.323156 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-6dzjk" event={"ID":"e4b0f1ed-3a1b-48c7-8d62-dc137f3dcdb3","Type":"ContainerStarted","Data":"cf3de8f8d56219acfd66dc131979d83037dc7f59dbdab9dca666b845dc2a636f"} Apr 22 18:45:29.323613 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:29.323282 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-6dzjk" Apr 22 18:45:29.344893 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:29.344839 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-6dzjk" podStartSLOduration=1.67288455 podStartE2EDuration="6.344820638s" podCreationTimestamp="2026-04-22 18:45:23 +0000 UTC" firstStartedPulling="2026-04-22 18:45:24.171050754 +0000 UTC m=+478.775204749" lastFinishedPulling="2026-04-22 18:45:28.842986852 +0000 UTC m=+483.447140837" observedRunningTime="2026-04-22 18:45:29.343043399 +0000 UTC m=+483.947197402" watchObservedRunningTime="2026-04-22 18:45:29.344820638 +0000 UTC m=+483.948974643" Apr 22 18:45:37.318545 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:37.318512 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-wwkv9" Apr 22 18:45:40.328602 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:45:40.328574 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-6dzjk" Apr 22 18:46:16.787753 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:46:16.787678 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-qmgx9"] Apr 22 18:46:16.790737 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:46:16.790719 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-qmgx9" Apr 22 18:46:16.792906 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:46:16.792888 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-rwc46\"" Apr 22 18:46:16.793015 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:46:16.792970 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 22 18:46:16.797701 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:46:16.797479 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-qmgx9"] Apr 22 18:46:16.823319 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:46:16.823292 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-qmgx9"] Apr 22 18:46:16.944663 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:46:16.944634 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpgrj\" (UniqueName: \"kubernetes.io/projected/6c3bc178-8c72-49a8-bffd-ae035016673f-kube-api-access-bpgrj\") pod \"limitador-limitador-67566c68b4-qmgx9\" (UID: \"6c3bc178-8c72-49a8-bffd-ae035016673f\") " pod="kuadrant-system/limitador-limitador-67566c68b4-qmgx9" Apr 
22 18:46:16.944808 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:46:16.944715 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6c3bc178-8c72-49a8-bffd-ae035016673f-config-file\") pod \"limitador-limitador-67566c68b4-qmgx9\" (UID: \"6c3bc178-8c72-49a8-bffd-ae035016673f\") " pod="kuadrant-system/limitador-limitador-67566c68b4-qmgx9" Apr 22 18:46:17.045986 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:46:17.045914 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6c3bc178-8c72-49a8-bffd-ae035016673f-config-file\") pod \"limitador-limitador-67566c68b4-qmgx9\" (UID: \"6c3bc178-8c72-49a8-bffd-ae035016673f\") " pod="kuadrant-system/limitador-limitador-67566c68b4-qmgx9" Apr 22 18:46:17.045986 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:46:17.045976 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpgrj\" (UniqueName: \"kubernetes.io/projected/6c3bc178-8c72-49a8-bffd-ae035016673f-kube-api-access-bpgrj\") pod \"limitador-limitador-67566c68b4-qmgx9\" (UID: \"6c3bc178-8c72-49a8-bffd-ae035016673f\") " pod="kuadrant-system/limitador-limitador-67566c68b4-qmgx9" Apr 22 18:46:17.046497 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:46:17.046481 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6c3bc178-8c72-49a8-bffd-ae035016673f-config-file\") pod \"limitador-limitador-67566c68b4-qmgx9\" (UID: \"6c3bc178-8c72-49a8-bffd-ae035016673f\") " pod="kuadrant-system/limitador-limitador-67566c68b4-qmgx9" Apr 22 18:46:17.056025 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:46:17.056000 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpgrj\" (UniqueName: \"kubernetes.io/projected/6c3bc178-8c72-49a8-bffd-ae035016673f-kube-api-access-bpgrj\") 
pod \"limitador-limitador-67566c68b4-qmgx9\" (UID: \"6c3bc178-8c72-49a8-bffd-ae035016673f\") " pod="kuadrant-system/limitador-limitador-67566c68b4-qmgx9" Apr 22 18:46:17.101069 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:46:17.101044 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-qmgx9" Apr 22 18:46:17.219948 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:46:17.219918 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-qmgx9"] Apr 22 18:46:17.223250 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:46:17.223185 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c3bc178_8c72_49a8_bffd_ae035016673f.slice/crio-84f55e15a9eaf71e0f875977d39ffa96dfb23125491e2ea54beb85105747c629 WatchSource:0}: Error finding container 84f55e15a9eaf71e0f875977d39ffa96dfb23125491e2ea54beb85105747c629: Status 404 returned error can't find the container with id 84f55e15a9eaf71e0f875977d39ffa96dfb23125491e2ea54beb85105747c629 Apr 22 18:46:17.477073 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:46:17.477029 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-qmgx9" event={"ID":"6c3bc178-8c72-49a8-bffd-ae035016673f","Type":"ContainerStarted","Data":"84f55e15a9eaf71e0f875977d39ffa96dfb23125491e2ea54beb85105747c629"} Apr 22 18:46:21.493716 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:46:21.493678 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-qmgx9" event={"ID":"6c3bc178-8c72-49a8-bffd-ae035016673f","Type":"ContainerStarted","Data":"275dc33a250ce90409cde4c6ef59e2695909eb0b4ecb19e2504492ec792e50cf"} Apr 22 18:46:21.494168 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:46:21.493734 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/limitador-limitador-67566c68b4-qmgx9" Apr 22 18:46:21.513876 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:46:21.513831 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-qmgx9" podStartSLOduration=1.717180327 podStartE2EDuration="5.513817787s" podCreationTimestamp="2026-04-22 18:46:16 +0000 UTC" firstStartedPulling="2026-04-22 18:46:17.225086773 +0000 UTC m=+531.829240755" lastFinishedPulling="2026-04-22 18:46:21.02172423 +0000 UTC m=+535.625878215" observedRunningTime="2026-04-22 18:46:21.511517283 +0000 UTC m=+536.115671288" watchObservedRunningTime="2026-04-22 18:46:21.513817787 +0000 UTC m=+536.117971791" Apr 22 18:46:32.498084 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:46:32.498053 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-qmgx9" Apr 22 18:47:25.802648 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:47:25.802614 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6f6sj_2508d1bd-b2cc-4edf-8bd2-34590e71e3e0/console-operator/1.log" Apr 22 18:47:25.804175 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:47:25.804151 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6f6sj_2508d1bd-b2cc-4edf-8bd2-34590e71e3e0/console-operator/1.log" Apr 22 18:47:25.807065 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:47:25.807047 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/ovn-acl-logging/0.log" Apr 22 18:47:25.808361 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:47:25.808343 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/ovn-acl-logging/0.log" Apr 22 18:49:03.790507 ip-10-0-129-26 
kubenswrapper[2572]: I0422 18:49:03.790471 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"]
Apr 22 18:49:03.793680 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:03.793666 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:03.797067 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:03.796996 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 22 18:49:03.797067 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:03.797061 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 18:49:03.797067 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:03.797066 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hsgld\""
Apr 22 18:49:03.797319 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:03.797066 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\""
Apr 22 18:49:03.804176 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:03.804153 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"]
Apr 22 18:49:03.981443 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:03.981416 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwzk9\" (UniqueName: \"kubernetes.io/projected/d47314fd-f20b-42f6-9f06-360a1ce56554-kube-api-access-wwzk9\") pod \"scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:03.981589 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:03.981446 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:03.981589 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:03.981481 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-home\") pod \"scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:03.981589 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:03.981551 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-model-cache\") pod \"scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:03.981589 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:03.981580 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-dshm\") pod \"scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:03.981743 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:03.981598 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d47314fd-f20b-42f6-9f06-360a1ce56554-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:04.082741 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:04.082714 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-home\") pod \"scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:04.082896 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:04.082754 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-model-cache\") pod \"scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:04.082896 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:04.082771 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-dshm\") pod \"scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:04.082896 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:04.082789 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d47314fd-f20b-42f6-9f06-360a1ce56554-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:04.082896 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:04.082836 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwzk9\" (UniqueName: \"kubernetes.io/projected/d47314fd-f20b-42f6-9f06-360a1ce56554-kube-api-access-wwzk9\") pod \"scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:04.082896 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:04.082852 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:04.083173 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:04.083065 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-home\") pod \"scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:04.083173 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:04.083154 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-model-cache\") pod \"scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:04.083274 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:04.083259 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:04.085063 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:04.085043 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-dshm\") pod \"scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:04.085433 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:04.085416 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d47314fd-f20b-42f6-9f06-360a1ce56554-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:04.092614 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:04.092589 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwzk9\" (UniqueName: \"kubernetes.io/projected/d47314fd-f20b-42f6-9f06-360a1ce56554-kube-api-access-wwzk9\") pod \"scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:04.106258 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:04.106242 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:04.226261 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:04.226237 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"]
Apr 22 18:49:04.228779 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:49:04.228752 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd47314fd_f20b_42f6_9f06_360a1ce56554.slice/crio-4b49721ba9f36eb9be427e3b6f781d8fdb4b793212a1672f1e4336439308c692 WatchSource:0}: Error finding container 4b49721ba9f36eb9be427e3b6f781d8fdb4b793212a1672f1e4336439308c692: Status 404 returned error can't find the container with id 4b49721ba9f36eb9be427e3b6f781d8fdb4b793212a1672f1e4336439308c692
Apr 22 18:49:04.230534 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:04.230517 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:49:05.025867 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:05.025825 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7" event={"ID":"d47314fd-f20b-42f6-9f06-360a1ce56554","Type":"ContainerStarted","Data":"4b49721ba9f36eb9be427e3b6f781d8fdb4b793212a1672f1e4336439308c692"}
Apr 22 18:49:09.042934 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:09.042895 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7" event={"ID":"d47314fd-f20b-42f6-9f06-360a1ce56554","Type":"ContainerStarted","Data":"76fd3e2bc7fe843f8d950b73f022ea8e73155541de95f9897fe99b5261806bf1"}
Apr 22 18:49:13.058845 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:13.058810 2572 generic.go:358] "Generic (PLEG): container finished" podID="d47314fd-f20b-42f6-9f06-360a1ce56554" containerID="76fd3e2bc7fe843f8d950b73f022ea8e73155541de95f9897fe99b5261806bf1" exitCode=0
Apr 22 18:49:13.059239 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:13.058855 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7" event={"ID":"d47314fd-f20b-42f6-9f06-360a1ce56554","Type":"ContainerDied","Data":"76fd3e2bc7fe843f8d950b73f022ea8e73155541de95f9897fe99b5261806bf1"}
Apr 22 18:49:15.070288 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:15.070244 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7" event={"ID":"d47314fd-f20b-42f6-9f06-360a1ce56554","Type":"ContainerStarted","Data":"79e089628cb8573d5c804509071ce61d63cdfff8d0bbd13934a1c156ed836f2c"}
Apr 22 18:49:15.091665 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:15.091611 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7" podStartSLOduration=1.957899684 podStartE2EDuration="12.091597229s" podCreationTimestamp="2026-04-22 18:49:03 +0000 UTC" firstStartedPulling="2026-04-22 18:49:04.230644882 +0000 UTC m=+698.834798864" lastFinishedPulling="2026-04-22 18:49:14.364342425 +0000 UTC m=+708.968496409" observedRunningTime="2026-04-22 18:49:15.08969757 +0000 UTC m=+709.693851573" watchObservedRunningTime="2026-04-22 18:49:15.091597229 +0000 UTC m=+709.695751232"
Apr 22 18:49:24.107111 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:24.107075 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:24.107111 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:24.107118 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:24.119774 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:24.119746 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:49:25.112601 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:49:25.112570 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:51:59.809173 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:51:59.803905 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"]
Apr 22 18:51:59.809173 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:51:59.804382 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7" podUID="d47314fd-f20b-42f6-9f06-360a1ce56554" containerName="main" containerID="cri-o://79e089628cb8573d5c804509071ce61d63cdfff8d0bbd13934a1c156ed836f2c" gracePeriod=30
Apr 22 18:52:00.047427 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.047406 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:52:00.102804 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.102777 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-kserve-provision-location\") pod \"d47314fd-f20b-42f6-9f06-360a1ce56554\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") "
Apr 22 18:52:00.102990 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.102832 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwzk9\" (UniqueName: \"kubernetes.io/projected/d47314fd-f20b-42f6-9f06-360a1ce56554-kube-api-access-wwzk9\") pod \"d47314fd-f20b-42f6-9f06-360a1ce56554\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") "
Apr 22 18:52:00.102990 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.102864 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-model-cache\") pod \"d47314fd-f20b-42f6-9f06-360a1ce56554\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") "
Apr 22 18:52:00.102990 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.102899 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d47314fd-f20b-42f6-9f06-360a1ce56554-tls-certs\") pod \"d47314fd-f20b-42f6-9f06-360a1ce56554\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") "
Apr 22 18:52:00.102990 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.102960 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-dshm\") pod \"d47314fd-f20b-42f6-9f06-360a1ce56554\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") "
Apr 22 18:52:00.103225 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.102993 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-home\") pod \"d47314fd-f20b-42f6-9f06-360a1ce56554\" (UID: \"d47314fd-f20b-42f6-9f06-360a1ce56554\") "
Apr 22 18:52:00.103225 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.103162 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-model-cache" (OuterVolumeSpecName: "model-cache") pod "d47314fd-f20b-42f6-9f06-360a1ce56554" (UID: "d47314fd-f20b-42f6-9f06-360a1ce56554"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:52:00.103327 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.103258 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-model-cache\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\""
Apr 22 18:52:00.103401 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.103380 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-home" (OuterVolumeSpecName: "home") pod "d47314fd-f20b-42f6-9f06-360a1ce56554" (UID: "d47314fd-f20b-42f6-9f06-360a1ce56554"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:52:00.105794 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.105766 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-dshm" (OuterVolumeSpecName: "dshm") pod "d47314fd-f20b-42f6-9f06-360a1ce56554" (UID: "d47314fd-f20b-42f6-9f06-360a1ce56554"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:52:00.105918 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.105780 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d47314fd-f20b-42f6-9f06-360a1ce56554-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d47314fd-f20b-42f6-9f06-360a1ce56554" (UID: "d47314fd-f20b-42f6-9f06-360a1ce56554"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:52:00.105918 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.105864 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d47314fd-f20b-42f6-9f06-360a1ce56554-kube-api-access-wwzk9" (OuterVolumeSpecName: "kube-api-access-wwzk9") pod "d47314fd-f20b-42f6-9f06-360a1ce56554" (UID: "d47314fd-f20b-42f6-9f06-360a1ce56554"). InnerVolumeSpecName "kube-api-access-wwzk9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:52:00.158809 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.158772 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d47314fd-f20b-42f6-9f06-360a1ce56554" (UID: "d47314fd-f20b-42f6-9f06-360a1ce56554"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:52:00.204064 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.204040 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-home\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\""
Apr 22 18:52:00.204064 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.204063 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-kserve-provision-location\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\""
Apr 22 18:52:00.204237 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.204074 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wwzk9\" (UniqueName: \"kubernetes.io/projected/d47314fd-f20b-42f6-9f06-360a1ce56554-kube-api-access-wwzk9\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\""
Apr 22 18:52:00.204237 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.204084 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d47314fd-f20b-42f6-9f06-360a1ce56554-tls-certs\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\""
Apr 22 18:52:00.204237 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.204093 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d47314fd-f20b-42f6-9f06-360a1ce56554-dshm\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\""
Apr 22 18:52:00.605709 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.605676 2572 generic.go:358] "Generic (PLEG): container finished" podID="d47314fd-f20b-42f6-9f06-360a1ce56554" containerID="79e089628cb8573d5c804509071ce61d63cdfff8d0bbd13934a1c156ed836f2c" exitCode=0
Apr 22 18:52:00.605855 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.605743 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"
Apr 22 18:52:00.605855 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.605742 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7" event={"ID":"d47314fd-f20b-42f6-9f06-360a1ce56554","Type":"ContainerDied","Data":"79e089628cb8573d5c804509071ce61d63cdfff8d0bbd13934a1c156ed836f2c"}
Apr 22 18:52:00.605855 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.605846 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7" event={"ID":"d47314fd-f20b-42f6-9f06-360a1ce56554","Type":"ContainerDied","Data":"4b49721ba9f36eb9be427e3b6f781d8fdb4b793212a1672f1e4336439308c692"}
Apr 22 18:52:00.605968 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.605861 2572 scope.go:117] "RemoveContainer" containerID="79e089628cb8573d5c804509071ce61d63cdfff8d0bbd13934a1c156ed836f2c"
Apr 22 18:52:00.614981 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.614966 2572 scope.go:117] "RemoveContainer" containerID="76fd3e2bc7fe843f8d950b73f022ea8e73155541de95f9897fe99b5261806bf1"
Apr 22 18:52:00.629543 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.629527 2572 scope.go:117] "RemoveContainer" containerID="79e089628cb8573d5c804509071ce61d63cdfff8d0bbd13934a1c156ed836f2c"
Apr 22 18:52:00.629824 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:52:00.629792 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79e089628cb8573d5c804509071ce61d63cdfff8d0bbd13934a1c156ed836f2c\": container with ID starting with 79e089628cb8573d5c804509071ce61d63cdfff8d0bbd13934a1c156ed836f2c not found: ID does not exist" containerID="79e089628cb8573d5c804509071ce61d63cdfff8d0bbd13934a1c156ed836f2c"
Apr 22 18:52:00.629874 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.629830 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79e089628cb8573d5c804509071ce61d63cdfff8d0bbd13934a1c156ed836f2c"} err="failed to get container status \"79e089628cb8573d5c804509071ce61d63cdfff8d0bbd13934a1c156ed836f2c\": rpc error: code = NotFound desc = could not find container \"79e089628cb8573d5c804509071ce61d63cdfff8d0bbd13934a1c156ed836f2c\": container with ID starting with 79e089628cb8573d5c804509071ce61d63cdfff8d0bbd13934a1c156ed836f2c not found: ID does not exist"
Apr 22 18:52:00.629874 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.629840 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"]
Apr 22 18:52:00.629874 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.629847 2572 scope.go:117] "RemoveContainer" containerID="76fd3e2bc7fe843f8d950b73f022ea8e73155541de95f9897fe99b5261806bf1"
Apr 22 18:52:00.630094 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:52:00.630080 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76fd3e2bc7fe843f8d950b73f022ea8e73155541de95f9897fe99b5261806bf1\": container with ID starting with 76fd3e2bc7fe843f8d950b73f022ea8e73155541de95f9897fe99b5261806bf1 not found: ID does not exist" containerID="76fd3e2bc7fe843f8d950b73f022ea8e73155541de95f9897fe99b5261806bf1"
Apr 22 18:52:00.630148 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.630098 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76fd3e2bc7fe843f8d950b73f022ea8e73155541de95f9897fe99b5261806bf1"} err="failed to get container status \"76fd3e2bc7fe843f8d950b73f022ea8e73155541de95f9897fe99b5261806bf1\": rpc error: code = NotFound desc = could not find container \"76fd3e2bc7fe843f8d950b73f022ea8e73155541de95f9897fe99b5261806bf1\": container with ID starting with 76fd3e2bc7fe843f8d950b73f022ea8e73155541de95f9897fe99b5261806bf1 not found: ID does not exist"
Apr 22 18:52:00.633586 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:00.633561 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-c9b66b987-8v4x7"]
Apr 22 18:52:01.871072 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:01.871039 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d47314fd-f20b-42f6-9f06-360a1ce56554" path="/var/lib/kubelet/pods/d47314fd-f20b-42f6-9f06-360a1ce56554/volumes"
Apr 22 18:52:07.401730 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.401700 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"]
Apr 22 18:52:07.402209 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.402023 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d47314fd-f20b-42f6-9f06-360a1ce56554" containerName="storage-initializer"
Apr 22 18:52:07.402209 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.402034 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d47314fd-f20b-42f6-9f06-360a1ce56554" containerName="storage-initializer"
Apr 22 18:52:07.402209 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.402046 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d47314fd-f20b-42f6-9f06-360a1ce56554" containerName="main"
Apr 22 18:52:07.402209 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.402052 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d47314fd-f20b-42f6-9f06-360a1ce56554" containerName="main"
Apr 22 18:52:07.402209 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.402094 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d47314fd-f20b-42f6-9f06-360a1ce56554" containerName="main"
Apr 22 18:52:07.404965 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.404946 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"
Apr 22 18:52:07.407427 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.407402 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 22 18:52:07.408321 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.408299 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 22 18:52:07.408428 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.408362 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hsgld\""
Apr 22 18:52:07.408428 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.408365 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 18:52:07.416342 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.416321 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"]
Apr 22 18:52:07.455603 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.455572 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-dshm\") pod \"scheduler-ha-replicas-test-kserve-6b99756559-m9f27\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"
Apr 22 18:52:07.455714 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.455631 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e245cf85-9dc4-4765-8036-346dfd62fb62-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-6b99756559-m9f27\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"
Apr 22 18:52:07.455714 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.455675 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-model-cache\") pod \"scheduler-ha-replicas-test-kserve-6b99756559-m9f27\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"
Apr 22 18:52:07.455788 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.455710 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-6b99756559-m9f27\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"
Apr 22 18:52:07.455788 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.455754 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d88ph\" (UniqueName: \"kubernetes.io/projected/e245cf85-9dc4-4765-8036-346dfd62fb62-kube-api-access-d88ph\") pod \"scheduler-ha-replicas-test-kserve-6b99756559-m9f27\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"
Apr 22 18:52:07.455788 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.455772 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-home\") pod \"scheduler-ha-replicas-test-kserve-6b99756559-m9f27\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"
Apr 22 18:52:07.556704 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.556663 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e245cf85-9dc4-4765-8036-346dfd62fb62-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-6b99756559-m9f27\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"
Apr 22 18:52:07.556704 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.556715 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-model-cache\") pod \"scheduler-ha-replicas-test-kserve-6b99756559-m9f27\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"
Apr 22 18:52:07.556899 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.556755 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-6b99756559-m9f27\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"
Apr 22 18:52:07.556899 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.556778 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d88ph\" (UniqueName: \"kubernetes.io/projected/e245cf85-9dc4-4765-8036-346dfd62fb62-kube-api-access-d88ph\") pod \"scheduler-ha-replicas-test-kserve-6b99756559-m9f27\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"
Apr 22 18:52:07.556899 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.556798 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-home\") pod \"scheduler-ha-replicas-test-kserve-6b99756559-m9f27\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"
Apr 22 18:52:07.556899 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.556828 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-dshm\") pod \"scheduler-ha-replicas-test-kserve-6b99756559-m9f27\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"
Apr 22 18:52:07.557293 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.557259 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-6b99756559-m9f27\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"
Apr 22 18:52:07.557293 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.557281 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-home\") pod \"scheduler-ha-replicas-test-kserve-6b99756559-m9f27\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"
Apr 22 18:52:07.557440 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.557266 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-model-cache\") pod \"scheduler-ha-replicas-test-kserve-6b99756559-m9f27\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"
Apr 22 18:52:07.559219 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.559190 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-dshm\") pod \"scheduler-ha-replicas-test-kserve-6b99756559-m9f27\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"
Apr 22 18:52:07.559445 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.559427 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e245cf85-9dc4-4765-8036-346dfd62fb62-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-6b99756559-m9f27\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"
Apr 22 18:52:07.565791 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.565769 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d88ph\" (UniqueName: \"kubernetes.io/projected/e245cf85-9dc4-4765-8036-346dfd62fb62-kube-api-access-d88ph\") pod \"scheduler-ha-replicas-test-kserve-6b99756559-m9f27\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"
Apr 22 18:52:07.717147 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.717038 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"
Apr 22 18:52:07.724426 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.724399 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9"]
Apr 22 18:52:07.727703 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.727679 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9"
Apr 22 18:52:07.730019 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.729997 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-9ld8c\""
Apr 22 18:52:07.738019 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.737916 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9"]
Apr 22 18:52:07.758571 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.758544 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9"
Apr 22 18:52:07.758717 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.758588 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") "
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:52:07.758717 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.758648 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsqh4\" (UniqueName: \"kubernetes.io/projected/09a62540-6276-46a8-a27f-34f866de3566-kube-api-access-jsqh4\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:52:07.758717 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.758680 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:52:07.758717 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.758710 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:52:07.758920 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.758825 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/09a62540-6276-46a8-a27f-34f866de3566-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9\" (UID: 
\"09a62540-6276-46a8-a27f-34f866de3566\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:52:07.844095 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.844072 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"] Apr 22 18:52:07.846047 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:52:07.846014 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode245cf85_9dc4_4765_8036_346dfd62fb62.slice/crio-913294b725b3bbb320726992cc48b0490ac6ff0a9a7f698673861d7980c4f360 WatchSource:0}: Error finding container 913294b725b3bbb320726992cc48b0490ac6ff0a9a7f698673861d7980c4f360: Status 404 returned error can't find the container with id 913294b725b3bbb320726992cc48b0490ac6ff0a9a7f698673861d7980c4f360 Apr 22 18:52:07.859231 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.859203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:52:07.859362 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.859249 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jsqh4\" (UniqueName: \"kubernetes.io/projected/09a62540-6276-46a8-a27f-34f866de3566-kube-api-access-jsqh4\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:52:07.859362 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.859279 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:52:07.859362 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.859303 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:52:07.859525 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.859388 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/09a62540-6276-46a8-a27f-34f866de3566-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:52:07.859525 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.859425 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:52:07.859658 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.859636 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:52:07.859713 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.859680 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:52:07.859713 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.859704 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:52:07.859850 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.859771 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:52:07.861903 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.861882 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/09a62540-6276-46a8-a27f-34f866de3566-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:52:07.868113 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:07.868091 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsqh4\" (UniqueName: \"kubernetes.io/projected/09a62540-6276-46a8-a27f-34f866de3566-kube-api-access-jsqh4\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:52:08.047250 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:08.047170 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:52:08.173177 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:08.173145 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9"] Apr 22 18:52:08.176445 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:52:08.176419 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09a62540_6276_46a8_a27f_34f866de3566.slice/crio-80428867cbc6f883166bb3f79124bfb13fdde0d49736d6fc28014d0da87e7123 WatchSource:0}: Error finding container 80428867cbc6f883166bb3f79124bfb13fdde0d49736d6fc28014d0da87e7123: Status 404 returned error can't find the container with id 80428867cbc6f883166bb3f79124bfb13fdde0d49736d6fc28014d0da87e7123 Apr 22 18:52:08.635023 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:08.634986 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27" event={"ID":"e245cf85-9dc4-4765-8036-346dfd62fb62","Type":"ContainerStarted","Data":"1a4072525bc27981cfbeda95a0772c67cb7ffc9f4b287a488eeb9b10aac4fae7"} Apr 22 18:52:08.635495 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:08.635048 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27" event={"ID":"e245cf85-9dc4-4765-8036-346dfd62fb62","Type":"ContainerStarted","Data":"913294b725b3bbb320726992cc48b0490ac6ff0a9a7f698673861d7980c4f360"} Apr 22 18:52:08.636533 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:08.636509 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" event={"ID":"09a62540-6276-46a8-a27f-34f866de3566","Type":"ContainerStarted","Data":"000e8f81a09f8fe3384575b221b1b52ace63daa8a0b4598128dca38b110eacff"} Apr 22 18:52:08.636533 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:08.636537 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" event={"ID":"09a62540-6276-46a8-a27f-34f866de3566","Type":"ContainerStarted","Data":"80428867cbc6f883166bb3f79124bfb13fdde0d49736d6fc28014d0da87e7123"} Apr 22 18:52:09.642145 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:09.642080 2572 generic.go:358] "Generic (PLEG): container finished" podID="09a62540-6276-46a8-a27f-34f866de3566" containerID="000e8f81a09f8fe3384575b221b1b52ace63daa8a0b4598128dca38b110eacff" exitCode=0 Apr 22 18:52:09.642612 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:09.642190 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" event={"ID":"09a62540-6276-46a8-a27f-34f866de3566","Type":"ContainerDied","Data":"000e8f81a09f8fe3384575b221b1b52ace63daa8a0b4598128dca38b110eacff"} Apr 
22 18:52:11.651848 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:11.651807 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" event={"ID":"09a62540-6276-46a8-a27f-34f866de3566","Type":"ContainerStarted","Data":"28ce75ea12b162858066ce0d90b884316aed4b329cd9c1e0873d7fc6f71bcd38"} Apr 22 18:52:12.657595 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:12.657554 2572 generic.go:358] "Generic (PLEG): container finished" podID="e245cf85-9dc4-4765-8036-346dfd62fb62" containerID="1a4072525bc27981cfbeda95a0772c67cb7ffc9f4b287a488eeb9b10aac4fae7" exitCode=0 Apr 22 18:52:12.657595 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:12.657588 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27" event={"ID":"e245cf85-9dc4-4765-8036-346dfd62fb62","Type":"ContainerDied","Data":"1a4072525bc27981cfbeda95a0772c67cb7ffc9f4b287a488eeb9b10aac4fae7"} Apr 22 18:52:13.663379 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:13.663328 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27" event={"ID":"e245cf85-9dc4-4765-8036-346dfd62fb62","Type":"ContainerStarted","Data":"03003044ca5be0974f16b558d1efa3294b9ce7d2a9611cfbea6f92d7b6c673b0"} Apr 22 18:52:13.683920 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:13.683865 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27" podStartSLOduration=6.683848843 podStartE2EDuration="6.683848843s" podCreationTimestamp="2026-04-22 18:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:52:13.682191158 +0000 UTC m=+888.286345163" watchObservedRunningTime="2026-04-22 18:52:13.683848843 +0000 UTC m=+888.288002847" Apr 22 
18:52:17.717481 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:17.717396 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27" Apr 22 18:52:17.717481 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:17.717441 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27" Apr 22 18:52:17.732708 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:17.732681 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27" Apr 22 18:52:18.694377 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:18.694348 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27" Apr 22 18:52:40.151636 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.151611 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6f6sj_2508d1bd-b2cc-4edf-8bd2-34590e71e3e0/console-operator/1.log" Apr 22 18:52:40.152010 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.151987 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6f6sj_2508d1bd-b2cc-4edf-8bd2-34590e71e3e0/console-operator/1.log" Apr 22 18:52:40.156245 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.156222 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/ovn-acl-logging/0.log" Apr 22 18:52:40.156430 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.156413 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/ovn-acl-logging/0.log" Apr 22 18:52:40.601641 ip-10-0-129-26 
kubenswrapper[2572]: I0422 18:52:40.601607 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9"] Apr 22 18:52:40.603783 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.603758 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"] Apr 22 18:52:40.604195 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.604115 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27" podUID="e245cf85-9dc4-4765-8036-346dfd62fb62" containerName="main" containerID="cri-o://03003044ca5be0974f16b558d1efa3294b9ce7d2a9611cfbea6f92d7b6c673b0" gracePeriod=30 Apr 22 18:52:40.764641 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.764602 2572 generic.go:358] "Generic (PLEG): container finished" podID="e245cf85-9dc4-4765-8036-346dfd62fb62" containerID="03003044ca5be0974f16b558d1efa3294b9ce7d2a9611cfbea6f92d7b6c673b0" exitCode=0 Apr 22 18:52:40.764810 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.764641 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27" event={"ID":"e245cf85-9dc4-4765-8036-346dfd62fb62","Type":"ContainerDied","Data":"03003044ca5be0974f16b558d1efa3294b9ce7d2a9611cfbea6f92d7b6c673b0"} Apr 22 18:52:40.766696 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.766672 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" event={"ID":"09a62540-6276-46a8-a27f-34f866de3566","Type":"ContainerStarted","Data":"b0bd7efa72cfa48a3043c629604b425f7df502cabb96ee10f91bdf46d23e76fa"} Apr 22 18:52:40.767002 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.766967 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:52:40.767116 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.767038 2572 kubelet_pods.go:1019] "Unable to retrieve pull secret, the image pull may not succeed." pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" secret="" err="secret \"scheduler-ha-replicas-test-epp-sa-dockercfg-9ld8c\" not found" Apr 22 18:52:40.769887 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.769860 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" podUID="09a62540-6276-46a8-a27f-34f866de3566" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 18:52:40.788530 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.788484 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" podStartSLOduration=3.205529089 podStartE2EDuration="33.788472638s" podCreationTimestamp="2026-04-22 18:52:07 +0000 UTC" firstStartedPulling="2026-04-22 18:52:09.643648957 +0000 UTC m=+884.247802950" lastFinishedPulling="2026-04-22 18:52:40.226592516 +0000 UTC m=+914.830746499" observedRunningTime="2026-04-22 18:52:40.785623124 +0000 UTC m=+915.389777128" watchObservedRunningTime="2026-04-22 18:52:40.788472638 +0000 UTC m=+915.392626636" Apr 22 18:52:40.859242 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.859181 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27" Apr 22 18:52:40.864302 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.864281 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-dshm\") pod \"e245cf85-9dc4-4765-8036-346dfd62fb62\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " Apr 22 18:52:40.864430 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.864331 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-model-cache\") pod \"e245cf85-9dc4-4765-8036-346dfd62fb62\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " Apr 22 18:52:40.864430 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.864354 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-home\") pod \"e245cf85-9dc4-4765-8036-346dfd62fb62\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " Apr 22 18:52:40.864430 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.864390 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-kserve-provision-location\") pod \"e245cf85-9dc4-4765-8036-346dfd62fb62\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " Apr 22 18:52:40.864430 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.864420 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d88ph\" (UniqueName: \"kubernetes.io/projected/e245cf85-9dc4-4765-8036-346dfd62fb62-kube-api-access-d88ph\") pod \"e245cf85-9dc4-4765-8036-346dfd62fb62\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " Apr 22 18:52:40.864636 ip-10-0-129-26 
kubenswrapper[2572]: I0422 18:52:40.864457 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e245cf85-9dc4-4765-8036-346dfd62fb62-tls-certs\") pod \"e245cf85-9dc4-4765-8036-346dfd62fb62\" (UID: \"e245cf85-9dc4-4765-8036-346dfd62fb62\") " Apr 22 18:52:40.864700 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.864654 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-model-cache" (OuterVolumeSpecName: "model-cache") pod "e245cf85-9dc4-4765-8036-346dfd62fb62" (UID: "e245cf85-9dc4-4765-8036-346dfd62fb62"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:52:40.865183 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.864705 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-home" (OuterVolumeSpecName: "home") pod "e245cf85-9dc4-4765-8036-346dfd62fb62" (UID: "e245cf85-9dc4-4765-8036-346dfd62fb62"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:52:40.865269 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:52:40.864798 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 18:52:40.865269 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:52:40.865255 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09a62540-6276-46a8-a27f-34f866de3566-tls-certs podName:09a62540-6276-46a8-a27f-34f866de3566 nodeName:}" failed. No retries permitted until 2026-04-22 18:52:41.365233752 +0000 UTC m=+915.969387733 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/09a62540-6276-46a8-a27f-34f866de3566-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" (UID: "09a62540-6276-46a8-a27f-34f866de3566") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 18:52:40.866703 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.866674 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-dshm" (OuterVolumeSpecName: "dshm") pod "e245cf85-9dc4-4765-8036-346dfd62fb62" (UID: "e245cf85-9dc4-4765-8036-346dfd62fb62"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:52:40.866953 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.866921 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e245cf85-9dc4-4765-8036-346dfd62fb62-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e245cf85-9dc4-4765-8036-346dfd62fb62" (UID: "e245cf85-9dc4-4765-8036-346dfd62fb62"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:52:40.867189 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.867172 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e245cf85-9dc4-4765-8036-346dfd62fb62-kube-api-access-d88ph" (OuterVolumeSpecName: "kube-api-access-d88ph") pod "e245cf85-9dc4-4765-8036-346dfd62fb62" (UID: "e245cf85-9dc4-4765-8036-346dfd62fb62"). InnerVolumeSpecName "kube-api-access-d88ph". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:52:40.922893 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.922848 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e245cf85-9dc4-4765-8036-346dfd62fb62" (UID: "e245cf85-9dc4-4765-8036-346dfd62fb62"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:52:40.965828 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.965806 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e245cf85-9dc4-4765-8036-346dfd62fb62-tls-certs\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:52:40.965828 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.965829 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-dshm\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:52:40.965952 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.965837 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-model-cache\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:52:40.965952 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.965845 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-home\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:52:40.965952 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.965853 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/e245cf85-9dc4-4765-8036-346dfd62fb62-kserve-provision-location\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:52:40.965952 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:40.965861 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d88ph\" (UniqueName: \"kubernetes.io/projected/e245cf85-9dc4-4765-8036-346dfd62fb62-kube-api-access-d88ph\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:52:41.370049 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:52:41.370016 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 18:52:41.370403 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:52:41.370089 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09a62540-6276-46a8-a27f-34f866de3566-tls-certs podName:09a62540-6276-46a8-a27f-34f866de3566 nodeName:}" failed. No retries permitted until 2026-04-22 18:52:42.370074576 +0000 UTC m=+916.974228558 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/09a62540-6276-46a8-a27f-34f866de3566-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" (UID: "09a62540-6276-46a8-a27f-34f866de3566") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 18:52:41.771075 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:41.770984 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27" Apr 22 18:52:41.771075 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:41.771002 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27" event={"ID":"e245cf85-9dc4-4765-8036-346dfd62fb62","Type":"ContainerDied","Data":"913294b725b3bbb320726992cc48b0490ac6ff0a9a7f698673861d7980c4f360"} Apr 22 18:52:41.771075 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:41.771048 2572 scope.go:117] "RemoveContainer" containerID="03003044ca5be0974f16b558d1efa3294b9ce7d2a9611cfbea6f92d7b6c673b0" Apr 22 18:52:41.771327 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:41.771170 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" podUID="09a62540-6276-46a8-a27f-34f866de3566" containerName="main" containerID="cri-o://28ce75ea12b162858066ce0d90b884316aed4b329cd9c1e0873d7fc6f71bcd38" gracePeriod=30 Apr 22 18:52:41.771327 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:41.771210 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" podUID="09a62540-6276-46a8-a27f-34f866de3566" containerName="tokenizer" containerID="cri-o://b0bd7efa72cfa48a3043c629604b425f7df502cabb96ee10f91bdf46d23e76fa" gracePeriod=30 Apr 22 18:52:41.781526 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:41.781494 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" podUID="09a62540-6276-46a8-a27f-34f866de3566" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 18:52:41.789696 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:41.789675 2572 scope.go:117] "RemoveContainer" 
containerID="1a4072525bc27981cfbeda95a0772c67cb7ffc9f4b287a488eeb9b10aac4fae7" Apr 22 18:52:41.798747 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:41.798725 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"] Apr 22 18:52:41.801848 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:41.801825 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6b99756559-m9f27"] Apr 22 18:52:41.868165 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:41.868108 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e245cf85-9dc4-4765-8036-346dfd62fb62" path="/var/lib/kubelet/pods/e245cf85-9dc4-4765-8036-346dfd62fb62/volumes" Apr 22 18:52:42.380320 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:52:42.380294 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 18:52:42.380664 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:52:42.380370 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09a62540-6276-46a8-a27f-34f866de3566-tls-certs podName:09a62540-6276-46a8-a27f-34f866de3566 nodeName:}" failed. No retries permitted until 2026-04-22 18:52:44.380354693 +0000 UTC m=+918.984508678 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/09a62540-6276-46a8-a27f-34f866de3566-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" (UID: "09a62540-6276-46a8-a27f-34f866de3566") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 18:52:42.776764 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:42.776693 2572 generic.go:358] "Generic (PLEG): container finished" podID="09a62540-6276-46a8-a27f-34f866de3566" containerID="28ce75ea12b162858066ce0d90b884316aed4b329cd9c1e0873d7fc6f71bcd38" exitCode=0 Apr 22 18:52:42.776764 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:42.776748 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" event={"ID":"09a62540-6276-46a8-a27f-34f866de3566","Type":"ContainerDied","Data":"28ce75ea12b162858066ce0d90b884316aed4b329cd9c1e0873d7fc6f71bcd38"} Apr 22 18:52:44.398199 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:52:44.398165 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 18:52:44.398624 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:52:44.398233 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09a62540-6276-46a8-a27f-34f866de3566-tls-certs podName:09a62540-6276-46a8-a27f-34f866de3566 nodeName:}" failed. No retries permitted until 2026-04-22 18:52:48.398218271 +0000 UTC m=+923.002372252 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/09a62540-6276-46a8-a27f-34f866de3566-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" (UID: "09a62540-6276-46a8-a27f-34f866de3566") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 18:52:48.047383 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:48.047347 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:52:48.435219 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:52:48.435192 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 18:52:48.435369 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:52:48.435271 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09a62540-6276-46a8-a27f-34f866de3566-tls-certs podName:09a62540-6276-46a8-a27f-34f866de3566 nodeName:}" failed. No retries permitted until 2026-04-22 18:52:56.43525516 +0000 UTC m=+931.039409147 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/09a62540-6276-46a8-a27f-34f866de3566-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" (UID: "09a62540-6276-46a8-a27f-34f866de3566") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 18:52:49.911257 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:49.911223 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp"] Apr 22 18:52:49.911606 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:49.911569 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e245cf85-9dc4-4765-8036-346dfd62fb62" containerName="storage-initializer" Apr 22 18:52:49.911606 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:49.911580 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e245cf85-9dc4-4765-8036-346dfd62fb62" containerName="storage-initializer" Apr 22 18:52:49.911606 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:49.911589 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e245cf85-9dc4-4765-8036-346dfd62fb62" containerName="main" Apr 22 18:52:49.911606 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:49.911594 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e245cf85-9dc4-4765-8036-346dfd62fb62" containerName="main" Apr 22 18:52:49.911734 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:49.911660 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e245cf85-9dc4-4765-8036-346dfd62fb62" containerName="main" Apr 22 18:52:49.919605 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:49.919585 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:52:49.921749 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:49.921727 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 22 18:52:49.922610 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:49.922586 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp"] Apr 22 18:52:50.047543 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:50.047511 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-model-cache\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:52:50.047690 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:50.047548 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-dshm\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:52:50.047690 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:50.047573 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05b61c70-2187-40ea-ad4f-7797fa18d564-tls-certs\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:52:50.047690 ip-10-0-129-26 
kubenswrapper[2572]: I0422 18:52:50.047601 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:52:50.047690 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:50.047640 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-home\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:52:50.047827 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:50.047728 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd29j\" (UniqueName: \"kubernetes.io/projected/05b61c70-2187-40ea-ad4f-7797fa18d564-kube-api-access-hd29j\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:52:50.148485 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:50.148457 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-home\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:52:50.148593 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:50.148502 2572 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-hd29j\" (UniqueName: \"kubernetes.io/projected/05b61c70-2187-40ea-ad4f-7797fa18d564-kube-api-access-hd29j\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:52:50.148593 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:50.148539 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-model-cache\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:52:50.148593 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:50.148559 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-dshm\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:52:50.148593 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:50.148574 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05b61c70-2187-40ea-ad4f-7797fa18d564-tls-certs\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:52:50.148593 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:50.148590 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-kserve-provision-location\") pod 
\"precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:52:50.148899 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:50.148877 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-home\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:52:50.149033 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:50.149008 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-model-cache\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:52:50.149149 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:50.149034 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:52:50.151158 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:50.151112 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-dshm\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:52:50.151445 ip-10-0-129-26 
kubenswrapper[2572]: I0422 18:52:50.151424 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05b61c70-2187-40ea-ad4f-7797fa18d564-tls-certs\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:52:50.161194 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:50.161113 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd29j\" (UniqueName: \"kubernetes.io/projected/05b61c70-2187-40ea-ad4f-7797fa18d564-kube-api-access-hd29j\") pod \"precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:52:50.230585 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:50.230511 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:52:50.350712 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:50.350541 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp"] Apr 22 18:52:50.353411 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:52:50.353382 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05b61c70_2187_40ea_ad4f_7797fa18d564.slice/crio-a692f6ae7160c3879da2d1d6f0604e52c561c71ec868e9637b91d2cdfd761c3e WatchSource:0}: Error finding container a692f6ae7160c3879da2d1d6f0604e52c561c71ec868e9637b91d2cdfd761c3e: Status 404 returned error can't find the container with id a692f6ae7160c3879da2d1d6f0604e52c561c71ec868e9637b91d2cdfd761c3e Apr 22 18:52:50.805622 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:50.805581 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" event={"ID":"05b61c70-2187-40ea-ad4f-7797fa18d564","Type":"ContainerStarted","Data":"46768acba1da8f3baa2d2b802aae6d0fd6383da871e279a86a366d475cb65572"} Apr 22 18:52:50.805622 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:50.805622 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" event={"ID":"05b61c70-2187-40ea-ad4f-7797fa18d564","Type":"ContainerStarted","Data":"a692f6ae7160c3879da2d1d6f0604e52c561c71ec868e9637b91d2cdfd761c3e"} Apr 22 18:52:51.771889 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:52:51.771837 2572 logging.go:55] [core] [Channel #23 SubChannel #24]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.26:9003", ServerName: "10.132.0.26:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.26:9003: connect: connection refused" Apr 22 18:52:52.772075 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:52.772037 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" podUID="09a62540-6276-46a8-a27f-34f866de3566" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.26:9003\" within 1s: context deadline exceeded" Apr 22 18:52:52.772478 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:52:52.772201 2572 logging.go:55] [core] [Channel #23 SubChannel #24]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.26:9003", ServerName: "10.132.0.26:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.26:9003: connect: connection refused" Apr 22 18:52:54.825175 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:54.825144 2572 generic.go:358] "Generic (PLEG): container finished" podID="05b61c70-2187-40ea-ad4f-7797fa18d564" containerID="46768acba1da8f3baa2d2b802aae6d0fd6383da871e279a86a366d475cb65572" exitCode=0 Apr 22 18:52:54.825482 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:54.825187 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" event={"ID":"05b61c70-2187-40ea-ad4f-7797fa18d564","Type":"ContainerDied","Data":"46768acba1da8f3baa2d2b802aae6d0fd6383da871e279a86a366d475cb65572"} Apr 22 18:52:55.830366 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:55.830329 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" event={"ID":"05b61c70-2187-40ea-ad4f-7797fa18d564","Type":"ContainerStarted","Data":"a995e353381472a663f31433a9cc0a002a1a2b0dd4a65567beeee603b866267c"} Apr 22 18:52:55.851459 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:52:55.851400 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" podStartSLOduration=6.851382862 podStartE2EDuration="6.851382862s" podCreationTimestamp="2026-04-22 18:52:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:52:55.850460877 +0000 UTC m=+930.454614882" watchObservedRunningTime="2026-04-22 18:52:55.851382862 +0000 UTC m=+930.455536867" Apr 22 18:52:56.498901 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:52:56.498872 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 
22 18:52:56.499081 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:52:56.498933 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09a62540-6276-46a8-a27f-34f866de3566-tls-certs podName:09a62540-6276-46a8-a27f-34f866de3566 nodeName:}" failed. No retries permitted until 2026-04-22 18:53:12.498918173 +0000 UTC m=+947.103072155 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/09a62540-6276-46a8-a27f-34f866de3566-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" (UID: "09a62540-6276-46a8-a27f-34f866de3566") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 22 18:53:00.231438 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:00.231405 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:53:00.231438 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:00.231441 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:53:00.243997 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:00.243972 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:53:00.859202 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:00.859173 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:53:01.772170 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:53:01.772144 2572 logging.go:55] [core] [Channel #25 SubChannel #26]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.26:9003", ServerName: "10.132.0.26:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.26:9003: connect: connection refused" Apr 22 18:53:02.772647 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:02.772608 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" podUID="09a62540-6276-46a8-a27f-34f866de3566" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.26:9003\" within 1s: context deadline exceeded" Apr 22 18:53:02.773006 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:53:02.772685 2572 logging.go:55] [core] [Channel #25 SubChannel #26]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.26:9003", ServerName: "10.132.0.26:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.26:9003: operation was canceled" Apr 22 18:53:11.771807 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:53:11.771776 2572 logging.go:55] [core] [Channel #27 SubChannel #28]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.26:9003", ServerName: "10.132.0.26:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.26:9003: connect: connection refused" Apr 22 18:53:11.887250 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:11.887229 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9_09a62540-6276-46a8-a27f-34f866de3566/tokenizer/0.log" Apr 22 18:53:11.887803 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:11.887778 2572 generic.go:358] "Generic (PLEG): container finished" podID="09a62540-6276-46a8-a27f-34f866de3566" containerID="b0bd7efa72cfa48a3043c629604b425f7df502cabb96ee10f91bdf46d23e76fa" exitCode=137 Apr 22 18:53:11.887875 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:11.887809 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" event={"ID":"09a62540-6276-46a8-a27f-34f866de3566","Type":"ContainerDied","Data":"b0bd7efa72cfa48a3043c629604b425f7df502cabb96ee10f91bdf46d23e76fa"} Apr 22 18:53:12.429279 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.429256 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9_09a62540-6276-46a8-a27f-34f866de3566/tokenizer/0.log" Apr 22 18:53:12.429891 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.429877 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:53:12.529307 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.529276 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-tokenizer-tmp\") pod \"09a62540-6276-46a8-a27f-34f866de3566\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " Apr 22 18:53:12.529452 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.529348 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsqh4\" (UniqueName: \"kubernetes.io/projected/09a62540-6276-46a8-a27f-34f866de3566-kube-api-access-jsqh4\") pod \"09a62540-6276-46a8-a27f-34f866de3566\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " Apr 22 18:53:12.529452 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.529390 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-tokenizer-cache\") pod \"09a62540-6276-46a8-a27f-34f866de3566\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " Apr 22 18:53:12.529452 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.529427 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-kserve-provision-location\") pod \"09a62540-6276-46a8-a27f-34f866de3566\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " Apr 22 18:53:12.529452 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.529446 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-tokenizer-uds\") pod \"09a62540-6276-46a8-a27f-34f866de3566\" (UID: 
\"09a62540-6276-46a8-a27f-34f866de3566\") " Apr 22 18:53:12.529643 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.529508 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/09a62540-6276-46a8-a27f-34f866de3566-tls-certs\") pod \"09a62540-6276-46a8-a27f-34f866de3566\" (UID: \"09a62540-6276-46a8-a27f-34f866de3566\") " Apr 22 18:53:12.529733 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.529673 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "09a62540-6276-46a8-a27f-34f866de3566" (UID: "09a62540-6276-46a8-a27f-34f866de3566"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:12.529733 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.529713 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "09a62540-6276-46a8-a27f-34f866de3566" (UID: "09a62540-6276-46a8-a27f-34f866de3566"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:12.529903 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.529881 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "09a62540-6276-46a8-a27f-34f866de3566" (UID: "09a62540-6276-46a8-a27f-34f866de3566"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:12.530098 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.530077 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "09a62540-6276-46a8-a27f-34f866de3566" (UID: "09a62540-6276-46a8-a27f-34f866de3566"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:12.531652 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.531634 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09a62540-6276-46a8-a27f-34f866de3566-kube-api-access-jsqh4" (OuterVolumeSpecName: "kube-api-access-jsqh4") pod "09a62540-6276-46a8-a27f-34f866de3566" (UID: "09a62540-6276-46a8-a27f-34f866de3566"). InnerVolumeSpecName "kube-api-access-jsqh4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:53:12.531813 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.531790 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a62540-6276-46a8-a27f-34f866de3566-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "09a62540-6276-46a8-a27f-34f866de3566" (UID: "09a62540-6276-46a8-a27f-34f866de3566"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:53:12.630865 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.630842 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jsqh4\" (UniqueName: \"kubernetes.io/projected/09a62540-6276-46a8-a27f-34f866de3566-kube-api-access-jsqh4\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:53:12.630865 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.630864 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-tokenizer-cache\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:53:12.630999 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.630874 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-kserve-provision-location\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:53:12.630999 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.630883 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-tokenizer-uds\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:53:12.630999 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.630893 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/09a62540-6276-46a8-a27f-34f866de3566-tls-certs\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:53:12.630999 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.630900 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/09a62540-6276-46a8-a27f-34f866de3566-tokenizer-tmp\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:53:12.772468 ip-10-0-129-26 kubenswrapper[2572]: I0422 
18:53:12.772433 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" podUID="09a62540-6276-46a8-a27f-34f866de3566" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.26:9003\" within 1s: context deadline exceeded" Apr 22 18:53:12.892231 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.892173 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9_09a62540-6276-46a8-a27f-34f866de3566/tokenizer/0.log" Apr 22 18:53:12.892869 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.892844 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" event={"ID":"09a62540-6276-46a8-a27f-34f866de3566","Type":"ContainerDied","Data":"80428867cbc6f883166bb3f79124bfb13fdde0d49736d6fc28014d0da87e7123"} Apr 22 18:53:12.892960 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.892887 2572 scope.go:117] "RemoveContainer" containerID="b0bd7efa72cfa48a3043c629604b425f7df502cabb96ee10f91bdf46d23e76fa" Apr 22 18:53:12.892960 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.892896 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9" Apr 22 18:53:12.903338 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.903317 2572 scope.go:117] "RemoveContainer" containerID="28ce75ea12b162858066ce0d90b884316aed4b329cd9c1e0873d7fc6f71bcd38" Apr 22 18:53:12.911053 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.911038 2572 scope.go:117] "RemoveContainer" containerID="000e8f81a09f8fe3384575b221b1b52ace63daa8a0b4598128dca38b110eacff" Apr 22 18:53:12.918034 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.918014 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9"] Apr 22 18:53:12.923642 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:12.923619 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64767t9"] Apr 22 18:53:13.868949 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:13.868904 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09a62540-6276-46a8-a27f-34f866de3566" path="/var/lib/kubelet/pods/09a62540-6276-46a8-a27f-34f866de3566/volumes" Apr 22 18:53:36.796324 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:36.796291 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp"] Apr 22 18:53:36.796785 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:36.796661 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" podUID="05b61c70-2187-40ea-ad4f-7797fa18d564" containerName="main" containerID="cri-o://a995e353381472a663f31433a9cc0a002a1a2b0dd4a65567beeee603b866267c" gracePeriod=30 Apr 22 18:53:36.972691 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:36.972662 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="05b61c70-2187-40ea-ad4f-7797fa18d564" containerID="a995e353381472a663f31433a9cc0a002a1a2b0dd4a65567beeee603b866267c" exitCode=0 Apr 22 18:53:36.972826 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:36.972744 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" event={"ID":"05b61c70-2187-40ea-ad4f-7797fa18d564","Type":"ContainerDied","Data":"a995e353381472a663f31433a9cc0a002a1a2b0dd4a65567beeee603b866267c"} Apr 22 18:53:37.051895 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.051846 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:53:37.214766 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.214734 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-model-cache\") pod \"05b61c70-2187-40ea-ad4f-7797fa18d564\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " Apr 22 18:53:37.214766 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.214764 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-dshm\") pod \"05b61c70-2187-40ea-ad4f-7797fa18d564\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " Apr 22 18:53:37.214980 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.214790 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-kserve-provision-location\") pod \"05b61c70-2187-40ea-ad4f-7797fa18d564\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " Apr 22 18:53:37.214980 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.214810 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-home\") pod \"05b61c70-2187-40ea-ad4f-7797fa18d564\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " Apr 22 18:53:37.214980 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.214833 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05b61c70-2187-40ea-ad4f-7797fa18d564-tls-certs\") pod \"05b61c70-2187-40ea-ad4f-7797fa18d564\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " Apr 22 18:53:37.214980 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.214865 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd29j\" (UniqueName: \"kubernetes.io/projected/05b61c70-2187-40ea-ad4f-7797fa18d564-kube-api-access-hd29j\") pod \"05b61c70-2187-40ea-ad4f-7797fa18d564\" (UID: \"05b61c70-2187-40ea-ad4f-7797fa18d564\") " Apr 22 18:53:37.215229 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.215060 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-model-cache" (OuterVolumeSpecName: "model-cache") pod "05b61c70-2187-40ea-ad4f-7797fa18d564" (UID: "05b61c70-2187-40ea-ad4f-7797fa18d564"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:37.215229 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.215078 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-home" (OuterVolumeSpecName: "home") pod "05b61c70-2187-40ea-ad4f-7797fa18d564" (UID: "05b61c70-2187-40ea-ad4f-7797fa18d564"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:37.215229 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.215186 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-model-cache\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:53:37.215229 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.215203 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-home\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:53:37.217186 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.217075 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-dshm" (OuterVolumeSpecName: "dshm") pod "05b61c70-2187-40ea-ad4f-7797fa18d564" (UID: "05b61c70-2187-40ea-ad4f-7797fa18d564"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:37.217303 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.217181 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05b61c70-2187-40ea-ad4f-7797fa18d564-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "05b61c70-2187-40ea-ad4f-7797fa18d564" (UID: "05b61c70-2187-40ea-ad4f-7797fa18d564"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:53:37.217303 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.217194 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05b61c70-2187-40ea-ad4f-7797fa18d564-kube-api-access-hd29j" (OuterVolumeSpecName: "kube-api-access-hd29j") pod "05b61c70-2187-40ea-ad4f-7797fa18d564" (UID: "05b61c70-2187-40ea-ad4f-7797fa18d564"). InnerVolumeSpecName "kube-api-access-hd29j". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:53:37.269307 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.269282 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "05b61c70-2187-40ea-ad4f-7797fa18d564" (UID: "05b61c70-2187-40ea-ad4f-7797fa18d564"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:37.316173 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.316093 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-kserve-provision-location\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:53:37.316173 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.316114 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05b61c70-2187-40ea-ad4f-7797fa18d564-tls-certs\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:53:37.316173 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.316147 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hd29j\" (UniqueName: \"kubernetes.io/projected/05b61c70-2187-40ea-ad4f-7797fa18d564-kube-api-access-hd29j\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:53:37.316173 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.316158 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05b61c70-2187-40ea-ad4f-7797fa18d564-dshm\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:53:37.976925 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.976894 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" event={"ID":"05b61c70-2187-40ea-ad4f-7797fa18d564","Type":"ContainerDied","Data":"a692f6ae7160c3879da2d1d6f0604e52c561c71ec868e9637b91d2cdfd761c3e"} Apr 22 18:53:37.977343 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.976933 2572 scope.go:117] "RemoveContainer" containerID="a995e353381472a663f31433a9cc0a002a1a2b0dd4a65567beeee603b866267c" Apr 22 18:53:37.977343 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.976953 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp" Apr 22 18:53:37.985283 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.985267 2572 scope.go:117] "RemoveContainer" containerID="46768acba1da8f3baa2d2b802aae6d0fd6383da871e279a86a366d475cb65572" Apr 22 18:53:37.994163 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.994138 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp"] Apr 22 18:53:37.997823 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:37.997801 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5dbbf47fd7-6vlzp"] Apr 22 18:53:39.868574 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:39.868540 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05b61c70-2187-40ea-ad4f-7797fa18d564" path="/var/lib/kubelet/pods/05b61c70-2187-40ea-ad4f-7797fa18d564/volumes" Apr 22 18:53:45.737907 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.737828 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z"] Apr 22 18:53:45.738382 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.738135 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09a62540-6276-46a8-a27f-34f866de3566" 
containerName="storage-initializer" Apr 22 18:53:45.738382 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.738151 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a62540-6276-46a8-a27f-34f866de3566" containerName="storage-initializer" Apr 22 18:53:45.738382 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.738161 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09a62540-6276-46a8-a27f-34f866de3566" containerName="main" Apr 22 18:53:45.738382 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.738167 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a62540-6276-46a8-a27f-34f866de3566" containerName="main" Apr 22 18:53:45.738382 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.738175 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05b61c70-2187-40ea-ad4f-7797fa18d564" containerName="storage-initializer" Apr 22 18:53:45.738382 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.738180 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b61c70-2187-40ea-ad4f-7797fa18d564" containerName="storage-initializer" Apr 22 18:53:45.738382 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.738194 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05b61c70-2187-40ea-ad4f-7797fa18d564" containerName="main" Apr 22 18:53:45.738382 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.738199 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b61c70-2187-40ea-ad4f-7797fa18d564" containerName="main" Apr 22 18:53:45.738382 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.738216 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09a62540-6276-46a8-a27f-34f866de3566" containerName="tokenizer" Apr 22 18:53:45.738382 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.738221 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a62540-6276-46a8-a27f-34f866de3566" containerName="tokenizer" Apr 22 
18:53:45.738382 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.738267 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="05b61c70-2187-40ea-ad4f-7797fa18d564" containerName="main" Apr 22 18:53:45.738382 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.738275 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="09a62540-6276-46a8-a27f-34f866de3566" containerName="tokenizer" Apr 22 18:53:45.738382 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.738282 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="09a62540-6276-46a8-a27f-34f866de3566" containerName="main" Apr 22 18:53:45.741815 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.741793 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:45.744348 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.744327 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:53:45.745050 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.745031 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:53:45.745219 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.745070 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 22 18:53:45.745219 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.745176 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hsgld\"" Apr 22 18:53:45.745359 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.745221 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-lbdf5\"" Apr 22 18:53:45.752717 ip-10-0-129-26 
kubenswrapper[2572]: I0422 18:53:45.752676 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z"] Apr 22 18:53:45.780282 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.780252 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:45.780282 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.780281 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:45.780450 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.780306 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:45.780450 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.780360 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15bc299c-a22b-4f65-8831-44344ec5dd75-tls-certs\") pod 
\"stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:45.780450 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.780392 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grz56\" (UniqueName: \"kubernetes.io/projected/15bc299c-a22b-4f65-8831-44344ec5dd75-kube-api-access-grz56\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:45.780564 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.780461 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:45.881041 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.881011 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grz56\" (UniqueName: \"kubernetes.io/projected/15bc299c-a22b-4f65-8831-44344ec5dd75-kube-api-access-grz56\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:45.881226 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.881082 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-tokenizer-uds\") pod 
\"stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:45.881226 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.881146 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:45.881226 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.881175 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:45.881226 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.881200 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:45.881423 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.881243 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15bc299c-a22b-4f65-8831-44344ec5dd75-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z\" (UID: 
\"15bc299c-a22b-4f65-8831-44344ec5dd75\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:45.881550 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.881527 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:45.881612 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.881548 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:45.881612 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.881576 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:45.881722 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.881678 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:45.884001 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.883978 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15bc299c-a22b-4f65-8831-44344ec5dd75-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:45.890051 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:45.890028 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grz56\" (UniqueName: \"kubernetes.io/projected/15bc299c-a22b-4f65-8831-44344ec5dd75-kube-api-access-grz56\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:46.055081 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:46.055013 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:46.184014 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:46.183991 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z"] Apr 22 18:53:46.186679 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:53:46.186651 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15bc299c_a22b_4f65_8831_44344ec5dd75.slice/crio-5fdb80763d8b37643efbb9486044e7435a473c6340a78f75e62d93fe00dc2d2e WatchSource:0}: Error finding container 5fdb80763d8b37643efbb9486044e7435a473c6340a78f75e62d93fe00dc2d2e: Status 404 returned error can't find the container with id 5fdb80763d8b37643efbb9486044e7435a473c6340a78f75e62d93fe00dc2d2e Apr 22 18:53:47.007727 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:47.007689 2572 generic.go:358] "Generic (PLEG): container finished" podID="15bc299c-a22b-4f65-8831-44344ec5dd75" containerID="6317f47a64908e3021c3d9524a30fa253e691203475c14e8eb4ebc7752a28b56" exitCode=0 Apr 22 18:53:47.008147 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:47.007778 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" event={"ID":"15bc299c-a22b-4f65-8831-44344ec5dd75","Type":"ContainerDied","Data":"6317f47a64908e3021c3d9524a30fa253e691203475c14e8eb4ebc7752a28b56"} Apr 22 18:53:47.008147 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:47.007825 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" event={"ID":"15bc299c-a22b-4f65-8831-44344ec5dd75","Type":"ContainerStarted","Data":"5fdb80763d8b37643efbb9486044e7435a473c6340a78f75e62d93fe00dc2d2e"} Apr 22 18:53:48.012989 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:48.012950 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" event={"ID":"15bc299c-a22b-4f65-8831-44344ec5dd75","Type":"ContainerStarted","Data":"e55f5ad536d5f408b5261a54fa056714d1c8c62ad185de106372c3619e20ac12"} Apr 22 18:53:48.012989 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:48.012984 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" event={"ID":"15bc299c-a22b-4f65-8831-44344ec5dd75","Type":"ContainerStarted","Data":"57be1e213bb046d71cb4a9f32240420c79a0b70375abd583a26200bef4f3a279"} Apr 22 18:53:48.013427 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:48.013076 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:48.032855 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:48.032805 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" podStartSLOduration=3.032787978 podStartE2EDuration="3.032787978s" podCreationTimestamp="2026-04-22 18:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:53:48.032497532 +0000 UTC m=+982.636651538" watchObservedRunningTime="2026-04-22 18:53:48.032787978 +0000 UTC m=+982.636941985" Apr 22 18:53:56.056030 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:56.055995 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:56.056030 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:56.056039 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:56.058743 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:56.058717 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:53:57.041825 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:53:57.041796 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:54:18.045031 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:54:18.044999 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:56:06.962510 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:06.962473 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z"] Apr 22 18:56:06.962992 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:06.962770 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" podUID="15bc299c-a22b-4f65-8831-44344ec5dd75" containerName="main" containerID="cri-o://57be1e213bb046d71cb4a9f32240420c79a0b70375abd583a26200bef4f3a279" gracePeriod=30 Apr 22 18:56:06.962992 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:06.962832 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" podUID="15bc299c-a22b-4f65-8831-44344ec5dd75" containerName="tokenizer" containerID="cri-o://e55f5ad536d5f408b5261a54fa056714d1c8c62ad185de106372c3619e20ac12" gracePeriod=30 Apr 22 18:56:07.041903 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:07.041866 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" podUID="15bc299c-a22b-4f65-8831-44344ec5dd75" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.28:8082/healthz\": dial tcp 10.132.0.28:8082: connect: connection refused" Apr 22 18:56:07.459037 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:07.458999 2572 generic.go:358] "Generic (PLEG): container finished" podID="15bc299c-a22b-4f65-8831-44344ec5dd75" containerID="57be1e213bb046d71cb4a9f32240420c79a0b70375abd583a26200bef4f3a279" exitCode=0 Apr 22 18:56:07.459229 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:07.459076 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" event={"ID":"15bc299c-a22b-4f65-8831-44344ec5dd75","Type":"ContainerDied","Data":"57be1e213bb046d71cb4a9f32240420c79a0b70375abd583a26200bef4f3a279"} Apr 22 18:56:08.100116 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.100094 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:56:08.277241 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.277181 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-kserve-provision-location\") pod \"15bc299c-a22b-4f65-8831-44344ec5dd75\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " Apr 22 18:56:08.277241 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.277212 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-tokenizer-cache\") pod \"15bc299c-a22b-4f65-8831-44344ec5dd75\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " Apr 22 18:56:08.277462 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.277251 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15bc299c-a22b-4f65-8831-44344ec5dd75-tls-certs\") pod \"15bc299c-a22b-4f65-8831-44344ec5dd75\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " Apr 22 18:56:08.277462 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.277269 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-tokenizer-tmp\") pod \"15bc299c-a22b-4f65-8831-44344ec5dd75\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " Apr 22 18:56:08.277462 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.277304 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-tokenizer-uds\") pod \"15bc299c-a22b-4f65-8831-44344ec5dd75\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " Apr 22 18:56:08.277462 
ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.277326 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grz56\" (UniqueName: \"kubernetes.io/projected/15bc299c-a22b-4f65-8831-44344ec5dd75-kube-api-access-grz56\") pod \"15bc299c-a22b-4f65-8831-44344ec5dd75\" (UID: \"15bc299c-a22b-4f65-8831-44344ec5dd75\") " Apr 22 18:56:08.277667 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.277532 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "15bc299c-a22b-4f65-8831-44344ec5dd75" (UID: "15bc299c-a22b-4f65-8831-44344ec5dd75"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:56:08.277667 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.277578 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "15bc299c-a22b-4f65-8831-44344ec5dd75" (UID: "15bc299c-a22b-4f65-8831-44344ec5dd75"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:56:08.277667 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.277628 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "15bc299c-a22b-4f65-8831-44344ec5dd75" (UID: "15bc299c-a22b-4f65-8831-44344ec5dd75"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:56:08.278036 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.278013 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "15bc299c-a22b-4f65-8831-44344ec5dd75" (UID: "15bc299c-a22b-4f65-8831-44344ec5dd75"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:56:08.279512 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.279492 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15bc299c-a22b-4f65-8831-44344ec5dd75-kube-api-access-grz56" (OuterVolumeSpecName: "kube-api-access-grz56") pod "15bc299c-a22b-4f65-8831-44344ec5dd75" (UID: "15bc299c-a22b-4f65-8831-44344ec5dd75"). InnerVolumeSpecName "kube-api-access-grz56". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:56:08.279596 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.279529 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15bc299c-a22b-4f65-8831-44344ec5dd75-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "15bc299c-a22b-4f65-8831-44344ec5dd75" (UID: "15bc299c-a22b-4f65-8831-44344ec5dd75"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:56:08.377804 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.377781 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15bc299c-a22b-4f65-8831-44344ec5dd75-tls-certs\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:56:08.377804 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.377803 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-tokenizer-tmp\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:56:08.377925 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.377812 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-tokenizer-uds\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:56:08.377925 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.377820 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-grz56\" (UniqueName: \"kubernetes.io/projected/15bc299c-a22b-4f65-8831-44344ec5dd75-kube-api-access-grz56\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:56:08.377925 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.377830 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-kserve-provision-location\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:56:08.377925 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.377839 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/15bc299c-a22b-4f65-8831-44344ec5dd75-tokenizer-cache\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 18:56:08.463956 ip-10-0-129-26 kubenswrapper[2572]: I0422 
18:56:08.463928 2572 generic.go:358] "Generic (PLEG): container finished" podID="15bc299c-a22b-4f65-8831-44344ec5dd75" containerID="e55f5ad536d5f408b5261a54fa056714d1c8c62ad185de106372c3619e20ac12" exitCode=0 Apr 22 18:56:08.464056 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.463969 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" event={"ID":"15bc299c-a22b-4f65-8831-44344ec5dd75","Type":"ContainerDied","Data":"e55f5ad536d5f408b5261a54fa056714d1c8c62ad185de106372c3619e20ac12"} Apr 22 18:56:08.464056 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.463995 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" Apr 22 18:56:08.464056 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.464012 2572 scope.go:117] "RemoveContainer" containerID="e55f5ad536d5f408b5261a54fa056714d1c8c62ad185de106372c3619e20ac12" Apr 22 18:56:08.464235 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.463998 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" event={"ID":"15bc299c-a22b-4f65-8831-44344ec5dd75","Type":"ContainerDied","Data":"5fdb80763d8b37643efbb9486044e7435a473c6340a78f75e62d93fe00dc2d2e"} Apr 22 18:56:08.472605 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.472587 2572 scope.go:117] "RemoveContainer" containerID="57be1e213bb046d71cb4a9f32240420c79a0b70375abd583a26200bef4f3a279" Apr 22 18:56:08.479709 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.479692 2572 scope.go:117] "RemoveContainer" containerID="6317f47a64908e3021c3d9524a30fa253e691203475c14e8eb4ebc7752a28b56" Apr 22 18:56:08.487022 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.487000 2572 scope.go:117] "RemoveContainer" containerID="e55f5ad536d5f408b5261a54fa056714d1c8c62ad185de106372c3619e20ac12" Apr 22 
18:56:08.487144 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.487021 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z"] Apr 22 18:56:08.487309 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:56:08.487290 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e55f5ad536d5f408b5261a54fa056714d1c8c62ad185de106372c3619e20ac12\": container with ID starting with e55f5ad536d5f408b5261a54fa056714d1c8c62ad185de106372c3619e20ac12 not found: ID does not exist" containerID="e55f5ad536d5f408b5261a54fa056714d1c8c62ad185de106372c3619e20ac12" Apr 22 18:56:08.487370 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.487317 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e55f5ad536d5f408b5261a54fa056714d1c8c62ad185de106372c3619e20ac12"} err="failed to get container status \"e55f5ad536d5f408b5261a54fa056714d1c8c62ad185de106372c3619e20ac12\": rpc error: code = NotFound desc = could not find container \"e55f5ad536d5f408b5261a54fa056714d1c8c62ad185de106372c3619e20ac12\": container with ID starting with e55f5ad536d5f408b5261a54fa056714d1c8c62ad185de106372c3619e20ac12 not found: ID does not exist" Apr 22 18:56:08.487370 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.487336 2572 scope.go:117] "RemoveContainer" containerID="57be1e213bb046d71cb4a9f32240420c79a0b70375abd583a26200bef4f3a279" Apr 22 18:56:08.487567 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:56:08.487553 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57be1e213bb046d71cb4a9f32240420c79a0b70375abd583a26200bef4f3a279\": container with ID starting with 57be1e213bb046d71cb4a9f32240420c79a0b70375abd583a26200bef4f3a279 not found: ID does not exist" containerID="57be1e213bb046d71cb4a9f32240420c79a0b70375abd583a26200bef4f3a279" Apr 22 
18:56:08.487606 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.487571 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57be1e213bb046d71cb4a9f32240420c79a0b70375abd583a26200bef4f3a279"} err="failed to get container status \"57be1e213bb046d71cb4a9f32240420c79a0b70375abd583a26200bef4f3a279\": rpc error: code = NotFound desc = could not find container \"57be1e213bb046d71cb4a9f32240420c79a0b70375abd583a26200bef4f3a279\": container with ID starting with 57be1e213bb046d71cb4a9f32240420c79a0b70375abd583a26200bef4f3a279 not found: ID does not exist" Apr 22 18:56:08.487606 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.487585 2572 scope.go:117] "RemoveContainer" containerID="6317f47a64908e3021c3d9524a30fa253e691203475c14e8eb4ebc7752a28b56" Apr 22 18:56:08.487808 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:56:08.487794 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6317f47a64908e3021c3d9524a30fa253e691203475c14e8eb4ebc7752a28b56\": container with ID starting with 6317f47a64908e3021c3d9524a30fa253e691203475c14e8eb4ebc7752a28b56 not found: ID does not exist" containerID="6317f47a64908e3021c3d9524a30fa253e691203475c14e8eb4ebc7752a28b56" Apr 22 18:56:08.487850 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.487810 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6317f47a64908e3021c3d9524a30fa253e691203475c14e8eb4ebc7752a28b56"} err="failed to get container status \"6317f47a64908e3021c3d9524a30fa253e691203475c14e8eb4ebc7752a28b56\": rpc error: code = NotFound desc = could not find container \"6317f47a64908e3021c3d9524a30fa253e691203475c14e8eb4ebc7752a28b56\": container with ID starting with 6317f47a64908e3021c3d9524a30fa253e691203475c14e8eb4ebc7752a28b56 not found: ID does not exist" Apr 22 18:56:08.490778 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:08.490757 2572 kubelet.go:2547] 
"SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z"] Apr 22 18:56:09.045357 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:09.045315 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-cxj6z" podUID="15bc299c-a22b-4f65-8831-44344ec5dd75" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.28:9003\" within 1s: context deadline exceeded" Apr 22 18:56:09.045519 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:56:09.045368 2572 logging.go:55] [core] [Channel #104 SubChannel #105]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.28:9003", ServerName: "10.132.0.28:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.28:9003: operation was canceled" Apr 22 18:56:09.868862 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:56:09.868825 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15bc299c-a22b-4f65-8831-44344ec5dd75" path="/var/lib/kubelet/pods/15bc299c-a22b-4f65-8831-44344ec5dd75/volumes" Apr 22 18:57:05.487188 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.487095 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j"] Apr 22 18:57:05.487577 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.487497 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15bc299c-a22b-4f65-8831-44344ec5dd75" containerName="storage-initializer" Apr 22 18:57:05.487577 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.487510 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="15bc299c-a22b-4f65-8831-44344ec5dd75" containerName="storage-initializer" Apr 22 18:57:05.487577 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.487521 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="15bc299c-a22b-4f65-8831-44344ec5dd75" containerName="main" Apr 22 18:57:05.487577 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.487526 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="15bc299c-a22b-4f65-8831-44344ec5dd75" containerName="main" Apr 22 18:57:05.487577 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.487536 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15bc299c-a22b-4f65-8831-44344ec5dd75" containerName="tokenizer" Apr 22 18:57:05.487577 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.487541 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="15bc299c-a22b-4f65-8831-44344ec5dd75" containerName="tokenizer" Apr 22 18:57:05.487849 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.487591 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="15bc299c-a22b-4f65-8831-44344ec5dd75" containerName="main" Apr 22 18:57:05.487849 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.487599 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="15bc299c-a22b-4f65-8831-44344ec5dd75" containerName="tokenizer" Apr 22 18:57:05.489644 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.489627 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" Apr 22 18:57:05.492153 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.492111 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 22 18:57:05.492257 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.492113 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hsgld\"" Apr 22 18:57:05.492257 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.492114 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-ms5mc\"" Apr 22 18:57:05.493010 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.492990 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:57:05.493140 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.493068 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:57:05.504784 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.504764 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j"] Apr 22 18:57:05.590301 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.590267 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b763797b-c82f-4799-a000-856f390ed6cd-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" Apr 22 18:57:05.590460 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.590310 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp8w4\" (UniqueName: \"kubernetes.io/projected/b763797b-c82f-4799-a000-856f390ed6cd-kube-api-access-xp8w4\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" Apr 22 18:57:05.590460 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.590393 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" Apr 22 18:57:05.590460 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.590419 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" Apr 22 18:57:05.590460 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.590438 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" Apr 22 18:57:05.590619 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.590459 
2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" Apr 22 18:57:05.691881 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.691839 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xp8w4\" (UniqueName: \"kubernetes.io/projected/b763797b-c82f-4799-a000-856f390ed6cd-kube-api-access-xp8w4\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" Apr 22 18:57:05.692095 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.691923 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" Apr 22 18:57:05.692095 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.691946 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" Apr 22 18:57:05.692095 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.691964 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" Apr 22 18:57:05.692095 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.691993 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" Apr 22 18:57:05.692095 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.692017 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b763797b-c82f-4799-a000-856f390ed6cd-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" Apr 22 18:57:05.692401 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.692387 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" Apr 22 18:57:05.692460 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.692417 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" Apr 22 18:57:05.692517 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.692480 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" Apr 22 18:57:05.692558 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.692513 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" Apr 22 18:57:05.694755 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.694734 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b763797b-c82f-4799-a000-856f390ed6cd-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" Apr 22 18:57:05.706156 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.706114 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp8w4\" (UniqueName: \"kubernetes.io/projected/b763797b-c82f-4799-a000-856f390ed6cd-kube-api-access-xp8w4\") pod 
\"stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" Apr 22 18:57:05.799206 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.799105 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" Apr 22 18:57:05.931779 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.931745 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j"] Apr 22 18:57:05.934898 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:57:05.934868 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb763797b_c82f_4799_a000_856f390ed6cd.slice/crio-8f81f1d153be90991a133a8614ec5940aa3dfb42ed392b29c2fef10953bf9bfe WatchSource:0}: Error finding container 8f81f1d153be90991a133a8614ec5940aa3dfb42ed392b29c2fef10953bf9bfe: Status 404 returned error can't find the container with id 8f81f1d153be90991a133a8614ec5940aa3dfb42ed392b29c2fef10953bf9bfe Apr 22 18:57:05.937071 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:05.937054 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:57:06.652656 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:06.652617 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" event={"ID":"b763797b-c82f-4799-a000-856f390ed6cd","Type":"ContainerStarted","Data":"df5a037d592c94cb58839a3aeb78533525e9092914ecc8c8dbb0672d3f28ae24"} Apr 22 18:57:06.652656 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:06.652657 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" 
event={"ID":"b763797b-c82f-4799-a000-856f390ed6cd","Type":"ContainerStarted","Data":"8f81f1d153be90991a133a8614ec5940aa3dfb42ed392b29c2fef10953bf9bfe"}
Apr 22 18:57:07.657454 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:07.657419 2572 generic.go:358] "Generic (PLEG): container finished" podID="b763797b-c82f-4799-a000-856f390ed6cd" containerID="df5a037d592c94cb58839a3aeb78533525e9092914ecc8c8dbb0672d3f28ae24" exitCode=0
Apr 22 18:57:07.657909 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:07.657502 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" event={"ID":"b763797b-c82f-4799-a000-856f390ed6cd","Type":"ContainerDied","Data":"df5a037d592c94cb58839a3aeb78533525e9092914ecc8c8dbb0672d3f28ae24"}
Apr 22 18:57:08.663467 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:08.663433 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" event={"ID":"b763797b-c82f-4799-a000-856f390ed6cd","Type":"ContainerStarted","Data":"68dc3dc3aa6b28d4bdcaccb580fe4e270dc36b8a19f23ce33a8c29f8470d9ec6"}
Apr 22 18:57:08.663467 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:08.663470 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" event={"ID":"b763797b-c82f-4799-a000-856f390ed6cd","Type":"ContainerStarted","Data":"a65952ee6cd78f3d1f0b5fc34c045ee8962a5eab9e3b4382baf94d4581fe55ca"}
Apr 22 18:57:08.663853 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:08.663568 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j"
Apr 22 18:57:08.683884 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:08.683844 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" podStartSLOduration=3.68383264 podStartE2EDuration="3.68383264s" podCreationTimestamp="2026-04-22 18:57:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:57:08.682890211 +0000 UTC m=+1183.287044246" watchObservedRunningTime="2026-04-22 18:57:08.68383264 +0000 UTC m=+1183.287986644"
Apr 22 18:57:15.799658 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:15.799625 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j"
Apr 22 18:57:15.800255 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:15.799672 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j"
Apr 22 18:57:15.802409 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:15.802388 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j"
Apr 22 18:57:16.690620 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:16.690591 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j"
Apr 22 18:57:37.694408 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:37.694374 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j"
Apr 22 18:57:40.174266 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:40.174230 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6f6sj_2508d1bd-b2cc-4edf-8bd2-34590e71e3e0/console-operator/1.log"
Apr 22 18:57:40.176670 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:40.176643 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6f6sj_2508d1bd-b2cc-4edf-8bd2-34590e71e3e0/console-operator/1.log"
Apr 22 18:57:40.182630 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:40.182611 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/ovn-acl-logging/0.log"
Apr 22 18:57:40.185251 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:57:40.185229 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/ovn-acl-logging/0.log"
Apr 22 18:59:17.418639 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:17.418603 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j"]
Apr 22 18:59:17.419069 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:17.419017 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" podUID="b763797b-c82f-4799-a000-856f390ed6cd" containerName="main" containerID="cri-o://a65952ee6cd78f3d1f0b5fc34c045ee8962a5eab9e3b4382baf94d4581fe55ca" gracePeriod=30
Apr 22 18:59:17.419193 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:17.419110 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" podUID="b763797b-c82f-4799-a000-856f390ed6cd" containerName="tokenizer" containerID="cri-o://68dc3dc3aa6b28d4bdcaccb580fe4e270dc36b8a19f23ce33a8c29f8470d9ec6" gracePeriod=30
Apr 22 18:59:17.693385 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:59:17.693312 2572 logging.go:55] [core] [Channel #175 SubChannel #176]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.29:9003", ServerName: "10.132.0.29:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.29:9003: connect: connection refused"
Apr 22 18:59:18.080720 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.080689 2572 generic.go:358] "Generic (PLEG): container finished" podID="b763797b-c82f-4799-a000-856f390ed6cd" containerID="a65952ee6cd78f3d1f0b5fc34c045ee8962a5eab9e3b4382baf94d4581fe55ca" exitCode=0
Apr 22 18:59:18.080882 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.080765 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" event={"ID":"b763797b-c82f-4799-a000-856f390ed6cd","Type":"ContainerDied","Data":"a65952ee6cd78f3d1f0b5fc34c045ee8962a5eab9e3b4382baf94d4581fe55ca"}
Apr 22 18:59:18.567012 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.566992 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j"
Apr 22 18:59:18.682809 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.682752 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp8w4\" (UniqueName: \"kubernetes.io/projected/b763797b-c82f-4799-a000-856f390ed6cd-kube-api-access-xp8w4\") pod \"b763797b-c82f-4799-a000-856f390ed6cd\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") "
Apr 22 18:59:18.682809 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.682786 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-tokenizer-uds\") pod \"b763797b-c82f-4799-a000-856f390ed6cd\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") "
Apr 22 18:59:18.682809 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.682806 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-tokenizer-cache\") pod \"b763797b-c82f-4799-a000-856f390ed6cd\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") "
Apr 22 18:59:18.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.682837 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-kserve-provision-location\") pod \"b763797b-c82f-4799-a000-856f390ed6cd\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") "
Apr 22 18:59:18.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.682875 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b763797b-c82f-4799-a000-856f390ed6cd-tls-certs\") pod \"b763797b-c82f-4799-a000-856f390ed6cd\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") "
Apr 22 18:59:18.683021 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.682948 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-tokenizer-tmp\") pod \"b763797b-c82f-4799-a000-856f390ed6cd\" (UID: \"b763797b-c82f-4799-a000-856f390ed6cd\") "
Apr 22 18:59:18.683228 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.683040 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "b763797b-c82f-4799-a000-856f390ed6cd" (UID: "b763797b-c82f-4799-a000-856f390ed6cd"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:59:18.683228 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.683079 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "b763797b-c82f-4799-a000-856f390ed6cd" (UID: "b763797b-c82f-4799-a000-856f390ed6cd"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:59:18.683228 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.683213 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-tokenizer-uds\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\""
Apr 22 18:59:18.683228 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.683227 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-tokenizer-cache\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\""
Apr 22 18:59:18.683385 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.683337 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "b763797b-c82f-4799-a000-856f390ed6cd" (UID: "b763797b-c82f-4799-a000-856f390ed6cd"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:59:18.683603 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.683583 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b763797b-c82f-4799-a000-856f390ed6cd" (UID: "b763797b-c82f-4799-a000-856f390ed6cd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:59:18.685054 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.685031 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b763797b-c82f-4799-a000-856f390ed6cd-kube-api-access-xp8w4" (OuterVolumeSpecName: "kube-api-access-xp8w4") pod "b763797b-c82f-4799-a000-856f390ed6cd" (UID: "b763797b-c82f-4799-a000-856f390ed6cd"). InnerVolumeSpecName "kube-api-access-xp8w4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:59:18.685158 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.685138 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b763797b-c82f-4799-a000-856f390ed6cd-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b763797b-c82f-4799-a000-856f390ed6cd" (UID: "b763797b-c82f-4799-a000-856f390ed6cd"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:59:18.693328 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.693301 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" podUID="b763797b-c82f-4799-a000-856f390ed6cd" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.29:9003\" within 1s: context deadline exceeded"
Apr 22 18:59:18.783570 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.783550 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-kserve-provision-location\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\""
Apr 22 18:59:18.783659 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.783574 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b763797b-c82f-4799-a000-856f390ed6cd-tls-certs\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\""
Apr 22 18:59:18.783659 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.783583 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b763797b-c82f-4799-a000-856f390ed6cd-tokenizer-tmp\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\""
Apr 22 18:59:18.783659 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:18.783593 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xp8w4\" (UniqueName: \"kubernetes.io/projected/b763797b-c82f-4799-a000-856f390ed6cd-kube-api-access-xp8w4\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\""
Apr 22 18:59:19.085851 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.085818 2572 generic.go:358] "Generic (PLEG): container finished" podID="b763797b-c82f-4799-a000-856f390ed6cd" containerID="68dc3dc3aa6b28d4bdcaccb580fe4e270dc36b8a19f23ce33a8c29f8470d9ec6" exitCode=0
Apr 22 18:59:19.086018 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.085896 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j"
Apr 22 18:59:19.086018 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.085902 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" event={"ID":"b763797b-c82f-4799-a000-856f390ed6cd","Type":"ContainerDied","Data":"68dc3dc3aa6b28d4bdcaccb580fe4e270dc36b8a19f23ce33a8c29f8470d9ec6"}
Apr 22 18:59:19.086018 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.085939 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j" event={"ID":"b763797b-c82f-4799-a000-856f390ed6cd","Type":"ContainerDied","Data":"8f81f1d153be90991a133a8614ec5940aa3dfb42ed392b29c2fef10953bf9bfe"}
Apr 22 18:59:19.086018 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.085955 2572 scope.go:117] "RemoveContainer" containerID="68dc3dc3aa6b28d4bdcaccb580fe4e270dc36b8a19f23ce33a8c29f8470d9ec6"
Apr 22 18:59:19.094224 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.094208 2572 scope.go:117] "RemoveContainer" containerID="a65952ee6cd78f3d1f0b5fc34c045ee8962a5eab9e3b4382baf94d4581fe55ca"
Apr 22 18:59:19.101355 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.101336 2572 scope.go:117] "RemoveContainer" containerID="df5a037d592c94cb58839a3aeb78533525e9092914ecc8c8dbb0672d3f28ae24"
Apr 22 18:59:19.109944 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.109914 2572 scope.go:117] "RemoveContainer" containerID="68dc3dc3aa6b28d4bdcaccb580fe4e270dc36b8a19f23ce33a8c29f8470d9ec6"
Apr 22 18:59:19.110101 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.109960 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j"]
Apr 22 18:59:19.110250 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:59:19.110232 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68dc3dc3aa6b28d4bdcaccb580fe4e270dc36b8a19f23ce33a8c29f8470d9ec6\": container with ID starting with 68dc3dc3aa6b28d4bdcaccb580fe4e270dc36b8a19f23ce33a8c29f8470d9ec6 not found: ID does not exist" containerID="68dc3dc3aa6b28d4bdcaccb580fe4e270dc36b8a19f23ce33a8c29f8470d9ec6"
Apr 22 18:59:19.110327 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.110265 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68dc3dc3aa6b28d4bdcaccb580fe4e270dc36b8a19f23ce33a8c29f8470d9ec6"} err="failed to get container status \"68dc3dc3aa6b28d4bdcaccb580fe4e270dc36b8a19f23ce33a8c29f8470d9ec6\": rpc error: code = NotFound desc = could not find container \"68dc3dc3aa6b28d4bdcaccb580fe4e270dc36b8a19f23ce33a8c29f8470d9ec6\": container with ID starting with 68dc3dc3aa6b28d4bdcaccb580fe4e270dc36b8a19f23ce33a8c29f8470d9ec6 not found: ID does not exist"
Apr 22 18:59:19.110327 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.110285 2572 scope.go:117] "RemoveContainer" containerID="a65952ee6cd78f3d1f0b5fc34c045ee8962a5eab9e3b4382baf94d4581fe55ca"
Apr 22 18:59:19.110549 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:59:19.110525 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a65952ee6cd78f3d1f0b5fc34c045ee8962a5eab9e3b4382baf94d4581fe55ca\": container with ID starting with a65952ee6cd78f3d1f0b5fc34c045ee8962a5eab9e3b4382baf94d4581fe55ca not found: ID does not exist" containerID="a65952ee6cd78f3d1f0b5fc34c045ee8962a5eab9e3b4382baf94d4581fe55ca"
Apr 22 18:59:19.110617 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.110548 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65952ee6cd78f3d1f0b5fc34c045ee8962a5eab9e3b4382baf94d4581fe55ca"} err="failed to get container status \"a65952ee6cd78f3d1f0b5fc34c045ee8962a5eab9e3b4382baf94d4581fe55ca\": rpc error: code = NotFound desc = could not find container \"a65952ee6cd78f3d1f0b5fc34c045ee8962a5eab9e3b4382baf94d4581fe55ca\": container with ID starting with a65952ee6cd78f3d1f0b5fc34c045ee8962a5eab9e3b4382baf94d4581fe55ca not found: ID does not exist"
Apr 22 18:59:19.110617 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.110565 2572 scope.go:117] "RemoveContainer" containerID="df5a037d592c94cb58839a3aeb78533525e9092914ecc8c8dbb0672d3f28ae24"
Apr 22 18:59:19.110818 ip-10-0-129-26 kubenswrapper[2572]: E0422 18:59:19.110800 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df5a037d592c94cb58839a3aeb78533525e9092914ecc8c8dbb0672d3f28ae24\": container with ID starting with df5a037d592c94cb58839a3aeb78533525e9092914ecc8c8dbb0672d3f28ae24 not found: ID does not exist" containerID="df5a037d592c94cb58839a3aeb78533525e9092914ecc8c8dbb0672d3f28ae24"
Apr 22 18:59:19.110875 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.110823 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df5a037d592c94cb58839a3aeb78533525e9092914ecc8c8dbb0672d3f28ae24"} err="failed to get container status \"df5a037d592c94cb58839a3aeb78533525e9092914ecc8c8dbb0672d3f28ae24\": rpc error: code = NotFound desc = could not find container \"df5a037d592c94cb58839a3aeb78533525e9092914ecc8c8dbb0672d3f28ae24\": container with ID starting with df5a037d592c94cb58839a3aeb78533525e9092914ecc8c8dbb0672d3f28ae24 not found: ID does not exist"
Apr 22 18:59:19.114235 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.114216 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-687b6fb8b-8s74j"]
Apr 22 18:59:19.781700 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.781672 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-7c8976c8d9-8qzsc"]
Apr 22 18:59:19.782060 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.781965 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b763797b-c82f-4799-a000-856f390ed6cd" containerName="main"
Apr 22 18:59:19.782060 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.781975 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b763797b-c82f-4799-a000-856f390ed6cd" containerName="main"
Apr 22 18:59:19.782060 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.781990 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b763797b-c82f-4799-a000-856f390ed6cd" containerName="storage-initializer"
Apr 22 18:59:19.782060 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.781995 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b763797b-c82f-4799-a000-856f390ed6cd" containerName="storage-initializer"
Apr 22 18:59:19.782060 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.782002 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b763797b-c82f-4799-a000-856f390ed6cd" containerName="tokenizer"
Apr 22 18:59:19.782060 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.782007 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b763797b-c82f-4799-a000-856f390ed6cd" containerName="tokenizer"
Apr 22 18:59:19.782060 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.782055 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b763797b-c82f-4799-a000-856f390ed6cd" containerName="main"
Apr 22 18:59:19.782060 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.782064 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b763797b-c82f-4799-a000-856f390ed6cd" containerName="tokenizer"
Apr 22 18:59:19.786360 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.786343 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-7c8976c8d9-8qzsc"
Apr 22 18:59:19.788642 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.788622 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 22 18:59:19.789461 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.789443 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 22 18:59:19.789558 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.789460 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-dt769\""
Apr 22 18:59:19.789558 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.789500 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 22 18:59:19.792672 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.792647 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-7c8976c8d9-8qzsc"]
Apr 22 18:59:19.870769 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.870739 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b763797b-c82f-4799-a000-856f390ed6cd" path="/var/lib/kubelet/pods/b763797b-c82f-4799-a000-856f390ed6cd/volumes"
Apr 22 18:59:19.888479 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.888457 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4mth\" (UniqueName: \"kubernetes.io/projected/5ff36bed-0e8c-4dd6-88f8-08572cf682f1-kube-api-access-h4mth\") pod \"llmisvc-controller-manager-7c8976c8d9-8qzsc\" (UID: \"5ff36bed-0e8c-4dd6-88f8-08572cf682f1\") " pod="kserve/llmisvc-controller-manager-7c8976c8d9-8qzsc"
Apr 22 18:59:19.888598 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.888490 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ff36bed-0e8c-4dd6-88f8-08572cf682f1-cert\") pod \"llmisvc-controller-manager-7c8976c8d9-8qzsc\" (UID: \"5ff36bed-0e8c-4dd6-88f8-08572cf682f1\") " pod="kserve/llmisvc-controller-manager-7c8976c8d9-8qzsc"
Apr 22 18:59:19.989353 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.989326 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ff36bed-0e8c-4dd6-88f8-08572cf682f1-cert\") pod \"llmisvc-controller-manager-7c8976c8d9-8qzsc\" (UID: \"5ff36bed-0e8c-4dd6-88f8-08572cf682f1\") " pod="kserve/llmisvc-controller-manager-7c8976c8d9-8qzsc"
Apr 22 18:59:19.989464 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.989422 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4mth\" (UniqueName: \"kubernetes.io/projected/5ff36bed-0e8c-4dd6-88f8-08572cf682f1-kube-api-access-h4mth\") pod \"llmisvc-controller-manager-7c8976c8d9-8qzsc\" (UID: \"5ff36bed-0e8c-4dd6-88f8-08572cf682f1\") " pod="kserve/llmisvc-controller-manager-7c8976c8d9-8qzsc"
Apr 22 18:59:19.991786 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.991768 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ff36bed-0e8c-4dd6-88f8-08572cf682f1-cert\") pod \"llmisvc-controller-manager-7c8976c8d9-8qzsc\" (UID: \"5ff36bed-0e8c-4dd6-88f8-08572cf682f1\") " pod="kserve/llmisvc-controller-manager-7c8976c8d9-8qzsc"
Apr 22 18:59:19.997846 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:19.997824 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4mth\" (UniqueName: \"kubernetes.io/projected/5ff36bed-0e8c-4dd6-88f8-08572cf682f1-kube-api-access-h4mth\") pod \"llmisvc-controller-manager-7c8976c8d9-8qzsc\" (UID: \"5ff36bed-0e8c-4dd6-88f8-08572cf682f1\") " pod="kserve/llmisvc-controller-manager-7c8976c8d9-8qzsc"
Apr 22 18:59:20.097111 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:20.097086 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-7c8976c8d9-8qzsc"
Apr 22 18:59:20.219475 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:20.219453 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-7c8976c8d9-8qzsc"]
Apr 22 18:59:20.222047 ip-10-0-129-26 kubenswrapper[2572]: W0422 18:59:20.222018 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5ff36bed_0e8c_4dd6_88f8_08572cf682f1.slice/crio-c375862987ca3184c16e55aed9850bc688776518bd85495ce546e9151a95a11f WatchSource:0}: Error finding container c375862987ca3184c16e55aed9850bc688776518bd85495ce546e9151a95a11f: Status 404 returned error can't find the container with id c375862987ca3184c16e55aed9850bc688776518bd85495ce546e9151a95a11f
Apr 22 18:59:21.094396 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:21.094364 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7c8976c8d9-8qzsc" event={"ID":"5ff36bed-0e8c-4dd6-88f8-08572cf682f1","Type":"ContainerStarted","Data":"c375862987ca3184c16e55aed9850bc688776518bd85495ce546e9151a95a11f"}
Apr 22 18:59:24.104923 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:24.104846 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7c8976c8d9-8qzsc" event={"ID":"5ff36bed-0e8c-4dd6-88f8-08572cf682f1","Type":"ContainerStarted","Data":"db1b4c8ddc9e9348b0d326d5ab8a3a947feda38a0391ab628d9311a3bb8de763"}
Apr 22 18:59:24.105288 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:24.104971 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-7c8976c8d9-8qzsc"
Apr 22 18:59:24.122002 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:24.121950 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-7c8976c8d9-8qzsc" podStartSLOduration=1.515047806 podStartE2EDuration="5.121936345s" podCreationTimestamp="2026-04-22 18:59:19 +0000 UTC" firstStartedPulling="2026-04-22 18:59:20.22326094 +0000 UTC m=+1314.827414922" lastFinishedPulling="2026-04-22 18:59:23.830149479 +0000 UTC m=+1318.434303461" observedRunningTime="2026-04-22 18:59:24.121425842 +0000 UTC m=+1318.725579849" watchObservedRunningTime="2026-04-22 18:59:24.121936345 +0000 UTC m=+1318.726090349"
Apr 22 18:59:55.109752 ip-10-0-129-26 kubenswrapper[2572]: I0422 18:59:55.109675 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-7c8976c8d9-8qzsc"
Apr 22 19:02:40.201715 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:02:40.201676 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6f6sj_2508d1bd-b2cc-4edf-8bd2-34590e71e3e0/console-operator/1.log"
Apr 22 19:02:40.204657 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:02:40.204633 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6f6sj_2508d1bd-b2cc-4edf-8bd2-34590e71e3e0/console-operator/1.log"
Apr 22 19:02:40.207570 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:02:40.207548 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/ovn-acl-logging/0.log"
Apr 22 19:02:40.210492 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:02:40.210468 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/ovn-acl-logging/0.log"
Apr 22 19:03:48.880353 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:48.880323 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"]
Apr 22 19:03:48.882779 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:48.882755 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"
Apr 22 19:03:48.887665 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:48.887645 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 19:03:48.887828 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:48.887809 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hsgld\""
Apr 22 19:03:48.887897 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:48.887881 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 22 19:03:48.888831 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:48.888812 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-qpgv6\""
Apr 22 19:03:48.889082 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:48.888870 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\""
Apr 22 19:03:48.899193 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:48.899168 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"]
Apr 22 19:03:49.034864 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.034831 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"
Apr 22 19:03:49.035017 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.034873 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"
Apr 22 19:03:49.035017 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.034893 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"
Apr 22 19:03:49.035017 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.034947 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"
Apr 22 19:03:49.035017 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.034966 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"
Apr 22 19:03:49.035017 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.034988 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv9lm\" (UniqueName: \"kubernetes.io/projected/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-kube-api-access-vv9lm\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"
Apr 22 19:03:49.136191 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.136097 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"
Apr 22 19:03:49.136191 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.136153 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"
Apr 22 19:03:49.136191 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.136177 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vv9lm\" (UniqueName: \"kubernetes.io/projected/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-kube-api-access-vv9lm\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"
Apr 22 19:03:49.136424 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.136211 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"
Apr 22 19:03:49.136424 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.136238 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"
Apr 22 19:03:49.136424 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.136253 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"
Apr 22 19:03:49.136565 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.136486 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"
Apr 22 19:03:49.136622 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.136573 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"
Apr 22 19:03:49.136622 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.136581 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"
Apr 22 19:03:49.136691 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.136632 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"
Apr 22 19:03:49.138846 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.138823 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") "
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" Apr 22 19:03:49.145183 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.145159 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv9lm\" (UniqueName: \"kubernetes.io/projected/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-kube-api-access-vv9lm\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" Apr 22 19:03:49.194178 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.194155 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" Apr 22 19:03:49.316730 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.316706 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"] Apr 22 19:03:49.319332 ip-10-0-129-26 kubenswrapper[2572]: W0422 19:03:49.319297 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd04795cc_8a8c_4535_af5e_b8ffd62e22ee.slice/crio-f55b9e79c3852e07a95f4b0dd36982152171e8473be718170d80603560341e02 WatchSource:0}: Error finding container f55b9e79c3852e07a95f4b0dd36982152171e8473be718170d80603560341e02: Status 404 returned error can't find the container with id f55b9e79c3852e07a95f4b0dd36982152171e8473be718170d80603560341e02 Apr 22 19:03:49.321336 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.321316 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:03:49.941898 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.941866 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" event={"ID":"d04795cc-8a8c-4535-af5e-b8ffd62e22ee","Type":"ContainerStarted","Data":"37e46b74f6162bb346ffde6ff4032ff6549f713d5a2125b934d3c33b27400262"} Apr 22 19:03:49.942259 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:49.941905 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" event={"ID":"d04795cc-8a8c-4535-af5e-b8ffd62e22ee","Type":"ContainerStarted","Data":"f55b9e79c3852e07a95f4b0dd36982152171e8473be718170d80603560341e02"} Apr 22 19:03:50.946727 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:50.946692 2572 generic.go:358] "Generic (PLEG): container finished" podID="d04795cc-8a8c-4535-af5e-b8ffd62e22ee" containerID="37e46b74f6162bb346ffde6ff4032ff6549f713d5a2125b934d3c33b27400262" exitCode=0 Apr 22 19:03:50.947165 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:50.946778 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" event={"ID":"d04795cc-8a8c-4535-af5e-b8ffd62e22ee","Type":"ContainerDied","Data":"37e46b74f6162bb346ffde6ff4032ff6549f713d5a2125b934d3c33b27400262"} Apr 22 19:03:51.952174 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:51.952141 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" event={"ID":"d04795cc-8a8c-4535-af5e-b8ffd62e22ee","Type":"ContainerStarted","Data":"563054de956e97c0da1fa2446b952c00870905de6112ab081b119bc203b058b8"} Apr 22 19:03:51.952174 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:51.952179 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" 
event={"ID":"d04795cc-8a8c-4535-af5e-b8ffd62e22ee","Type":"ContainerStarted","Data":"cd5f81613951af47988cf81abb6f7e7b0c3a3cd65d22dffa686380ab6984ca55"} Apr 22 19:03:51.952572 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:51.952275 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" Apr 22 19:03:51.974186 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:51.974146 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" podStartSLOduration=3.974115532 podStartE2EDuration="3.974115532s" podCreationTimestamp="2026-04-22 19:03:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:03:51.971468119 +0000 UTC m=+1586.575622122" watchObservedRunningTime="2026-04-22 19:03:51.974115532 +0000 UTC m=+1586.578269535" Apr 22 19:03:59.194775 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:59.194740 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" Apr 22 19:03:59.194775 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:59.194784 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" Apr 22 19:03:59.197158 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:59.197118 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" Apr 22 19:03:59.979156 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:03:59.979109 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" Apr 22 19:04:20.982658 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:04:20.982626 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" Apr 22 19:06:26.224346 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.224315 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv"] Apr 22 19:06:26.227773 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.227756 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:26.230267 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.230245 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 22 19:06:26.230358 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.230247 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-tfd5j\"" Apr 22 19:06:26.236439 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.236421 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv"] Apr 22 19:06:26.336462 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.336425 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:26.336632 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.336486 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:26.336632 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.336514 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8rwj\" (UniqueName: \"kubernetes.io/projected/ce64622d-c974-44a7-97a3-798b5cfb7fdc-kube-api-access-w8rwj\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:26.336632 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.336543 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:26.336632 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.336610 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv\" (UID: 
\"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:26.336632 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.336631 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:26.437093 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.437054 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:26.437093 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.437099 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8rwj\" (UniqueName: \"kubernetes.io/projected/ce64622d-c974-44a7-97a3-798b5cfb7fdc-kube-api-access-w8rwj\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:26.437423 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.437161 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv\" (UID: 
\"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:26.437423 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.437190 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:26.437423 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.437209 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:26.437423 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.437320 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:26.437632 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.437590 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:26.437691 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.437667 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:26.437734 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.437695 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:26.437783 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.437764 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:26.439889 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.439866 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:26.446158 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.446117 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8rwj\" (UniqueName: \"kubernetes.io/projected/ce64622d-c974-44a7-97a3-798b5cfb7fdc-kube-api-access-w8rwj\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:26.538283 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.538188 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:26.661451 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:26.661423 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv"] Apr 22 19:06:26.664193 ip-10-0-129-26 kubenswrapper[2572]: W0422 19:06:26.664164 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce64622d_c974_44a7_97a3_798b5cfb7fdc.slice/crio-d903c003d0864f3cfea204f13b9c8ed6dbc632d55467b63872aa29643d836c60 WatchSource:0}: Error finding container d903c003d0864f3cfea204f13b9c8ed6dbc632d55467b63872aa29643d836c60: Status 404 returned error can't find the container with id d903c003d0864f3cfea204f13b9c8ed6dbc632d55467b63872aa29643d836c60 Apr 22 19:06:27.462776 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:27.462750 2572 generic.go:358] "Generic (PLEG): container finished" podID="ce64622d-c974-44a7-97a3-798b5cfb7fdc" containerID="9ffa1b75230c123b286a433bd013738effab30b7174cd7ea69e8b3f39a6e1b0e" exitCode=0 Apr 22 19:06:27.463055 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:27.462799 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" event={"ID":"ce64622d-c974-44a7-97a3-798b5cfb7fdc","Type":"ContainerDied","Data":"9ffa1b75230c123b286a433bd013738effab30b7174cd7ea69e8b3f39a6e1b0e"} Apr 22 19:06:27.463055 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:27.462822 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" event={"ID":"ce64622d-c974-44a7-97a3-798b5cfb7fdc","Type":"ContainerStarted","Data":"d903c003d0864f3cfea204f13b9c8ed6dbc632d55467b63872aa29643d836c60"} Apr 22 19:06:28.469078 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:28.469040 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" event={"ID":"ce64622d-c974-44a7-97a3-798b5cfb7fdc","Type":"ContainerStarted","Data":"dfc778b458d7d5f54ea66e23ffd78bbb3bdcc4715db9aea1ef0e2ef2b0c782e8"} Apr 22 19:06:28.469078 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:28.469084 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" event={"ID":"ce64622d-c974-44a7-97a3-798b5cfb7fdc","Type":"ContainerStarted","Data":"a6b839f7a09e21faf043a3ff618ee1b300641cd919ea9d85b54835b5b6c00b7a"} Apr 22 19:06:28.469500 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:28.469236 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:28.489494 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:28.489431 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" podStartSLOduration=2.48941751 podStartE2EDuration="2.48941751s" podCreationTimestamp="2026-04-22 19:06:26 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:06:28.488137509 +0000 UTC m=+1743.092291531" watchObservedRunningTime="2026-04-22 19:06:28.48941751 +0000 UTC m=+1743.093571514" Apr 22 19:06:36.538374 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:36.538338 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:36.538816 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:36.538383 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:36.541183 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:36.541160 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:37.501258 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:37.501231 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:06:54.351968 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:54.351939 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"] Apr 22 19:06:54.352495 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:54.352267 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" podUID="d04795cc-8a8c-4535-af5e-b8ffd62e22ee" containerName="main" containerID="cri-o://cd5f81613951af47988cf81abb6f7e7b0c3a3cd65d22dffa686380ab6984ca55" gracePeriod=30 Apr 22 19:06:54.352495 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:54.352312 2572 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" podUID="d04795cc-8a8c-4535-af5e-b8ffd62e22ee" containerName="tokenizer" containerID="cri-o://563054de956e97c0da1fa2446b952c00870905de6112ab081b119bc203b058b8" gracePeriod=30 Apr 22 19:06:54.557355 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:54.557319 2572 generic.go:358] "Generic (PLEG): container finished" podID="d04795cc-8a8c-4535-af5e-b8ffd62e22ee" containerID="cd5f81613951af47988cf81abb6f7e7b0c3a3cd65d22dffa686380ab6984ca55" exitCode=0 Apr 22 19:06:54.557509 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:54.557384 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" event={"ID":"d04795cc-8a8c-4535-af5e-b8ffd62e22ee","Type":"ContainerDied","Data":"cd5f81613951af47988cf81abb6f7e7b0c3a3cd65d22dffa686380ab6984ca55"} Apr 22 19:06:55.507177 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.507152 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" Apr 22 19:06:55.559111 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.559040 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv9lm\" (UniqueName: \"kubernetes.io/projected/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-kube-api-access-vv9lm\") pod \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " Apr 22 19:06:55.559111 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.559080 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tokenizer-cache\") pod \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " Apr 22 19:06:55.559307 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.559117 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tls-certs\") pod \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " Apr 22 19:06:55.559307 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.559298 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-kserve-provision-location\") pod \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " Apr 22 19:06:55.559410 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.559328 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tokenizer-uds\") pod \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " 
Apr 22 19:06:55.559410 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.559364 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tokenizer-tmp\") pod \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\" (UID: \"d04795cc-8a8c-4535-af5e-b8ffd62e22ee\") " Apr 22 19:06:55.559410 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.559376 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "d04795cc-8a8c-4535-af5e-b8ffd62e22ee" (UID: "d04795cc-8a8c-4535-af5e-b8ffd62e22ee"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:06:55.559565 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.559546 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "d04795cc-8a8c-4535-af5e-b8ffd62e22ee" (UID: "d04795cc-8a8c-4535-af5e-b8ffd62e22ee"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:06:55.559771 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.559729 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tokenizer-cache\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 19:06:55.559771 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.559754 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tokenizer-uds\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 19:06:55.559904 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.559825 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "d04795cc-8a8c-4535-af5e-b8ffd62e22ee" (UID: "d04795cc-8a8c-4535-af5e-b8ffd62e22ee"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:06:55.560374 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.560346 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d04795cc-8a8c-4535-af5e-b8ffd62e22ee" (UID: "d04795cc-8a8c-4535-af5e-b8ffd62e22ee"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:06:55.561472 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.561442 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-kube-api-access-vv9lm" (OuterVolumeSpecName: "kube-api-access-vv9lm") pod "d04795cc-8a8c-4535-af5e-b8ffd62e22ee" (UID: "d04795cc-8a8c-4535-af5e-b8ffd62e22ee"). InnerVolumeSpecName "kube-api-access-vv9lm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:06:55.561472 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.561451 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d04795cc-8a8c-4535-af5e-b8ffd62e22ee" (UID: "d04795cc-8a8c-4535-af5e-b8ffd62e22ee"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:06:55.562771 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.562750 2572 generic.go:358] "Generic (PLEG): container finished" podID="d04795cc-8a8c-4535-af5e-b8ffd62e22ee" containerID="563054de956e97c0da1fa2446b952c00870905de6112ab081b119bc203b058b8" exitCode=0 Apr 22 19:06:55.562853 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.562779 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" event={"ID":"d04795cc-8a8c-4535-af5e-b8ffd62e22ee","Type":"ContainerDied","Data":"563054de956e97c0da1fa2446b952c00870905de6112ab081b119bc203b058b8"} Apr 22 19:06:55.562853 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.562808 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" event={"ID":"d04795cc-8a8c-4535-af5e-b8ffd62e22ee","Type":"ContainerDied","Data":"f55b9e79c3852e07a95f4b0dd36982152171e8473be718170d80603560341e02"} 
Apr 22 19:06:55.562853 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.562824 2572 scope.go:117] "RemoveContainer" containerID="563054de956e97c0da1fa2446b952c00870905de6112ab081b119bc203b058b8" Apr 22 19:06:55.562988 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.562864 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8" Apr 22 19:06:55.579841 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.579826 2572 scope.go:117] "RemoveContainer" containerID="cd5f81613951af47988cf81abb6f7e7b0c3a3cd65d22dffa686380ab6984ca55" Apr 22 19:06:55.586971 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.586956 2572 scope.go:117] "RemoveContainer" containerID="37e46b74f6162bb346ffde6ff4032ff6549f713d5a2125b934d3c33b27400262" Apr 22 19:06:55.591202 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.591184 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"] Apr 22 19:06:55.594820 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.594803 2572 scope.go:117] "RemoveContainer" containerID="563054de956e97c0da1fa2446b952c00870905de6112ab081b119bc203b058b8" Apr 22 19:06:55.595082 ip-10-0-129-26 kubenswrapper[2572]: E0422 19:06:55.595063 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"563054de956e97c0da1fa2446b952c00870905de6112ab081b119bc203b058b8\": container with ID starting with 563054de956e97c0da1fa2446b952c00870905de6112ab081b119bc203b058b8 not found: ID does not exist" containerID="563054de956e97c0da1fa2446b952c00870905de6112ab081b119bc203b058b8" Apr 22 19:06:55.595180 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.595090 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"563054de956e97c0da1fa2446b952c00870905de6112ab081b119bc203b058b8"} 
err="failed to get container status \"563054de956e97c0da1fa2446b952c00870905de6112ab081b119bc203b058b8\": rpc error: code = NotFound desc = could not find container \"563054de956e97c0da1fa2446b952c00870905de6112ab081b119bc203b058b8\": container with ID starting with 563054de956e97c0da1fa2446b952c00870905de6112ab081b119bc203b058b8 not found: ID does not exist" Apr 22 19:06:55.595180 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.595108 2572 scope.go:117] "RemoveContainer" containerID="cd5f81613951af47988cf81abb6f7e7b0c3a3cd65d22dffa686380ab6984ca55" Apr 22 19:06:55.595373 ip-10-0-129-26 kubenswrapper[2572]: E0422 19:06:55.595349 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd5f81613951af47988cf81abb6f7e7b0c3a3cd65d22dffa686380ab6984ca55\": container with ID starting with cd5f81613951af47988cf81abb6f7e7b0c3a3cd65d22dffa686380ab6984ca55 not found: ID does not exist" containerID="cd5f81613951af47988cf81abb6f7e7b0c3a3cd65d22dffa686380ab6984ca55" Apr 22 19:06:55.595421 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.595394 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schevgxx8"] Apr 22 19:06:55.595421 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.595383 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd5f81613951af47988cf81abb6f7e7b0c3a3cd65d22dffa686380ab6984ca55"} err="failed to get container status \"cd5f81613951af47988cf81abb6f7e7b0c3a3cd65d22dffa686380ab6984ca55\": rpc error: code = NotFound desc = could not find container \"cd5f81613951af47988cf81abb6f7e7b0c3a3cd65d22dffa686380ab6984ca55\": container with ID starting with cd5f81613951af47988cf81abb6f7e7b0c3a3cd65d22dffa686380ab6984ca55 not found: ID does not exist" Apr 22 19:06:55.595499 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.595440 2572 scope.go:117] "RemoveContainer" 
containerID="37e46b74f6162bb346ffde6ff4032ff6549f713d5a2125b934d3c33b27400262" Apr 22 19:06:55.595661 ip-10-0-129-26 kubenswrapper[2572]: E0422 19:06:55.595641 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37e46b74f6162bb346ffde6ff4032ff6549f713d5a2125b934d3c33b27400262\": container with ID starting with 37e46b74f6162bb346ffde6ff4032ff6549f713d5a2125b934d3c33b27400262 not found: ID does not exist" containerID="37e46b74f6162bb346ffde6ff4032ff6549f713d5a2125b934d3c33b27400262" Apr 22 19:06:55.595745 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.595665 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e46b74f6162bb346ffde6ff4032ff6549f713d5a2125b934d3c33b27400262"} err="failed to get container status \"37e46b74f6162bb346ffde6ff4032ff6549f713d5a2125b934d3c33b27400262\": rpc error: code = NotFound desc = could not find container \"37e46b74f6162bb346ffde6ff4032ff6549f713d5a2125b934d3c33b27400262\": container with ID starting with 37e46b74f6162bb346ffde6ff4032ff6549f713d5a2125b934d3c33b27400262 not found: ID does not exist" Apr 22 19:06:55.660780 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.660755 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vv9lm\" (UniqueName: \"kubernetes.io/projected/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-kube-api-access-vv9lm\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 19:06:55.660780 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.660779 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tls-certs\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 19:06:55.660904 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.660790 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-kserve-provision-location\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 19:06:55.660904 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.660799 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d04795cc-8a8c-4535-af5e-b8ffd62e22ee-tokenizer-tmp\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 19:06:55.869179 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:55.869154 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d04795cc-8a8c-4535-af5e-b8ffd62e22ee" path="/var/lib/kubelet/pods/d04795cc-8a8c-4535-af5e-b8ffd62e22ee/volumes" Apr 22 19:06:58.504926 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:06:58.504899 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:07:22.450390 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.450359 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz"] Apr 22 19:07:22.450884 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.450804 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d04795cc-8a8c-4535-af5e-b8ffd62e22ee" containerName="tokenizer" Apr 22 19:07:22.450884 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.450824 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04795cc-8a8c-4535-af5e-b8ffd62e22ee" containerName="tokenizer" Apr 22 19:07:22.450884 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.450838 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d04795cc-8a8c-4535-af5e-b8ffd62e22ee" containerName="storage-initializer" Apr 22 19:07:22.450884 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.450846 2572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d04795cc-8a8c-4535-af5e-b8ffd62e22ee" containerName="storage-initializer" Apr 22 19:07:22.450884 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.450866 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d04795cc-8a8c-4535-af5e-b8ffd62e22ee" containerName="main" Apr 22 19:07:22.450884 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.450874 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04795cc-8a8c-4535-af5e-b8ffd62e22ee" containerName="main" Apr 22 19:07:22.451198 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.450961 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d04795cc-8a8c-4535-af5e-b8ffd62e22ee" containerName="tokenizer" Apr 22 19:07:22.451198 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.450975 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d04795cc-8a8c-4535-af5e-b8ffd62e22ee" containerName="main" Apr 22 19:07:22.454308 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.454290 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:22.456658 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.456635 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 22 19:07:22.456760 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.456635 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-5j9jq\"" Apr 22 19:07:22.464593 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.464537 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz"] Apr 22 19:07:22.562083 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.562054 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:22.562237 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.562095 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c532a52-bfce-4895-b02c-c544cc4df06d-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:22.562237 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.562151 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:22.562312 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.562235 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7gqd\" (UniqueName: \"kubernetes.io/projected/2c532a52-bfce-4895-b02c-c544cc4df06d-kube-api-access-f7gqd\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:22.562312 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.562279 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:22.562312 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.562295 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:22.663539 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.663511 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f7gqd\" (UniqueName: \"kubernetes.io/projected/2c532a52-bfce-4895-b02c-c544cc4df06d-kube-api-access-f7gqd\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:22.663673 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.663553 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:22.663673 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.663571 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:22.663673 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.663597 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:22.663673 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.663631 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c532a52-bfce-4895-b02c-c544cc4df06d-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:22.663673 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.663655 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:22.663973 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.663952 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:22.664037 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.663985 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:22.664037 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.664019 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:22.664119 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.664098 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:22.666417 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.666399 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c532a52-bfce-4895-b02c-c544cc4df06d-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:22.672721 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.672700 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7gqd\" (UniqueName: \"kubernetes.io/projected/2c532a52-bfce-4895-b02c-c544cc4df06d-kube-api-access-f7gqd\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:22.766764 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.766699 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:22.891541 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:22.891352 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz"] Apr 22 19:07:22.893879 ip-10-0-129-26 kubenswrapper[2572]: W0422 19:07:22.893849 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c532a52_bfce_4895_b02c_c544cc4df06d.slice/crio-44cb723fb2fe0979eb388487e5b985709a2247d843ac274b04f4b3f74e51ad6b WatchSource:0}: Error finding container 44cb723fb2fe0979eb388487e5b985709a2247d843ac274b04f4b3f74e51ad6b: Status 404 returned error can't find the container with id 44cb723fb2fe0979eb388487e5b985709a2247d843ac274b04f4b3f74e51ad6b Apr 22 19:07:23.654809 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:23.654771 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" event={"ID":"2c532a52-bfce-4895-b02c-c544cc4df06d","Type":"ContainerStarted","Data":"2fb3a2f8cd03978cdcfb54d1dd04f3306d8fc9255b103ca465a3f3714f855af5"} Apr 22 19:07:23.655298 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:23.654816 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" event={"ID":"2c532a52-bfce-4895-b02c-c544cc4df06d","Type":"ContainerStarted","Data":"44cb723fb2fe0979eb388487e5b985709a2247d843ac274b04f4b3f74e51ad6b"} Apr 22 19:07:24.658896 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:24.658856 2572 generic.go:358] "Generic (PLEG): container finished" podID="2c532a52-bfce-4895-b02c-c544cc4df06d" containerID="2fb3a2f8cd03978cdcfb54d1dd04f3306d8fc9255b103ca465a3f3714f855af5" exitCode=0 Apr 22 19:07:24.659304 ip-10-0-129-26 kubenswrapper[2572]: I0422 
19:07:24.658938 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" event={"ID":"2c532a52-bfce-4895-b02c-c544cc4df06d","Type":"ContainerDied","Data":"2fb3a2f8cd03978cdcfb54d1dd04f3306d8fc9255b103ca465a3f3714f855af5"} Apr 22 19:07:25.664797 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:25.664764 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" event={"ID":"2c532a52-bfce-4895-b02c-c544cc4df06d","Type":"ContainerStarted","Data":"e866b922223f0fa1e8357845f11ca4f02569d39147d60b0ebe75d29593b278ef"} Apr 22 19:07:25.664797 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:25.664802 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" event={"ID":"2c532a52-bfce-4895-b02c-c544cc4df06d","Type":"ContainerStarted","Data":"0c4df2ad353ea5369ee3749b5820cbb84b2f0e9010f5fb1f53e5193ff19bc955"} Apr 22 19:07:25.665274 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:25.664912 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:25.688070 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:25.688017 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" podStartSLOduration=3.687998397 podStartE2EDuration="3.687998397s" podCreationTimestamp="2026-04-22 19:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:07:25.685717654 +0000 UTC m=+1800.289871663" watchObservedRunningTime="2026-04-22 19:07:25.687998397 +0000 UTC m=+1800.292152402" Apr 22 19:07:32.766848 ip-10-0-129-26 
kubenswrapper[2572]: I0422 19:07:32.766805 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:32.766848 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:32.766847 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:32.769538 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:32.769514 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:33.694185 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:33.694157 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:40.225434 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:40.225411 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6f6sj_2508d1bd-b2cc-4edf-8bd2-34590e71e3e0/console-operator/1.log" Apr 22 19:07:40.229883 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:40.229862 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6f6sj_2508d1bd-b2cc-4edf-8bd2-34590e71e3e0/console-operator/1.log" Apr 22 19:07:40.230023 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:40.229901 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/ovn-acl-logging/0.log" Apr 22 19:07:40.234317 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:40.234300 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/ovn-acl-logging/0.log" Apr 22 19:07:54.697841 
ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:54.697810 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:55.579902 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:55.579855 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz"] Apr 22 19:07:55.580406 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:55.580286 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" podUID="2c532a52-bfce-4895-b02c-c544cc4df06d" containerName="main" containerID="cri-o://0c4df2ad353ea5369ee3749b5820cbb84b2f0e9010f5fb1f53e5193ff19bc955" gracePeriod=30 Apr 22 19:07:55.580664 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:55.580378 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" podUID="2c532a52-bfce-4895-b02c-c544cc4df06d" containerName="tokenizer" containerID="cri-o://e866b922223f0fa1e8357845f11ca4f02569d39147d60b0ebe75d29593b278ef" gracePeriod=30 Apr 22 19:07:55.771045 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:55.771012 2572 generic.go:358] "Generic (PLEG): container finished" podID="2c532a52-bfce-4895-b02c-c544cc4df06d" containerID="0c4df2ad353ea5369ee3749b5820cbb84b2f0e9010f5fb1f53e5193ff19bc955" exitCode=0 Apr 22 19:07:55.771421 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:55.771091 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" event={"ID":"2c532a52-bfce-4895-b02c-c544cc4df06d","Type":"ContainerDied","Data":"0c4df2ad353ea5369ee3749b5820cbb84b2f0e9010f5fb1f53e5193ff19bc955"} Apr 22 19:07:56.724195 ip-10-0-129-26 kubenswrapper[2572]: I0422 
19:07:56.724173 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:56.775741 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.775709 2572 generic.go:358] "Generic (PLEG): container finished" podID="2c532a52-bfce-4895-b02c-c544cc4df06d" containerID="e866b922223f0fa1e8357845f11ca4f02569d39147d60b0ebe75d29593b278ef" exitCode=0 Apr 22 19:07:56.776062 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.775764 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" event={"ID":"2c532a52-bfce-4895-b02c-c544cc4df06d","Type":"ContainerDied","Data":"e866b922223f0fa1e8357845f11ca4f02569d39147d60b0ebe75d29593b278ef"} Apr 22 19:07:56.776062 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.775785 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" Apr 22 19:07:56.776062 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.775790 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz" event={"ID":"2c532a52-bfce-4895-b02c-c544cc4df06d","Type":"ContainerDied","Data":"44cb723fb2fe0979eb388487e5b985709a2247d843ac274b04f4b3f74e51ad6b"} Apr 22 19:07:56.776062 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.775810 2572 scope.go:117] "RemoveContainer" containerID="e866b922223f0fa1e8357845f11ca4f02569d39147d60b0ebe75d29593b278ef" Apr 22 19:07:56.783612 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.783597 2572 scope.go:117] "RemoveContainer" containerID="0c4df2ad353ea5369ee3749b5820cbb84b2f0e9010f5fb1f53e5193ff19bc955" Apr 22 19:07:56.790692 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.790675 2572 scope.go:117] "RemoveContainer" 
containerID="2fb3a2f8cd03978cdcfb54d1dd04f3306d8fc9255b103ca465a3f3714f855af5" Apr 22 19:07:56.797621 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.797579 2572 scope.go:117] "RemoveContainer" containerID="e866b922223f0fa1e8357845f11ca4f02569d39147d60b0ebe75d29593b278ef" Apr 22 19:07:56.797838 ip-10-0-129-26 kubenswrapper[2572]: E0422 19:07:56.797821 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e866b922223f0fa1e8357845f11ca4f02569d39147d60b0ebe75d29593b278ef\": container with ID starting with e866b922223f0fa1e8357845f11ca4f02569d39147d60b0ebe75d29593b278ef not found: ID does not exist" containerID="e866b922223f0fa1e8357845f11ca4f02569d39147d60b0ebe75d29593b278ef" Apr 22 19:07:56.797887 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.797846 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e866b922223f0fa1e8357845f11ca4f02569d39147d60b0ebe75d29593b278ef"} err="failed to get container status \"e866b922223f0fa1e8357845f11ca4f02569d39147d60b0ebe75d29593b278ef\": rpc error: code = NotFound desc = could not find container \"e866b922223f0fa1e8357845f11ca4f02569d39147d60b0ebe75d29593b278ef\": container with ID starting with e866b922223f0fa1e8357845f11ca4f02569d39147d60b0ebe75d29593b278ef not found: ID does not exist" Apr 22 19:07:56.797887 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.797862 2572 scope.go:117] "RemoveContainer" containerID="0c4df2ad353ea5369ee3749b5820cbb84b2f0e9010f5fb1f53e5193ff19bc955" Apr 22 19:07:56.798038 ip-10-0-129-26 kubenswrapper[2572]: E0422 19:07:56.798024 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c4df2ad353ea5369ee3749b5820cbb84b2f0e9010f5fb1f53e5193ff19bc955\": container with ID starting with 0c4df2ad353ea5369ee3749b5820cbb84b2f0e9010f5fb1f53e5193ff19bc955 not found: ID does not exist" 
containerID="0c4df2ad353ea5369ee3749b5820cbb84b2f0e9010f5fb1f53e5193ff19bc955" Apr 22 19:07:56.798078 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.798043 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c4df2ad353ea5369ee3749b5820cbb84b2f0e9010f5fb1f53e5193ff19bc955"} err="failed to get container status \"0c4df2ad353ea5369ee3749b5820cbb84b2f0e9010f5fb1f53e5193ff19bc955\": rpc error: code = NotFound desc = could not find container \"0c4df2ad353ea5369ee3749b5820cbb84b2f0e9010f5fb1f53e5193ff19bc955\": container with ID starting with 0c4df2ad353ea5369ee3749b5820cbb84b2f0e9010f5fb1f53e5193ff19bc955 not found: ID does not exist" Apr 22 19:07:56.798078 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.798056 2572 scope.go:117] "RemoveContainer" containerID="2fb3a2f8cd03978cdcfb54d1dd04f3306d8fc9255b103ca465a3f3714f855af5" Apr 22 19:07:56.798284 ip-10-0-129-26 kubenswrapper[2572]: E0422 19:07:56.798264 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fb3a2f8cd03978cdcfb54d1dd04f3306d8fc9255b103ca465a3f3714f855af5\": container with ID starting with 2fb3a2f8cd03978cdcfb54d1dd04f3306d8fc9255b103ca465a3f3714f855af5 not found: ID does not exist" containerID="2fb3a2f8cd03978cdcfb54d1dd04f3306d8fc9255b103ca465a3f3714f855af5" Apr 22 19:07:56.798344 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.798293 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fb3a2f8cd03978cdcfb54d1dd04f3306d8fc9255b103ca465a3f3714f855af5"} err="failed to get container status \"2fb3a2f8cd03978cdcfb54d1dd04f3306d8fc9255b103ca465a3f3714f855af5\": rpc error: code = NotFound desc = could not find container \"2fb3a2f8cd03978cdcfb54d1dd04f3306d8fc9255b103ca465a3f3714f855af5\": container with ID starting with 2fb3a2f8cd03978cdcfb54d1dd04f3306d8fc9255b103ca465a3f3714f855af5 not found: ID does not exist" Apr 22 
19:07:56.833759 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.833736 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c532a52-bfce-4895-b02c-c544cc4df06d-tls-certs\") pod \"2c532a52-bfce-4895-b02c-c544cc4df06d\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " Apr 22 19:07:56.833852 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.833806 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-tokenizer-tmp\") pod \"2c532a52-bfce-4895-b02c-c544cc4df06d\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " Apr 22 19:07:56.833899 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.833852 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-tokenizer-uds\") pod \"2c532a52-bfce-4895-b02c-c544cc4df06d\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " Apr 22 19:07:56.833940 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.833898 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7gqd\" (UniqueName: \"kubernetes.io/projected/2c532a52-bfce-4895-b02c-c544cc4df06d-kube-api-access-f7gqd\") pod \"2c532a52-bfce-4895-b02c-c544cc4df06d\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " Apr 22 19:07:56.833993 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.833950 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-tokenizer-cache\") pod \"2c532a52-bfce-4895-b02c-c544cc4df06d\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " Apr 22 19:07:56.833993 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.833977 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-kserve-provision-location\") pod \"2c532a52-bfce-4895-b02c-c544cc4df06d\" (UID: \"2c532a52-bfce-4895-b02c-c544cc4df06d\") " Apr 22 19:07:56.834248 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.834217 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "2c532a52-bfce-4895-b02c-c544cc4df06d" (UID: "2c532a52-bfce-4895-b02c-c544cc4df06d"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:07:56.834248 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.834231 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "2c532a52-bfce-4895-b02c-c544cc4df06d" (UID: "2c532a52-bfce-4895-b02c-c544cc4df06d"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:07:56.834384 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.834252 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "2c532a52-bfce-4895-b02c-c544cc4df06d" (UID: "2c532a52-bfce-4895-b02c-c544cc4df06d"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:07:56.835677 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.835645 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2c532a52-bfce-4895-b02c-c544cc4df06d" (UID: "2c532a52-bfce-4895-b02c-c544cc4df06d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:07:56.835869 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.835852 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c532a52-bfce-4895-b02c-c544cc4df06d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2c532a52-bfce-4895-b02c-c544cc4df06d" (UID: "2c532a52-bfce-4895-b02c-c544cc4df06d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:07:56.836301 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.836282 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c532a52-bfce-4895-b02c-c544cc4df06d-kube-api-access-f7gqd" (OuterVolumeSpecName: "kube-api-access-f7gqd") pod "2c532a52-bfce-4895-b02c-c544cc4df06d" (UID: "2c532a52-bfce-4895-b02c-c544cc4df06d"). InnerVolumeSpecName "kube-api-access-f7gqd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:07:56.934776 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.934753 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-tokenizer-tmp\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 19:07:56.934776 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.934775 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-tokenizer-uds\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 19:07:56.934937 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.934784 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f7gqd\" (UniqueName: \"kubernetes.io/projected/2c532a52-bfce-4895-b02c-c544cc4df06d-kube-api-access-f7gqd\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 19:07:56.934937 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.934793 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-tokenizer-cache\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 19:07:56.934937 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.934802 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c532a52-bfce-4895-b02c-c544cc4df06d-kserve-provision-location\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 19:07:56.934937 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:56.934810 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c532a52-bfce-4895-b02c-c544cc4df06d-tls-certs\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 19:07:57.101236 ip-10-0-129-26 kubenswrapper[2572]: 
I0422 19:07:57.101209 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz"] Apr 22 19:07:57.105371 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:57.105350 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7d585ggbjz"] Apr 22 19:07:57.869421 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:07:57.869388 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c532a52-bfce-4895-b02c-c544cc4df06d" path="/var/lib/kubelet/pods/2c532a52-bfce-4895-b02c-c544cc4df06d/volumes" Apr 22 19:08:57.623091 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:57.623007 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv"] Apr 22 19:08:57.623592 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:57.623426 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" podUID="ce64622d-c974-44a7-97a3-798b5cfb7fdc" containerName="main" containerID="cri-o://a6b839f7a09e21faf043a3ff618ee1b300641cd919ea9d85b54835b5b6c00b7a" gracePeriod=30 Apr 22 19:08:57.623592 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:57.623481 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" podUID="ce64622d-c974-44a7-97a3-798b5cfb7fdc" containerName="tokenizer" containerID="cri-o://dfc778b458d7d5f54ea66e23ffd78bbb3bdcc4715db9aea1ef0e2ef2b0c782e8" gracePeriod=30 Apr 22 19:08:57.979592 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:57.979506 2572 generic.go:358] "Generic (PLEG): container finished" podID="ce64622d-c974-44a7-97a3-798b5cfb7fdc" containerID="a6b839f7a09e21faf043a3ff618ee1b300641cd919ea9d85b54835b5b6c00b7a" exitCode=0 Apr 
22 19:08:57.979732 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:57.979581 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" event={"ID":"ce64622d-c974-44a7-97a3-798b5cfb7fdc","Type":"ContainerDied","Data":"a6b839f7a09e21faf043a3ff618ee1b300641cd919ea9d85b54835b5b6c00b7a"} Apr 22 19:08:58.505498 ip-10-0-129-26 kubenswrapper[2572]: W0422 19:08:58.505455 2572 logging.go:55] [core] [Channel #372 SubChannel #373]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.32:9003", ServerName: "10.132.0.32:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.32:9003: connect: connection refused" Apr 22 19:08:58.764659 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.764605 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:08:58.807570 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.807547 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tokenizer-cache\") pod \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " Apr 22 19:08:58.807719 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.807582 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-kserve-provision-location\") pod \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " Apr 22 19:08:58.807719 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.807615 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tokenizer-tmp\") pod \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " Apr 22 19:08:58.807719 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.807635 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tls-certs\") pod \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " Apr 22 19:08:58.807719 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.807662 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8rwj\" (UniqueName: \"kubernetes.io/projected/ce64622d-c974-44a7-97a3-798b5cfb7fdc-kube-api-access-w8rwj\") pod \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " Apr 22 19:08:58.807719 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.807696 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tokenizer-uds\") pod \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\" (UID: \"ce64622d-c974-44a7-97a3-798b5cfb7fdc\") " Apr 22 19:08:58.807983 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.807848 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "ce64622d-c974-44a7-97a3-798b5cfb7fdc" (UID: "ce64622d-c974-44a7-97a3-798b5cfb7fdc"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:08:58.807983 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.807948 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tokenizer-cache\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 19:08:58.808087 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.808051 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "ce64622d-c974-44a7-97a3-798b5cfb7fdc" (UID: "ce64622d-c974-44a7-97a3-798b5cfb7fdc"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:08:58.808087 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.808052 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "ce64622d-c974-44a7-97a3-798b5cfb7fdc" (UID: "ce64622d-c974-44a7-97a3-798b5cfb7fdc"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:08:58.808774 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.808743 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ce64622d-c974-44a7-97a3-798b5cfb7fdc" (UID: "ce64622d-c974-44a7-97a3-798b5cfb7fdc"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:08:58.809813 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.809793 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ce64622d-c974-44a7-97a3-798b5cfb7fdc" (UID: "ce64622d-c974-44a7-97a3-798b5cfb7fdc"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:08:58.809887 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.809866 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce64622d-c974-44a7-97a3-798b5cfb7fdc-kube-api-access-w8rwj" (OuterVolumeSpecName: "kube-api-access-w8rwj") pod "ce64622d-c974-44a7-97a3-798b5cfb7fdc" (UID: "ce64622d-c974-44a7-97a3-798b5cfb7fdc"). InnerVolumeSpecName "kube-api-access-w8rwj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:08:58.909242 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.909219 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-kserve-provision-location\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 19:08:58.909242 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.909241 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tokenizer-tmp\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 19:08:58.909376 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.909251 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tls-certs\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 19:08:58.909376 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.909259 2572 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w8rwj\" (UniqueName: \"kubernetes.io/projected/ce64622d-c974-44a7-97a3-798b5cfb7fdc-kube-api-access-w8rwj\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 19:08:58.909376 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.909268 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ce64622d-c974-44a7-97a3-798b5cfb7fdc-tokenizer-uds\") on node \"ip-10-0-129-26.ec2.internal\" DevicePath \"\"" Apr 22 19:08:58.988099 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.988072 2572 generic.go:358] "Generic (PLEG): container finished" podID="ce64622d-c974-44a7-97a3-798b5cfb7fdc" containerID="dfc778b458d7d5f54ea66e23ffd78bbb3bdcc4715db9aea1ef0e2ef2b0c782e8" exitCode=0 Apr 22 19:08:58.988263 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.988138 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" event={"ID":"ce64622d-c974-44a7-97a3-798b5cfb7fdc","Type":"ContainerDied","Data":"dfc778b458d7d5f54ea66e23ffd78bbb3bdcc4715db9aea1ef0e2ef2b0c782e8"} Apr 22 19:08:58.988263 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.988166 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" Apr 22 19:08:58.988263 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.988174 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" event={"ID":"ce64622d-c974-44a7-97a3-798b5cfb7fdc","Type":"ContainerDied","Data":"d903c003d0864f3cfea204f13b9c8ed6dbc632d55467b63872aa29643d836c60"} Apr 22 19:08:58.988263 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.988190 2572 scope.go:117] "RemoveContainer" containerID="dfc778b458d7d5f54ea66e23ffd78bbb3bdcc4715db9aea1ef0e2ef2b0c782e8" Apr 22 19:08:58.997664 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:58.997646 2572 scope.go:117] "RemoveContainer" containerID="a6b839f7a09e21faf043a3ff618ee1b300641cd919ea9d85b54835b5b6c00b7a" Apr 22 19:08:59.004951 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:59.004936 2572 scope.go:117] "RemoveContainer" containerID="9ffa1b75230c123b286a433bd013738effab30b7174cd7ea69e8b3f39a6e1b0e" Apr 22 19:08:59.012653 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:59.012632 2572 scope.go:117] "RemoveContainer" containerID="dfc778b458d7d5f54ea66e23ffd78bbb3bdcc4715db9aea1ef0e2ef2b0c782e8" Apr 22 19:08:59.012905 ip-10-0-129-26 kubenswrapper[2572]: E0422 19:08:59.012886 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfc778b458d7d5f54ea66e23ffd78bbb3bdcc4715db9aea1ef0e2ef2b0c782e8\": container with ID starting with dfc778b458d7d5f54ea66e23ffd78bbb3bdcc4715db9aea1ef0e2ef2b0c782e8 not found: ID does not exist" containerID="dfc778b458d7d5f54ea66e23ffd78bbb3bdcc4715db9aea1ef0e2ef2b0c782e8" Apr 22 19:08:59.012979 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:59.012912 2572 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dfc778b458d7d5f54ea66e23ffd78bbb3bdcc4715db9aea1ef0e2ef2b0c782e8"} err="failed to get container status \"dfc778b458d7d5f54ea66e23ffd78bbb3bdcc4715db9aea1ef0e2ef2b0c782e8\": rpc error: code = NotFound desc = could not find container \"dfc778b458d7d5f54ea66e23ffd78bbb3bdcc4715db9aea1ef0e2ef2b0c782e8\": container with ID starting with dfc778b458d7d5f54ea66e23ffd78bbb3bdcc4715db9aea1ef0e2ef2b0c782e8 not found: ID does not exist" Apr 22 19:08:59.012979 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:59.012929 2572 scope.go:117] "RemoveContainer" containerID="a6b839f7a09e21faf043a3ff618ee1b300641cd919ea9d85b54835b5b6c00b7a" Apr 22 19:08:59.013165 ip-10-0-129-26 kubenswrapper[2572]: E0422 19:08:59.013117 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6b839f7a09e21faf043a3ff618ee1b300641cd919ea9d85b54835b5b6c00b7a\": container with ID starting with a6b839f7a09e21faf043a3ff618ee1b300641cd919ea9d85b54835b5b6c00b7a not found: ID does not exist" containerID="a6b839f7a09e21faf043a3ff618ee1b300641cd919ea9d85b54835b5b6c00b7a" Apr 22 19:08:59.013259 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:59.013170 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6b839f7a09e21faf043a3ff618ee1b300641cd919ea9d85b54835b5b6c00b7a"} err="failed to get container status \"a6b839f7a09e21faf043a3ff618ee1b300641cd919ea9d85b54835b5b6c00b7a\": rpc error: code = NotFound desc = could not find container \"a6b839f7a09e21faf043a3ff618ee1b300641cd919ea9d85b54835b5b6c00b7a\": container with ID starting with a6b839f7a09e21faf043a3ff618ee1b300641cd919ea9d85b54835b5b6c00b7a not found: ID does not exist" Apr 22 19:08:59.013259 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:59.013185 2572 scope.go:117] "RemoveContainer" containerID="9ffa1b75230c123b286a433bd013738effab30b7174cd7ea69e8b3f39a6e1b0e" Apr 22 19:08:59.013503 ip-10-0-129-26 
kubenswrapper[2572]: E0422 19:08:59.013439 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ffa1b75230c123b286a433bd013738effab30b7174cd7ea69e8b3f39a6e1b0e\": container with ID starting with 9ffa1b75230c123b286a433bd013738effab30b7174cd7ea69e8b3f39a6e1b0e not found: ID does not exist" containerID="9ffa1b75230c123b286a433bd013738effab30b7174cd7ea69e8b3f39a6e1b0e" Apr 22 19:08:59.013503 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:59.013473 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ffa1b75230c123b286a433bd013738effab30b7174cd7ea69e8b3f39a6e1b0e"} err="failed to get container status \"9ffa1b75230c123b286a433bd013738effab30b7174cd7ea69e8b3f39a6e1b0e\": rpc error: code = NotFound desc = could not find container \"9ffa1b75230c123b286a433bd013738effab30b7174cd7ea69e8b3f39a6e1b0e\": container with ID starting with 9ffa1b75230c123b286a433bd013738effab30b7174cd7ea69e8b3f39a6e1b0e not found: ID does not exist" Apr 22 19:08:59.015636 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:59.015591 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv"] Apr 22 19:08:59.024545 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:59.024517 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv"] Apr 22 19:08:59.505417 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:08:59.505375 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5b98c79789fhlv" podUID="ce64622d-c974-44a7-97a3-798b5cfb7fdc" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.32:9003\" within 1s: context deadline exceeded" Apr 22 19:08:59.868532 ip-10-0-129-26 kubenswrapper[2572]: I0422 
19:08:59.868500 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce64622d-c974-44a7-97a3-798b5cfb7fdc" path="/var/lib/kubelet/pods/ce64622d-c974-44a7-97a3-798b5cfb7fdc/volumes" Apr 22 19:09:30.252161 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:30.252102 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-wwkv9_63cc8843-8d69-4cab-87bc-be1697b84e55/manager/0.log" Apr 22 19:09:30.313316 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:30.313298 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-6dzjk_e4b0f1ed-3a1b-48c7-8d62-dc137f3dcdb3/manager/0.log" Apr 22 19:09:30.322158 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:30.322138 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-qmgx9_6c3bc178-8c72-49a8-bffd-ae035016673f/limitador/0.log" Apr 22 19:09:36.165151 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:36.165097 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-k99dx_af564c1a-d59c-4ed2-ae28-bef2ffad4877/global-pull-secret-syncer/0.log" Apr 22 19:09:36.265093 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:36.265064 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vtzpc_892816cf-4bb8-447c-8d94-74a892f6bd4c/konnectivity-agent/0.log" Apr 22 19:09:36.287929 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:36.287902 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-26.ec2.internal_172d194e4a77f6eb5607ce1c62ccac15/haproxy/0.log" Apr 22 19:09:40.454166 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:40.454138 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-wwkv9_63cc8843-8d69-4cab-87bc-be1697b84e55/manager/0.log" Apr 22 19:09:40.538074 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:40.538049 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-6dzjk_e4b0f1ed-3a1b-48c7-8d62-dc137f3dcdb3/manager/0.log" Apr 22 19:09:40.560852 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:40.560809 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-qmgx9_6c3bc178-8c72-49a8-bffd-ae035016673f/limitador/0.log" Apr 22 19:09:41.877917 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:41.877891 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-6xkjz_e91dd8ff-b33c-47eb-ac7c-336a3bb6f08a/monitoring-plugin/0.log" Apr 22 19:09:41.909491 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:41.909470 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-c9jfr_0088099a-293c-492a-a62d-667aec16a6a4/node-exporter/0.log" Apr 22 19:09:41.932871 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:41.932838 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-c9jfr_0088099a-293c-492a-a62d-667aec16a6a4/kube-rbac-proxy/0.log" Apr 22 19:09:41.955729 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:41.955711 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-c9jfr_0088099a-293c-492a-a62d-667aec16a6a4/init-textfile/0.log" Apr 22 19:09:42.250075 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:42.250052 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f26585cf-02f2-4a4c-81f8-febebc4df42e/prometheus/0.log" Apr 22 19:09:42.289481 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:42.289460 2572 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f26585cf-02f2-4a4c-81f8-febebc4df42e/config-reloader/0.log" Apr 22 19:09:42.316216 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:42.316190 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f26585cf-02f2-4a4c-81f8-febebc4df42e/thanos-sidecar/0.log" Apr 22 19:09:42.337603 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:42.337563 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f26585cf-02f2-4a4c-81f8-febebc4df42e/kube-rbac-proxy-web/0.log" Apr 22 19:09:42.374667 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:42.374653 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f26585cf-02f2-4a4c-81f8-febebc4df42e/kube-rbac-proxy/0.log" Apr 22 19:09:42.398861 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:42.398838 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f26585cf-02f2-4a4c-81f8-febebc4df42e/kube-rbac-proxy-thanos/0.log" Apr 22 19:09:42.428845 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:42.428826 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f26585cf-02f2-4a4c-81f8-febebc4df42e/init-config-reloader/0.log" Apr 22 19:09:42.471450 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:42.471433 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-bz2fw_950d0214-25c1-486e-ad90-4c388b4430d2/prometheus-operator/0.log" Apr 22 19:09:42.495242 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:42.495225 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-bz2fw_950d0214-25c1-486e-ad90-4c388b4430d2/kube-rbac-proxy/0.log" Apr 22 19:09:42.525612 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:42.525575 2572 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-ghg5j_13f6631c-c5a4-46fe-bb58-94d757f98e9a/prometheus-operator-admission-webhook/0.log"
Apr 22 19:09:44.637486 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:44.637445 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6f6sj_2508d1bd-b2cc-4edf-8bd2-34590e71e3e0/console-operator/1.log"
Apr 22 19:09:44.641819 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:44.641798 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-6f6sj_2508d1bd-b2cc-4edf-8bd2-34590e71e3e0/console-operator/2.log"
Apr 22 19:09:45.357711 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.357677 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"]
Apr 22 19:09:45.357979 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.357967 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c532a52-bfce-4895-b02c-c544cc4df06d" containerName="main"
Apr 22 19:09:45.358024 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.357981 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c532a52-bfce-4895-b02c-c544cc4df06d" containerName="main"
Apr 22 19:09:45.358024 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.357994 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c532a52-bfce-4895-b02c-c544cc4df06d" containerName="storage-initializer"
Apr 22 19:09:45.358024 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.358001 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c532a52-bfce-4895-b02c-c544cc4df06d" containerName="storage-initializer"
Apr 22 19:09:45.358024 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.358007 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce64622d-c974-44a7-97a3-798b5cfb7fdc" containerName="main"
Apr 22 19:09:45.358024 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.358012 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce64622d-c974-44a7-97a3-798b5cfb7fdc" containerName="main"
Apr 22 19:09:45.358024 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.358018 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce64622d-c974-44a7-97a3-798b5cfb7fdc" containerName="tokenizer"
Apr 22 19:09:45.358024 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.358023 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce64622d-c974-44a7-97a3-798b5cfb7fdc" containerName="tokenizer"
Apr 22 19:09:45.358258 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.358034 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce64622d-c974-44a7-97a3-798b5cfb7fdc" containerName="storage-initializer"
Apr 22 19:09:45.358258 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.358040 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce64622d-c974-44a7-97a3-798b5cfb7fdc" containerName="storage-initializer"
Apr 22 19:09:45.358258 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.358048 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c532a52-bfce-4895-b02c-c544cc4df06d" containerName="tokenizer"
Apr 22 19:09:45.358258 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.358053 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c532a52-bfce-4895-b02c-c544cc4df06d" containerName="tokenizer"
Apr 22 19:09:45.358258 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.358118 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c532a52-bfce-4895-b02c-c544cc4df06d" containerName="tokenizer"
Apr 22 19:09:45.358258 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.358144 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce64622d-c974-44a7-97a3-798b5cfb7fdc" containerName="tokenizer"
Apr 22 19:09:45.358258 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.358154 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c532a52-bfce-4895-b02c-c544cc4df06d" containerName="main"
Apr 22 19:09:45.358258 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.358163 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce64622d-c974-44a7-97a3-798b5cfb7fdc" containerName="main"
Apr 22 19:09:45.362446 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.362416 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"
Apr 22 19:09:45.364655 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.364630 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nmkll\"/\"kube-root-ca.crt\""
Apr 22 19:09:45.365576 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.365535 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nmkll\"/\"default-dockercfg-vxhkm\""
Apr 22 19:09:45.365744 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.365538 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nmkll\"/\"openshift-service-ca.crt\""
Apr 22 19:09:45.370258 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.370236 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"]
Apr 22 19:09:45.452837 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.452812 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46-sys\") pod \"perf-node-gather-daemonset-kswxs\" (UID: \"bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"
Apr 22 19:09:45.452953 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.452842 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46-lib-modules\") pod \"perf-node-gather-daemonset-kswxs\" (UID: \"bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"
Apr 22 19:09:45.452953 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.452868 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdtsz\" (UniqueName: \"kubernetes.io/projected/bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46-kube-api-access-bdtsz\") pod \"perf-node-gather-daemonset-kswxs\" (UID: \"bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"
Apr 22 19:09:45.452953 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.452896 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46-podres\") pod \"perf-node-gather-daemonset-kswxs\" (UID: \"bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"
Apr 22 19:09:45.453065 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.452951 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46-proc\") pod \"perf-node-gather-daemonset-kswxs\" (UID: \"bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"
Apr 22 19:09:45.553626 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.553602 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdtsz\" (UniqueName: \"kubernetes.io/projected/bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46-kube-api-access-bdtsz\") pod \"perf-node-gather-daemonset-kswxs\" (UID: \"bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"
Apr 22 19:09:45.553731 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.553633 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46-podres\") pod \"perf-node-gather-daemonset-kswxs\" (UID: \"bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"
Apr 22 19:09:45.553731 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.553672 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46-proc\") pod \"perf-node-gather-daemonset-kswxs\" (UID: \"bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"
Apr 22 19:09:45.553731 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.553697 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46-sys\") pod \"perf-node-gather-daemonset-kswxs\" (UID: \"bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"
Apr 22 19:09:45.553731 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.553714 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46-lib-modules\") pod \"perf-node-gather-daemonset-kswxs\" (UID: \"bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"
Apr 22 19:09:45.553909 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.553773 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46-podres\") pod \"perf-node-gather-daemonset-kswxs\" (UID: \"bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"
Apr 22 19:09:45.553909 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.553812 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46-lib-modules\") pod \"perf-node-gather-daemonset-kswxs\" (UID: \"bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"
Apr 22 19:09:45.553909 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.553818 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46-proc\") pod \"perf-node-gather-daemonset-kswxs\" (UID: \"bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"
Apr 22 19:09:45.553909 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.553836 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46-sys\") pod \"perf-node-gather-daemonset-kswxs\" (UID: \"bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"
Apr 22 19:09:45.562565 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.562541 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdtsz\" (UniqueName: \"kubernetes.io/projected/bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46-kube-api-access-bdtsz\") pod \"perf-node-gather-daemonset-kswxs\" (UID: \"bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"
Apr 22 19:09:45.673326 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.673270 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"
Apr 22 19:09:45.799069 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.799035 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"]
Apr 22 19:09:45.801642 ip-10-0-129-26 kubenswrapper[2572]: W0422 19:09:45.801613 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbdd91891_67ec_4e1d_a5c3_a3d1bccc6b46.slice/crio-0fc5a22a3207e08458a229665f57e5a307f8a2bd5c07078dc68a4cc8323395e4 WatchSource:0}: Error finding container 0fc5a22a3207e08458a229665f57e5a307f8a2bd5c07078dc68a4cc8323395e4: Status 404 returned error can't find the container with id 0fc5a22a3207e08458a229665f57e5a307f8a2bd5c07078dc68a4cc8323395e4
Apr 22 19:09:45.803359 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:45.803340 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:09:46.133627 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:46.133597 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs" event={"ID":"bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46","Type":"ContainerStarted","Data":"febe2ef59f935bf3094225aba00d8723ca0497cfdf359dff9937ef2c2cee9eaa"}
Apr 22 19:09:46.133793 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:46.133631 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs" event={"ID":"bdd91891-67ec-4e1d-a5c3-a3d1bccc6b46","Type":"ContainerStarted","Data":"0fc5a22a3207e08458a229665f57e5a307f8a2bd5c07078dc68a4cc8323395e4"}
Apr 22 19:09:46.133793 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:46.133657 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"
Apr 22 19:09:46.149408 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:46.149372 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs" podStartSLOduration=1.1493589 podStartE2EDuration="1.1493589s" podCreationTimestamp="2026-04-22 19:09:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:09:46.148700887 +0000 UTC m=+1940.752854888" watchObservedRunningTime="2026-04-22 19:09:46.1493589 +0000 UTC m=+1940.753512964"
Apr 22 19:09:46.424720 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:46.424657 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-45l64_8628c438-3340-49c9-bbba-e74cdc0b5926/dns/0.log"
Apr 22 19:09:46.445318 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:46.445294 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-45l64_8628c438-3340-49c9-bbba-e74cdc0b5926/kube-rbac-proxy/0.log"
Apr 22 19:09:46.590830 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:46.590805 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7svrr_482a486e-af9c-4b33-8137-5fbd716f3fc2/dns-node-resolver/0.log"
Apr 22 19:09:47.141900 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:47.141876 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nx5jx_d748c1b5-2936-4573-b185-b6e9da61d104/node-ca/0.log"
Apr 22 19:09:48.539336 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:48.539306 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-srk76_700dd0a8-270f-4acc-bbc0-dc7d43241d5f/serve-healthcheck-canary/0.log"
Apr 22 19:09:49.119740 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:49.119689 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qn2jw_355b46cc-3542-4cb1-a685-7b6485a4c874/kube-rbac-proxy/0.log"
Apr 22 19:09:49.139548 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:49.139525 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qn2jw_355b46cc-3542-4cb1-a685-7b6485a4c874/exporter/0.log"
Apr 22 19:09:49.160091 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:49.160054 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qn2jw_355b46cc-3542-4cb1-a685-7b6485a4c874/extractor/0.log"
Apr 22 19:09:51.788409 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:51.788362 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-59b94d4c58-5nzpd_631657e2-8aea-48cd-b04c-05ad688d76da/manager/0.log"
Apr 22 19:09:51.838940 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:51.838915 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-h72t6_86957f8e-9ab0-4e50-b65d-3b12fff12553/openshift-lws-operator/0.log"
Apr 22 19:09:52.146363 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:52.146342 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-kswxs"
Apr 22 19:09:52.426145 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:52.426066 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-7c8976c8d9-8qzsc_5ff36bed-0e8c-4dd6-88f8-08572cf682f1/manager/0.log"
Apr 22 19:09:58.027995 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:58.027956 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-459cl_549f6ace-6f8f-4b1a-ab20-7550de5af2f4/kube-storage-version-migrator-operator/1.log"
Apr 22 19:09:58.028817 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:58.028797 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-459cl_549f6ace-6f8f-4b1a-ab20-7550de5af2f4/kube-storage-version-migrator-operator/0.log"
Apr 22 19:09:59.263526 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:59.263496 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9md8j_5b646830-7c62-445d-8fa2-df72174ec0c3/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:09:59.285896 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:59.285874 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9md8j_5b646830-7c62-445d-8fa2-df72174ec0c3/egress-router-binary-copy/0.log"
Apr 22 19:09:59.306759 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:59.306731 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9md8j_5b646830-7c62-445d-8fa2-df72174ec0c3/cni-plugins/0.log"
Apr 22 19:09:59.329572 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:59.329552 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9md8j_5b646830-7c62-445d-8fa2-df72174ec0c3/bond-cni-plugin/0.log"
Apr 22 19:09:59.354461 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:59.354438 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9md8j_5b646830-7c62-445d-8fa2-df72174ec0c3/routeoverride-cni/0.log"
Apr 22 19:09:59.375206 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:59.375189 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9md8j_5b646830-7c62-445d-8fa2-df72174ec0c3/whereabouts-cni-bincopy/0.log"
Apr 22 19:09:59.395891 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:59.395868 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9md8j_5b646830-7c62-445d-8fa2-df72174ec0c3/whereabouts-cni/0.log"
Apr 22 19:09:59.590960 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:59.590933 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f8f9s_ecd4872f-6321-4b86-9954-d839a3dff00d/kube-multus/0.log"
Apr 22 19:09:59.681500 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:59.681456 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9cw24_3445bb02-600f-40d2-95c6-d4616e5630c5/network-metrics-daemon/0.log"
Apr 22 19:09:59.700435 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:09:59.700407 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9cw24_3445bb02-600f-40d2-95c6-d4616e5630c5/kube-rbac-proxy/0.log"
Apr 22 19:10:01.231989 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:10:01.231953 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/ovn-controller/0.log"
Apr 22 19:10:01.253108 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:10:01.253083 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/ovn-acl-logging/0.log"
Apr 22 19:10:01.262394 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:10:01.262372 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/ovn-acl-logging/1.log"
Apr 22 19:10:01.282989 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:10:01.282967 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/kube-rbac-proxy-node/0.log"
Apr 22 19:10:01.303244 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:10:01.303224 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 19:10:01.325257 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:10:01.325236 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/northd/0.log"
Apr 22 19:10:01.345904 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:10:01.345882 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/nbdb/0.log"
Apr 22 19:10:01.366649 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:10:01.366631 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/sbdb/0.log"
Apr 22 19:10:01.463585 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:10:01.463568 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gwnpr_f31d871d-61d4-4d54-a317-0264d5e9f432/ovnkube-controller/0.log"
Apr 22 19:10:02.625765 ip-10-0-129-26 kubenswrapper[2572]: I0422 19:10:02.625733 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-4fnlv_2bfdae94-a7e1-48cd-8cad-b09cbb620c26/network-check-target-container/0.log"