Apr 22 17:34:11.262692 ip-10-0-143-54 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 17:34:11.729181 ip-10-0-143-54 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:34:11.729181 ip-10-0-143-54 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 17:34:11.729181 ip-10-0-143-54 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:34:11.729181 ip-10-0-143-54 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 17:34:11.729181 ip-10-0-143-54 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:34:11.729985 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.729899 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 17:34:11.732919 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732903 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:34:11.732919 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732919 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:34:11.732919 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732923 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:34:11.733019 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732926 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:34:11.733019 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732929 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:34:11.733019 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732932 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:34:11.733019 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732936 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:34:11.733019 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732938 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:11.733019 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732941 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:34:11.733019 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732944 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:34:11.733019 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732947 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:34:11.733019 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732950 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:11.733019 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732957 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:34:11.733019 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732960 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:11.733019 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732963 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:11.733019 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732965 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:11.733019 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732968 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:34:11.733019 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732970 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:34:11.733019 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732973 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:34:11.733019 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732976 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:34:11.733019 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732978 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:34:11.733019 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732981 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:34:11.733019 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732983 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:34:11.733508 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732986 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:34:11.733508 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732988 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:11.733508 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732990 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:34:11.733508 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732993 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:34:11.733508 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732996 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:34:11.733508 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.732998 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:34:11.733508 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733001 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:34:11.733508 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733003 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:34:11.733508 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733006 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:34:11.733508 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733008 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:34:11.733508 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733011 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:11.733508 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733013 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:34:11.733508 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733017 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:34:11.733508 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733022 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:11.733508 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733026 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:34:11.733508 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733029 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:34:11.733508 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733032 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:34:11.733508 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733034 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:34:11.733508 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733037 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:11.733508 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733040 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:34:11.733992 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733044 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:11.733992 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733048 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:34:11.733992 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733051 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:34:11.733992 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733054 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:34:11.733992 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733057 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:34:11.733992 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733060 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:34:11.733992 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733062 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:34:11.733992 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733065 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:34:11.733992 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733068 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:34:11.733992 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733071 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:34:11.733992 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733074 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:11.733992 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733076 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:34:11.733992 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733079 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:34:11.733992 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733082 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:34:11.733992 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733084 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:34:11.733992 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733087 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:34:11.733992 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733090 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:34:11.733992 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733092 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733095 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733098 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733101 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733103 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733106 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733114 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733116 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733120 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733123 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733125 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733128 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733130 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733133 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733135 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733138 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733141 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733144 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733147 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733149 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733153 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:34:11.734514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733156 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:34:11.734998 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733160 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:34:11.734998 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733164 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:34:11.734998 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733167 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:34:11.734998 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733169 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:11.734998 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733579 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:34:11.734998 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733584 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:34:11.734998 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733587 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:11.734998 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733589 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:11.734998 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733592 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:34:11.734998 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733595 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:34:11.734998 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733598 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:34:11.734998 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733601 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:34:11.734998 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733604 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:34:11.734998 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733607 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:34:11.734998 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733611 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:34:11.734998 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733614 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:34:11.734998 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733617 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:34:11.734998 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733619 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:11.734998 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733622 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:34:11.735477 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733625 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:34:11.735477 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733628 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:11.735477 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733630 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:34:11.735477 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733633 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:34:11.735477 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733636 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:34:11.735477 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733638 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:34:11.735477 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733641 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:34:11.735477 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733643 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:34:11.735477 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733646 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:34:11.735477 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733648 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:34:11.735477 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733651 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:11.735477 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733654 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:34:11.735477 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733656 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:34:11.735477 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733658 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:34:11.735477 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733661 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:34:11.735477 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733664 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:11.735477 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733667 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:11.735477 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733669 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:34:11.735477 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733671 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:34:11.735477 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733674 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:34:11.735970 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733676 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:34:11.735970 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733679 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:34:11.735970 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733681 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:34:11.735970 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733685 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:11.735970 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733687 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:34:11.735970 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733690 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:34:11.735970 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733692 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:34:11.735970 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733695 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:34:11.735970 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733697 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:34:11.735970 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733699 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:34:11.735970 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733702 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:34:11.735970 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733706 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:34:11.735970 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733709 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:34:11.735970 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733711 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:34:11.735970 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733714 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:34:11.735970 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733716 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:34:11.735970 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733719 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:34:11.735970 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733721 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:34:11.735970 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733724 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:34:11.736473 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733726 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:11.736473 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733729 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:34:11.736473 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733731 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:34:11.736473 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733734 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:34:11.736473 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733736 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:34:11.736473 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733739 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:34:11.736473 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733741 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:34:11.736473 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733744 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:11.736473 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733746 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:34:11.736473 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733749 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:34:11.736473 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733751 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:34:11.736473 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733754 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:11.736473 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733756 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:11.736473 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733759 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:34:11.736473 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733761 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:34:11.736473 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733764 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:11.736473 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733766 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:34:11.736473 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733769 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:34:11.736473 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733771 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733774 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733778 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733782 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733784 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733787 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733790 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733793 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733797 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733799 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733802 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733804 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733807 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.733809 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734385 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734393 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734399 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734403 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734408 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734411 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734415 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734433 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 17:34:11.737023 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734436 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734439 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734443 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734446 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734450 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734452 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734455 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734458 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734461 2578 flags.go:64] FLAG: --cloud-config=""
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734464 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734467 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734472 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734475 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734478 2578 flags.go:64] FLAG: --config-dir=""
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734481 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734484 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734492 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734496 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734499 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734503 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734506 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734509 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734512 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734515 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 17:34:11.737564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734518 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734523 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734526 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734529 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734532 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734535 2578 flags.go:64] FLAG: --enable-server="true"
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734538 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734543 2578 flags.go:64] FLAG: --event-burst="100"
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734546 2578 flags.go:64] FLAG: --event-qps="50"
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734549 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734552 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734554 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734558 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734561 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734564 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734567 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734570 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734573 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734576 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734579 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734581 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734584 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734587 2578 flags.go:64] FLAG: --feature-gates=""
Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734591 2578 flags.go:64] FLAG:
--file-check-frequency="20s" Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734594 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734597 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 17:34:11.738180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734601 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734604 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734606 2578 flags.go:64] FLAG: --help="false" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734609 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-143-54.ec2.internal" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734612 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734615 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734620 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734623 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734627 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734630 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734632 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 
17:34:11.734635 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734638 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734641 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734645 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734648 2578 flags.go:64] FLAG: --kube-reserved="" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734651 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734654 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734657 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734659 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734662 2578 flags.go:64] FLAG: --lock-file="" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734665 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734668 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734671 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 17:34:11.738799 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734676 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734679 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 17:34:11.739438 
ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734682 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734685 2578 flags.go:64] FLAG: --logging-format="text" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734687 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734691 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734694 2578 flags.go:64] FLAG: --manifest-url="" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734697 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734702 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734708 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734712 2578 flags.go:64] FLAG: --max-pods="110" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734715 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734719 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734722 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734726 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734729 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734732 2578 
flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734734 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734742 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734745 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734748 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734751 2578 flags.go:64] FLAG: --pod-cidr="" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734754 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 17:34:11.739438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734759 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734762 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734765 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734768 2578 flags.go:64] FLAG: --port="10250" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734772 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734775 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0fa2fe023ecbf6bcc" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734778 2578 flags.go:64] FLAG: --qos-reserved="" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734781 
2578 flags.go:64] FLAG: --read-only-port="10255" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734784 2578 flags.go:64] FLAG: --register-node="true" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734787 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734790 2578 flags.go:64] FLAG: --register-with-taints="" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734794 2578 flags.go:64] FLAG: --registry-burst="10" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734797 2578 flags.go:64] FLAG: --registry-qps="5" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734800 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734803 2578 flags.go:64] FLAG: --reserved-memory="" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734806 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734809 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734813 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734815 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734819 2578 flags.go:64] FLAG: --runonce="false" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734822 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734825 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734828 2578 
flags.go:64] FLAG: --seccomp-default="false" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734832 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734834 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734837 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 17:34:11.740000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734841 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734844 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734846 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734849 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734852 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734855 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734858 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734861 2578 flags.go:64] FLAG: --system-cgroups="" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734864 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734869 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734872 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 22 
17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734875 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734879 2578 flags.go:64] FLAG: --tls-min-version="" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734882 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734885 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734887 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734890 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734893 2578 flags.go:64] FLAG: --v="2" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734897 2578 flags.go:64] FLAG: --version="false" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734901 2578 flags.go:64] FLAG: --vmodule="" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734905 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.734908 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735050 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735054 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:34:11.740650 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735057 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:34:11.741224 ip-10-0-143-54 kubenswrapper[2578]: W0422 
17:34:11.735060 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:34:11.741224 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735063 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:34:11.741224 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735066 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:34:11.741224 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735069 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:34:11.741224 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735073 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:34:11.741224 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735076 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:34:11.741224 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735079 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:34:11.741224 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735081 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:34:11.741224 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735084 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:34:11.741224 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735086 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:34:11.741224 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735089 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:34:11.741224 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735091 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:34:11.741224 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735094 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:34:11.741224 
ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735097 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:34:11.741224 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735101 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:34:11.741224 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735103 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:34:11.741224 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735107 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 17:34:11.741224 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735111 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:34:11.741224 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735114 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:34:11.741711 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735116 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:34:11.741711 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735119 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:34:11.741711 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735122 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:34:11.741711 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735125 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:34:11.741711 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735127 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:34:11.741711 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735130 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:34:11.741711 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735132 2578 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:34:11.741711 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735135 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:34:11.741711 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735137 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:34:11.741711 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735140 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:34:11.741711 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735142 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:34:11.741711 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735145 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:34:11.741711 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735147 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:34:11.741711 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735150 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:34:11.741711 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735153 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:34:11.741711 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735157 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 17:34:11.741711 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735160 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:34:11.741711 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735165 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:34:11.741711 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735167 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:34:11.741711 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735170 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:34:11.742214 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735172 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:34:11.742214 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735175 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:34:11.742214 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735177 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:34:11.742214 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735180 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:34:11.742214 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735182 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:34:11.742214 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735185 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:34:11.742214 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735188 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:34:11.742214 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735191 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:34:11.742214 ip-10-0-143-54 kubenswrapper[2578]: W0422 
17:34:11.735194 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:34:11.742214 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735196 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:34:11.742214 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735199 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:34:11.742214 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735201 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:34:11.742214 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735204 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:34:11.742214 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735207 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:34:11.742214 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735209 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:34:11.742214 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735212 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:34:11.742214 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735215 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:34:11.742214 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735218 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:34:11.742214 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735220 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:34:11.742734 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735223 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:34:11.742734 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735225 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:34:11.742734 
ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735228 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:34:11.742734 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735231 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:34:11.742734 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735233 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:34:11.742734 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735236 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:34:11.742734 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735238 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:34:11.742734 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735241 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:34:11.742734 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735243 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:34:11.742734 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735251 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:34:11.742734 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735255 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:34:11.742734 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735258 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:34:11.742734 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735260 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:34:11.742734 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735263 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:34:11.742734 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735265 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 
17:34:11.742734 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735268 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:34:11.742734 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735270 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:34:11.742734 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735273 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:34:11.742734 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735275 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:34:11.742734 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735278 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:34:11.743230 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735282 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:34:11.743230 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735285 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:34:11.743230 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735287 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:34:11.743230 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735290 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:34:11.743230 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.735292 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:34:11.743230 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.735297 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:34:11.743230 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.741704 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 17:34:11.743230 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.741718 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 17:34:11.743230 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741774 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 17:34:11.743230 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741780 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:34:11.743230 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741784 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:34:11.743230 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741787 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:34:11.743230 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741791 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:34:11.743230 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741794 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:34:11.743230 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741796 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:34:11.743688 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741799 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:34:11.743688 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741802 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 
17:34:11.743688 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741804 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:34:11.743688 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741807 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:34:11.743688 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741809 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:34:11.743688 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741812 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:34:11.743688 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741815 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:34:11.743688 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741818 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:34:11.743688 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741821 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:34:11.743688 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741823 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:34:11.743688 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741826 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:34:11.743688 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741828 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:34:11.743688 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741831 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:34:11.743688 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741834 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:34:11.743688 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741836 2578 feature_gate.go:328] 
unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:34:11.743688 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741838 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:34:11.743688 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741841 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:34:11.743688 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741843 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:34:11.743688 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741846 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:34:11.743688 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741848 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:34:11.744168 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741851 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:34:11.744168 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741853 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:34:11.744168 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741856 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:34:11.744168 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741859 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:34:11.744168 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741862 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:34:11.744168 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741865 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:34:11.744168 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741868 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:34:11.744168 ip-10-0-143-54 
kubenswrapper[2578]: W0422 17:34:11.741871 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:34:11.744168 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741873 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:34:11.744168 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741876 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:34:11.744168 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741878 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:34:11.744168 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741881 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:34:11.744168 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741883 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:34:11.744168 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741886 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:34:11.744168 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741888 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:34:11.744168 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741891 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:34:11.744168 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741893 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:34:11.744168 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741896 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:34:11.744168 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741898 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:34:11.744168 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741902 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:34:11.744685 
ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741904 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:34:11.744685 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741907 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:34:11.744685 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741910 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:34:11.744685 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741912 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:34:11.744685 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741915 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:34:11.744685 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741917 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:34:11.744685 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741919 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:34:11.744685 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741922 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:34:11.744685 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741924 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:34:11.744685 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741927 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:34:11.744685 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741929 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:34:11.744685 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741932 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:34:11.744685 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741937 2578 feature_gate.go:328] unrecognized feature 
gate: OpenShiftPodSecurityAdmission Apr 22 17:34:11.744685 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741940 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:34:11.744685 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741942 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:34:11.744685 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741945 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:34:11.744685 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741948 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:34:11.744685 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741951 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:34:11.744685 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741953 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:34:11.744685 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741956 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:34:11.745164 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741958 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:34:11.745164 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741961 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:34:11.745164 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741963 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:34:11.745164 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741966 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:34:11.745164 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741970 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 17:34:11.745164 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741974 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:34:11.745164 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741977 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:34:11.745164 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741980 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:34:11.745164 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741983 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:34:11.745164 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741986 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:34:11.745164 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741988 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:34:11.745164 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741991 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:34:11.745164 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741995 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:34:11.745164 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.741997 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:34:11.745164 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742000 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:34:11.745164 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742003 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:34:11.745164 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742005 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:34:11.745164 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742008 2578 feature_gate.go:328] unrecognized feature 
gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:34:11.745164 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742010 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:34:11.745631 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.742015 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:34:11.745631 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742109 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:34:11.745631 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742114 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:34:11.745631 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742117 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:34:11.745631 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742120 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:34:11.745631 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742123 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:34:11.745631 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742126 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:34:11.745631 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742128 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:34:11.745631 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742131 
2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:34:11.745631 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742134 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:34:11.745631 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742137 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:34:11.745631 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742140 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:34:11.745631 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742142 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:34:11.745631 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742144 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:34:11.745631 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742147 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:34:11.745631 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742150 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:34:11.746027 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742152 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:34:11.746027 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742155 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:34:11.746027 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742157 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:34:11.746027 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742160 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:34:11.746027 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742162 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:34:11.746027 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742164 2578 
feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:34:11.746027 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742167 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:34:11.746027 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742170 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:34:11.746027 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742172 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:34:11.746027 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742175 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:34:11.746027 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742178 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:34:11.746027 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742180 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:34:11.746027 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742183 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:34:11.746027 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742185 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:34:11.746027 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742187 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:34:11.746027 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742190 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:34:11.746027 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742192 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:34:11.746027 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742195 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 
17:34:11.746027 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742198 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:34:11.746504 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742200 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:34:11.746504 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742203 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:34:11.746504 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742205 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:34:11.746504 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742208 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:34:11.746504 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742210 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:34:11.746504 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742212 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:34:11.746504 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742215 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:34:11.746504 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742218 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:34:11.746504 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742221 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:34:11.746504 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742224 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:34:11.746504 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742227 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:34:11.746504 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742229 2578 
feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:34:11.746504 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742232 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:34:11.746504 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742235 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:34:11.746504 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742237 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:34:11.746504 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742240 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:34:11.746504 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742242 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:34:11.746504 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742244 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:34:11.746504 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742247 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:34:11.746504 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742249 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:34:11.746990 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742252 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:34:11.746990 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742254 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:34:11.746990 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742257 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:34:11.746990 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742259 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:34:11.746990 ip-10-0-143-54 kubenswrapper[2578]: W0422 
17:34:11.742262 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:34:11.746990 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742265 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:34:11.746990 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742267 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:34:11.746990 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742270 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:34:11.746990 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742272 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:34:11.746990 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742275 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:34:11.746990 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742278 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:34:11.746990 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742280 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:34:11.746990 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742282 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:34:11.746990 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742286 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 17:34:11.746990 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742289 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:34:11.746990 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742292 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:34:11.746990 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742296 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:34:11.746990 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742299 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:34:11.746990 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742302 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:34:11.747478 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742304 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:34:11.747478 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742307 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:34:11.747478 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742310 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:34:11.747478 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742312 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:34:11.747478 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742316 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 17:34:11.747478 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742319 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:34:11.747478 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742322 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:34:11.747478 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742325 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:34:11.747478 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742327 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:34:11.747478 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742330 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:34:11.747478 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742332 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:34:11.747478 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742334 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:34:11.747478 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:11.742337 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:34:11.747478 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.742342 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:34:11.747478 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.743008 2578 server.go:962] "Client rotation is on, 
will bootstrap in background" Apr 22 17:34:11.747845 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.745325 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 17:34:11.747845 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.746370 2578 server.go:1019] "Starting client certificate rotation" Apr 22 17:34:11.747845 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.746458 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 17:34:11.747845 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.746496 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 17:34:11.774175 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.774155 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 17:34:11.778866 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.778841 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 17:34:11.791792 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.791771 2578 log.go:25] "Validated CRI v1 runtime API" Apr 22 17:34:11.797942 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.797927 2578 log.go:25] "Validated CRI v1 image API" Apr 22 17:34:11.799145 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.799122 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 17:34:11.801893 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.801877 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 17:34:11.807574 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.807551 2578 fs.go:135] Filesystem 
UUIDs: map[6ae81b64-cf2d-4779-8b53-6e6d32948d60:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 9fb96828-e8e0-4eac-8d1a-db7cbbfbec08:/dev/nvme0n1p4] Apr 22 17:34:11.807633 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.807574 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 17:34:11.813909 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.813794 2578 manager.go:217] Machine: {Timestamp:2026-04-22 17:34:11.81179933 +0000 UTC m=+0.427891593 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101170 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec281fa2da432216a178cb519b514a19 SystemUUID:ec281fa2-da43-2216-a178-cb519b514a19 BootID:ef2bad71-a06b-48a4-843d-c78e79be8da7 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 
Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ca:0e:80:95:6d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ca:0e:80:95:6d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:12:63:1e:d1:c9:69 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 17:34:11.813909 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.813905 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 17:34:11.814055 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.814014 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 17:34:11.815132 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.815106 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 17:34:11.815268 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.815134 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-54.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 17:34:11.815313 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.815277 2578 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 17:34:11.815313 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.815286 2578 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 17:34:11.815313 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.815299 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:34:11.816276 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.816265 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:34:11.817467 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.817457 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 17:34:11.817730 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.817720 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 17:34:11.817923 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.817910 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8mcxc"
Apr 22 17:34:11.820238 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.820228 2578 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 17:34:11.820277 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.820247 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 17:34:11.820277 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.820259 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 17:34:11.820277 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.820268 2578 kubelet.go:397] "Adding apiserver pod source"
Apr 22 17:34:11.820277 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.820276 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 17:34:11.821328 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.821315 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 17:34:11.821369 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.821344 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 17:34:11.824549 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.824526 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 17:34:11.825995 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.825968 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8mcxc"
Apr 22 17:34:11.826297 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.826284 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 17:34:11.828159 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.828144 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 17:34:11.828234 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.828163 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 17:34:11.828234 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.828170 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 17:34:11.828234 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.828176 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 17:34:11.828234 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.828182 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 17:34:11.828234 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.828187 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 17:34:11.828234 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.828193 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 17:34:11.828234 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.828198 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 17:34:11.828234 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.828205 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 17:34:11.828234 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.828211 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 17:34:11.828234 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.828220 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 17:34:11.828234 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.828229 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 17:34:11.829809 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.829793 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 17:34:11.829887 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.829812 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 17:34:11.832760 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.832747 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:34:11.833460 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.833447 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 17:34:11.833512 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.833483 2578 server.go:1295] "Started kubelet"
Apr 22 17:34:11.833624 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.833571 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 17:34:11.833624 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.833578 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 17:34:11.833683 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.833647 2578 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 17:34:11.834417 ip-10-0-143-54 systemd[1]: Started Kubernetes Kubelet.
Apr 22 17:34:11.834923 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.834664 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:34:11.835312 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.835272 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 17:34:11.836541 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.836524 2578 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 17:34:11.836713 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.836695 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-143-54.ec2.internal" not found
Apr 22 17:34:11.841601 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.841582 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 17:34:11.842166 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.842149 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 17:34:11.842941 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.842924 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 17:34:11.842941 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.842924 2578 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 17:34:11.843116 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.842949 2578 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 17:34:11.843116 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.843063 2578 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 17:34:11.843116 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.843071 2578 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 17:34:11.843116 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:11.843112 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-54.ec2.internal\" not found"
Apr 22 17:34:11.843302 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.843201 2578 factory.go:55] Registering systemd factory
Apr 22 17:34:11.843302 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.843274 2578 factory.go:223] Registration of the systemd container factory successfully
Apr 22 17:34:11.844242 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.844229 2578 factory.go:153] Registering CRI-O factory
Apr 22 17:34:11.844330 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.844321 2578 factory.go:223] Registration of the crio container factory successfully
Apr 22 17:34:11.844464 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.844360 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:34:11.844566 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.844548 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 17:34:11.844674 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.844578 2578 factory.go:103] Registering Raw factory
Apr 22 17:34:11.844674 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.844599 2578 manager.go:1196] Started watching for new ooms in manager
Apr 22 17:34:11.845055 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.845039 2578 manager.go:319] Starting recovery of all containers
Apr 22 17:34:11.845248 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:11.845222 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 17:34:11.847093 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:11.847072 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-143-54.ec2.internal\" not found" node="ip-10-0-143-54.ec2.internal"
Apr 22 17:34:11.851144 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.851119 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-143-54.ec2.internal" not found
Apr 22 17:34:11.856056 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.856036 2578 manager.go:324] Recovery completed
Apr 22 17:34:11.859988 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.859977 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:34:11.861714 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.861699 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-54.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:34:11.861789 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.861727 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-54.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:34:11.861789 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.861737 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-54.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:34:11.862215 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.862203 2578 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 17:34:11.862215 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.862214 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 17:34:11.862290 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.862232 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 17:34:11.864543 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.864532 2578 policy_none.go:49] "None policy: Start"
Apr 22 17:34:11.864578 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.864547 2578 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 17:34:11.864578 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.864557 2578 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 17:34:11.894977 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.894961 2578 manager.go:341] "Starting Device Plugin manager"
Apr 22 17:34:11.895049 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:11.895038 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 17:34:11.895095 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.895052 2578 server.go:85] "Starting device plugin registration server"
Apr 22 17:34:11.895269 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.895259 2578 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 17:34:11.895307 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.895271 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 17:34:11.895575 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.895386 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 17:34:11.895575 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.895498 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 17:34:11.895575 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.895509 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 17:34:11.896041 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:11.896013 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 17:34:11.896111 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:11.896055 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-54.ec2.internal\" not found"
Apr 22 17:34:11.910320 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.910306 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-143-54.ec2.internal" not found
Apr 22 17:34:11.949347 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.949310 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 17:34:11.950529 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.950511 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 17:34:11.950642 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.950536 2578 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 17:34:11.950642 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.950557 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 17:34:11.950642 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.950563 2578 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 17:34:11.950642 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:11.950595 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 17:34:11.952758 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.952741 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:34:11.995669 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.995618 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:34:11.996789 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.996777 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-54.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:34:11.996881 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.996800 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-54.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:34:11.996881 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.996810 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-54.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:34:11.996881 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:11.996832 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-54.ec2.internal"
Apr 22 17:34:12.005929 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.005910 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-54.ec2.internal"
Apr 22 17:34:12.051514 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.051488 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-54.ec2.internal"]
Apr 22 17:34:12.054301 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.054283 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-54.ec2.internal"
Apr 22 17:34:12.054394 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.054286 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal"
Apr 22 17:34:12.079092 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.079075 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-54.ec2.internal"
Apr 22 17:34:12.083584 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.083568 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal"
Apr 22 17:34:12.090276 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.090261 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 17:34:12.103898 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.103879 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 17:34:12.145101 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.145073 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/76b662887d5a0620f946a7b7cdcbb27a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal\" (UID: \"76b662887d5a0620f946a7b7cdcbb27a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal"
Apr 22 17:34:12.145211 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.145113 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/76b662887d5a0620f946a7b7cdcbb27a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal\" (UID: \"76b662887d5a0620f946a7b7cdcbb27a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal"
Apr 22 17:34:12.145211 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.145137 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1346330c188515bccf6ba5d0ec452e1d-config\") pod \"kube-apiserver-proxy-ip-10-0-143-54.ec2.internal\" (UID: \"1346330c188515bccf6ba5d0ec452e1d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-54.ec2.internal"
Apr 22 17:34:12.245519 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.245486 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/76b662887d5a0620f946a7b7cdcbb27a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal\" (UID: \"76b662887d5a0620f946a7b7cdcbb27a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal"
Apr 22 17:34:12.245519 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.245519 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/76b662887d5a0620f946a7b7cdcbb27a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal\" (UID: \"76b662887d5a0620f946a7b7cdcbb27a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal"
Apr 22 17:34:12.245718 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.245558 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1346330c188515bccf6ba5d0ec452e1d-config\") pod \"kube-apiserver-proxy-ip-10-0-143-54.ec2.internal\" (UID: \"1346330c188515bccf6ba5d0ec452e1d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-54.ec2.internal"
Apr 22 17:34:12.245718 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.245586 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/76b662887d5a0620f946a7b7cdcbb27a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal\" (UID: \"76b662887d5a0620f946a7b7cdcbb27a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal"
Apr 22 17:34:12.245718 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.245612 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/76b662887d5a0620f946a7b7cdcbb27a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal\" (UID: \"76b662887d5a0620f946a7b7cdcbb27a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal"
Apr 22 17:34:12.245718 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.245609 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1346330c188515bccf6ba5d0ec452e1d-config\") pod \"kube-apiserver-proxy-ip-10-0-143-54.ec2.internal\" (UID: \"1346330c188515bccf6ba5d0ec452e1d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-54.ec2.internal"
Apr 22 17:34:12.393229 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.393205 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-54.ec2.internal"
Apr 22 17:34:12.407713 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.407693 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal"
Apr 22 17:34:12.747157 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.747131 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 17:34:12.747805 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.747297 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:34:12.747805 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.747309 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:34:12.747805 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.747309 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:34:12.820847 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.820818 2578 apiserver.go:52] "Watching apiserver"
Apr 22 17:34:12.827529 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.827485 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 17:29:11 +0000 UTC" deadline="2027-09-26 11:25:32.035078065 +0000 UTC"
Apr 22 17:34:12.827529 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.827526 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12521h51m19.207556021s"
Apr 22 17:34:12.830354 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.830337 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 17:34:12.830695 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.830674 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-6mrgr","openshift-image-registry/node-ca-hspjm","openshift-multus/multus-tg2s7","openshift-multus/network-metrics-daemon-fd4w6","openshift-network-diagnostics/network-check-target-69d82","openshift-network-operator/iptables-alerter-wpwkz","openshift-ovn-kubernetes/ovnkube-node-2b8mb","kube-system/konnectivity-agent-jb9vl","kube-system/kube-apiserver-proxy-ip-10-0-143-54.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx","openshift-dns/node-resolver-cznmj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal","openshift-multus/multus-additional-cni-plugins-z45jm"]
Apr 22 17:34:12.833571 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.833550 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6mrgr"
Apr 22 17:34:12.834730 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.834708 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hspjm"
Apr 22 17:34:12.835730 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.835706 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.836104 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.836026 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 17:34:12.836104 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.836032 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:34:12.836104 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.836066 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jnsbx\""
Apr 22 17:34:12.836743 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.836726 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 17:34:12.836943 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.836929 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd4w6"
Apr 22 17:34:12.837050 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:12.837024 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd4w6" podUID="81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8"
Apr 22 17:34:12.837229 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.837211 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 17:34:12.837302 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.837231 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-sfkpl\""
Apr 22 17:34:12.837486 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.837472 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 17:34:12.837976 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.837961 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 17:34:12.838300 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.838162 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-42f6v\""
Apr 22 17:34:12.838300 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.838167 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 17:34:12.838300 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.838190 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 17:34:12.838300 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.838226 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69d82"
Apr 22 17:34:12.838300 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.838190 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 17:34:12.838300 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:12.838278 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69d82" podUID="8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6"
Apr 22 17:34:12.839890 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.839874 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wpwkz"
Apr 22 17:34:12.841320 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.841302 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.841409 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.841387 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jb9vl"
Apr 22 17:34:12.841703 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.841681 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 17:34:12.842478 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.842434 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 17:34:12.842581 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.842499 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 17:34:12.842581 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.842504 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xrv6x\""
Apr 22 17:34:12.842581 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.842525 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:34:12.842728 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.842714 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx"
Apr 22 17:34:12.843564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.843531 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-x6759\""
Apr 22 17:34:12.844029 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.844008 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 17:34:12.844121 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.844090 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 17:34:12.844301 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.844284 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cznmj"
Apr 22 17:34:12.844927 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.844907 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 17:34:12.845378 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.845147 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 17:34:12.845378 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.845171 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 17:34:12.845378 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.845244 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 17:34:12.846159 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.845971 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-t8jx7\""
Apr 22
17:34:12.846159 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.846043 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 17:34:12.846329 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.846300 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 17:34:12.847007 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.846457 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:12.847314 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.847294 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4lvw\" (UniqueName: \"kubernetes.io/projected/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-kube-api-access-f4lvw\") pod \"network-metrics-daemon-fd4w6\" (UID: \"81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8\") " pod="openshift-multus/network-metrics-daemon-fd4w6" Apr 22 17:34:12.847404 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.847328 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2rxl\" (UniqueName: \"kubernetes.io/projected/26bfba8d-71c2-4440-8bc9-7e5759e58f9f-kube-api-access-p2rxl\") pod \"node-ca-hspjm\" (UID: \"26bfba8d-71c2-4440-8bc9-7e5759e58f9f\") " pod="openshift-image-registry/node-ca-hspjm" Apr 22 17:34:12.847404 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.847360 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-run-systemd\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.847404 ip-10-0-143-54 
kubenswrapper[2578]: I0422 17:34:12.847390 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-ovn-node-metrics-cert\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.847626 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.847560 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-cnibin\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.847626 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.847595 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6c5f2f61-7b2a-45e4-b641-854237b19df4-multus-daemon-config\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.847720 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.847631 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-etc-modprobe-d\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.847720 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.847653 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-etc-sysctl-d\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " 
pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.847805 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.847716 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-ovnkube-script-lib\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.847805 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.847748 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-host-var-lib-cni-bin\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.847805 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.847779 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-etc-kubernetes\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.848356 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.848336 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 17:34:12.848356 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.847872 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-device-dir\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx" Apr 22 
17:34:12.848570 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.848550 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 17:34:12.848650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.848583 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-59wpn\"" Apr 22 17:34:12.848801 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.848783 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-run\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.848865 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.848820 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 17:34:12.848921 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.848870 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/66574196-7d35-49dd-9f6e-038a6df72a46-tmp\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.848921 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.848897 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/25234a3f-5bb9-424d-9c97-e481fea97fb1-iptables-alerter-script\") pod \"iptables-alerter-wpwkz\" (UID: \"25234a3f-5bb9-424d-9c97-e481fea97fb1\") " pod="openshift-network-operator/iptables-alerter-wpwkz" Apr 22 17:34:12.849023 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.848919 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-log-socket\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.849023 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.848944 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-multus-cni-dir\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.849023 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.848961 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-etc-openvswitch\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.849023 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.848986 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-cni-netd\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.849214 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849034 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-env-overrides\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 
17:34:12.849214 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849077 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4-tmp-dir\") pod \"node-resolver-cznmj\" (UID: \"5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4\") " pod="openshift-dns/node-resolver-cznmj" Apr 22 17:34:12.849214 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849115 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-host-run-netns\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.849214 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849149 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6f56fd06-5b7e-4a95-b2ba-3604f018b678-agent-certs\") pod \"konnectivity-agent-jb9vl\" (UID: \"6f56fd06-5b7e-4a95-b2ba-3604f018b678\") " pod="kube-system/konnectivity-agent-jb9vl" Apr 22 17:34:12.849369 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849216 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtnc8\" (UniqueName: \"kubernetes.io/projected/25234a3f-5bb9-424d-9c97-e481fea97fb1-kube-api-access-wtnc8\") pod \"iptables-alerter-wpwkz\" (UID: \"25234a3f-5bb9-424d-9c97-e481fea97fb1\") " pod="openshift-network-operator/iptables-alerter-wpwkz" Apr 22 17:34:12.849369 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849255 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-kubelet\") pod \"ovnkube-node-2b8mb\" (UID: 
\"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.849631 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849597 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 17:34:12.849717 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849631 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5bg6\" (UniqueName: \"kubernetes.io/projected/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-kube-api-access-g5bg6\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx" Apr 22 17:34:12.849717 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849663 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-etc-sysctl-conf\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.849717 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849693 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-sys\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.849905 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849717 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/66574196-7d35-49dd-9f6e-038a6df72a46-etc-tuned\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 
22 17:34:12.849905 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849741 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr8cv\" (UniqueName: \"kubernetes.io/projected/66574196-7d35-49dd-9f6e-038a6df72a46-kube-api-access-xr8cv\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.849905 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849765 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-node-log\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.849905 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849808 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-etc-systemd\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.849905 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849828 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 17:34:12.849905 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849836 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ds5g\" (UniqueName: \"kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g\") pod \"network-check-target-69d82\" (UID: \"8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6\") " pod="openshift-network-diagnostics/network-check-target-69d82" Apr 22 17:34:12.849905 ip-10-0-143-54 kubenswrapper[2578]: I0422 
17:34:12.849860 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/25234a3f-5bb9-424d-9c97-e481fea97fb1-host-slash\") pod \"iptables-alerter-wpwkz\" (UID: \"25234a3f-5bb9-424d-9c97-e481fea97fb1\") " pod="openshift-network-operator/iptables-alerter-wpwkz" Apr 22 17:34:12.849905 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849868 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 17:34:12.849905 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849873 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-8blbk\"" Apr 22 17:34:12.849905 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849879 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-6fs2d\"" Apr 22 17:34:12.849905 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849887 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-multus-socket-dir-parent\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.850466 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849911 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-etc-selinux\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx" Apr 22 17:34:12.850466 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849934 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-sys-fs\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx" Apr 22 17:34:12.850466 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849956 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpzhl\" (UniqueName: \"kubernetes.io/projected/5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4-kube-api-access-qpzhl\") pod \"node-resolver-cznmj\" (UID: \"5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4\") " pod="openshift-dns/node-resolver-cznmj" Apr 22 17:34:12.850466 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.849996 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.850466 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850019 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-ovnkube-config\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.850466 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850023 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 17:34:12.850466 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850037 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c5f2f61-7b2a-45e4-b641-854237b19df4-cni-binary-copy\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.850466 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850059 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-etc-sysconfig\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.850466 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850098 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/26bfba8d-71c2-4440-8bc9-7e5759e58f9f-serviceca\") pod \"node-ca-hspjm\" (UID: \"26bfba8d-71c2-4440-8bc9-7e5759e58f9f\") " pod="openshift-image-registry/node-ca-hspjm" Apr 22 17:34:12.850466 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850120 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-run-netns\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.850466 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850144 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-host-var-lib-kubelet\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.850466 ip-10-0-143-54 
kubenswrapper[2578]: I0422 17:34:12.850167 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx" Apr 22 17:34:12.850466 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850190 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4-hosts-file\") pod \"node-resolver-cznmj\" (UID: \"5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4\") " pod="openshift-dns/node-resolver-cznmj" Apr 22 17:34:12.850466 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850213 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-etc-kubernetes\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.850466 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850235 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-cni-bin\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.850466 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850251 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnv7x\" (UniqueName: \"kubernetes.io/projected/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-kube-api-access-dnv7x\") pod \"ovnkube-node-2b8mb\" (UID: 
\"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.850466 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850274 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-os-release\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.851128 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850296 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx6fh\" (UniqueName: \"kubernetes.io/projected/6c5f2f61-7b2a-45e4-b641-854237b19df4-kube-api-access-zx6fh\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.851128 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850328 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-host\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.851128 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850344 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-var-lib-openvswitch\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.851128 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850364 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-run-ovn\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.851128 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850407 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-run-ovn-kubernetes\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.851128 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850457 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-system-cni-dir\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.851128 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850505 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-hostroot\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.851128 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850545 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-multus-conf-dir\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.851128 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850576 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-var-lib-kubelet\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr"
Apr 22 17:34:12.851128 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850609 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs\") pod \"network-metrics-daemon-fd4w6\" (UID: \"81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8\") " pod="openshift-multus/network-metrics-daemon-fd4w6"
Apr 22 17:34:12.851128 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850633 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6f56fd06-5b7e-4a95-b2ba-3604f018b678-konnectivity-ca\") pod \"konnectivity-agent-jb9vl\" (UID: \"6f56fd06-5b7e-4a95-b2ba-3604f018b678\") " pod="kube-system/konnectivity-agent-jb9vl"
Apr 22 17:34:12.851128 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850654 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-registration-dir\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx"
Apr 22 17:34:12.851128 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850699 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26bfba8d-71c2-4440-8bc9-7e5759e58f9f-host\") pod \"node-ca-hspjm\" (UID: \"26bfba8d-71c2-4440-8bc9-7e5759e58f9f\") " pod="openshift-image-registry/node-ca-hspjm"
Apr 22 17:34:12.851128 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850732 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-run-openvswitch\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.851128 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850758 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-host-run-multus-certs\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.851128 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850793 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-socket-dir\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx"
Apr 22 17:34:12.851128 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850830 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-lib-modules\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr"
Apr 22 17:34:12.851668 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850855 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-host-run-k8s-cni-cncf-io\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.851668 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850878 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-systemd-units\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.851668 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850901 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-slash\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.851668 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.850931 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-host-var-lib-cni-multus\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.853861 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.853834 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 17:34:12.863447 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:12.863407 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1346330c188515bccf6ba5d0ec452e1d.slice/crio-bfeff12ceefa57ac3aceb909888f552dd4eac829f5d2317bf0c843d3e5bb940c WatchSource:0}: Error finding container bfeff12ceefa57ac3aceb909888f552dd4eac829f5d2317bf0c843d3e5bb940c: Status 404 returned error can't find the container with id bfeff12ceefa57ac3aceb909888f552dd4eac829f5d2317bf0c843d3e5bb940c
Apr 22 17:34:12.863693 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:12.863668 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76b662887d5a0620f946a7b7cdcbb27a.slice/crio-bf1e1120da067e81a48db35cab5e8ecdefb44594129a1e9481f8790340c4691e WatchSource:0}: Error finding container bf1e1120da067e81a48db35cab5e8ecdefb44594129a1e9481f8790340c4691e: Status 404 returned error can't find the container with id bf1e1120da067e81a48db35cab5e8ecdefb44594129a1e9481f8790340c4691e
Apr 22 17:34:12.868202 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.868186 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 17:34:12.871825 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.871809 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-vfzw2"
Apr 22 17:34:12.879735 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.879717 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-vfzw2"
Apr 22 17:34:12.944061 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.944034 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 17:34:12.951815 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.951777 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5bg6\" (UniqueName: \"kubernetes.io/projected/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-kube-api-access-g5bg6\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx"
Apr 22 17:34:12.951924 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.951827 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-etc-sysctl-conf\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr"
Apr 22 17:34:12.951924 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.951853 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-sys\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr"
Apr 22 17:34:12.951924 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.951875 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/66574196-7d35-49dd-9f6e-038a6df72a46-etc-tuned\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr"
Apr 22 17:34:12.952077 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952010 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-sys\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr"
Apr 22 17:34:12.952077 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952023 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-etc-sysctl-conf\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr"
Apr 22 17:34:12.952175 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952096 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xr8cv\" (UniqueName: \"kubernetes.io/projected/66574196-7d35-49dd-9f6e-038a6df72a46-kube-api-access-xr8cv\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr"
Apr 22 17:34:12.952175 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952119 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-node-log\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.952175 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952142 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-etc-systemd\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr"
Apr 22 17:34:12.952175 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952169 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ds5g\" (UniqueName: \"kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g\") pod \"network-check-target-69d82\" (UID: \"8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6\") " pod="openshift-network-diagnostics/network-check-target-69d82"
Apr 22 17:34:12.952362 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952196 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/25234a3f-5bb9-424d-9c97-e481fea97fb1-host-slash\") pod \"iptables-alerter-wpwkz\" (UID: \"25234a3f-5bb9-424d-9c97-e481fea97fb1\") " pod="openshift-network-operator/iptables-alerter-wpwkz"
Apr 22 17:34:12.952362 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952210 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-etc-systemd\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr"
Apr 22 17:34:12.952362 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952221 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-multus-socket-dir-parent\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.952362 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952250 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-node-log\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.952362 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952271 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-etc-selinux\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx"
Apr 22 17:34:12.952362 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952273 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-multus-socket-dir-parent\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.952362 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952287 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-sys-fs\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx"
Apr 22 17:34:12.952362 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952325 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-sys-fs\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx"
Apr 22 17:34:12.952768 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952369 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-etc-selinux\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx"
Apr 22 17:34:12.952768 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952167 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 17:34:12.952768 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952405 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/25234a3f-5bb9-424d-9c97-e481fea97fb1-host-slash\") pod \"iptables-alerter-wpwkz\" (UID: \"25234a3f-5bb9-424d-9c97-e481fea97fb1\") " pod="openshift-network-operator/iptables-alerter-wpwkz"
Apr 22 17:34:12.952768 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952407 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpzhl\" (UniqueName: \"kubernetes.io/projected/5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4-kube-api-access-qpzhl\") pod \"node-resolver-cznmj\" (UID: \"5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4\") " pod="openshift-dns/node-resolver-cznmj"
Apr 22 17:34:12.952768 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952456 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.952768 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952482 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-ovnkube-config\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.952768 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952507 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c5f2f61-7b2a-45e4-b641-854237b19df4-cni-binary-copy\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.952768 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952534 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-etc-sysconfig\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr"
Apr 22 17:34:12.952768 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952555 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/26bfba8d-71c2-4440-8bc9-7e5759e58f9f-serviceca\") pod \"node-ca-hspjm\" (UID: \"26bfba8d-71c2-4440-8bc9-7e5759e58f9f\") " pod="openshift-image-registry/node-ca-hspjm"
Apr 22 17:34:12.952768 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952579 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-run-netns\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.952768 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952610 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-host-var-lib-kubelet\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.952768 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952633 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx"
Apr 22 17:34:12.952768 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952661 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4-hosts-file\") pod \"node-resolver-cznmj\" (UID: \"5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4\") " pod="openshift-dns/node-resolver-cznmj"
Apr 22 17:34:12.952768 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952679 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-etc-kubernetes\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr"
Apr 22 17:34:12.952768 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952697 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-cni-bin\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.952768 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952717 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnv7x\" (UniqueName: \"kubernetes.io/projected/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-kube-api-access-dnv7x\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.952768 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952741 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-os-release\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.952768 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952758 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zx6fh\" (UniqueName: \"kubernetes.io/projected/6c5f2f61-7b2a-45e4-b641-854237b19df4-kube-api-access-zx6fh\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.953593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952780 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-host\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr"
Apr 22 17:34:12.953593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952805 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-var-lib-openvswitch\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.953593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952828 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-run-ovn\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.953593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952827 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.953593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952853 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-run-ovn-kubernetes\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.953593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952876 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-system-cni-dir\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.953593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952896 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-etc-kubernetes\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr"
Apr 22 17:34:12.953593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952923 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-hostroot\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.953593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952948 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-multus-conf-dir\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.953593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952949 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4-hosts-file\") pod \"node-resolver-cznmj\" (UID: \"5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4\") " pod="openshift-dns/node-resolver-cznmj"
Apr 22 17:34:12.953593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.952972 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-var-lib-kubelet\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr"
Apr 22 17:34:12.953593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953031 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-os-release\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.953593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953035 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-var-lib-kubelet\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr"
Apr 22 17:34:12.953593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953061 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs\") pod \"network-metrics-daemon-fd4w6\" (UID: \"81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8\") " pod="openshift-multus/network-metrics-daemon-fd4w6"
Apr 22 17:34:12.953593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953083 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-run-ovn-kubernetes\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.953593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953090 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6f56fd06-5b7e-4a95-b2ba-3604f018b678-konnectivity-ca\") pod \"konnectivity-agent-jb9vl\" (UID: \"6f56fd06-5b7e-4a95-b2ba-3604f018b678\") " pod="kube-system/konnectivity-agent-jb9vl"
Apr 22 17:34:12.953593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953114 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-registration-dir\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx"
Apr 22 17:34:12.953593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26bfba8d-71c2-4440-8bc9-7e5759e58f9f-host\") pod \"node-ca-hspjm\" (UID: \"26bfba8d-71c2-4440-8bc9-7e5759e58f9f\") " pod="openshift-image-registry/node-ca-hspjm"
Apr 22 17:34:12.954360 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953138 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-system-cni-dir\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.954360 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953166 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-cni-bin\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.954360 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953160 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-run-openvswitch\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.954360 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953180 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-hostroot\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.954360 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953222 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-multus-conf-dir\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.954360 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953230 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-host-run-multus-certs\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.954360 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953235 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c5f2f61-7b2a-45e4-b641-854237b19df4-cni-binary-copy\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.954360 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953264 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-socket-dir\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx"
Apr 22 17:34:12.954360 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953267 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-host\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr"
Apr 22 17:34:12.954360 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953268 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx"
Apr 22 17:34:12.954360 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953311 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-lib-modules\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr"
Apr 22 17:34:12.954360 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953338 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-host-run-k8s-cni-cncf-io\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.954360 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953339 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-var-lib-openvswitch\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.954360 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953368 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-run-netns\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.954360 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953392 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9db6f1f0-3473-4256-b588-7220e67210f2-os-release\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm"
Apr 22 17:34:12.954360 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953450 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-registration-dir\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx"
Apr 22 17:34:12.954360 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953452 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-systemd-units\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.954360 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:12.953472 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:12.955202 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953487 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-slash\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.955202 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953491 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-systemd-units\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.955202 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953512 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-host-var-lib-cni-multus\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.955202 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953545 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-etc-sysconfig\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr"
Apr 22 17:34:12.955202 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953539 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-ovnkube-config\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:12.955202 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953577 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26bfba8d-71c2-4440-8bc9-7e5759e58f9f-host\") pod \"node-ca-hspjm\" (UID: \"26bfba8d-71c2-4440-8bc9-7e5759e58f9f\") " pod="openshift-image-registry/node-ca-hspjm"
Apr 22 17:34:12.955202 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953554 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-host-run-multus-certs\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7"
Apr 22 17:34:12.955202 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:12.953558 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs podName:81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8 nodeName:}" failed.
No retries permitted until 2026-04-22 17:34:13.453512 +0000 UTC m=+2.069604270 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs") pod "network-metrics-daemon-fd4w6" (UID: "81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:12.955202 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953611 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-run-ovn\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.955202 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953631 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-run-openvswitch\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.955202 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953549 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-host-var-lib-cni-multus\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.955202 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953669 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-slash\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.955202 ip-10-0-143-54 
kubenswrapper[2578]: I0422 17:34:12.953486 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-host-var-lib-kubelet\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.955202 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953703 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-socket-dir\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx" Apr 22 17:34:12.955202 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953705 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4lvw\" (UniqueName: \"kubernetes.io/projected/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-kube-api-access-f4lvw\") pod \"network-metrics-daemon-fd4w6\" (UID: \"81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8\") " pod="openshift-multus/network-metrics-daemon-fd4w6" Apr 22 17:34:12.955202 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953710 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/26bfba8d-71c2-4440-8bc9-7e5759e58f9f-serviceca\") pod \"node-ca-hspjm\" (UID: \"26bfba8d-71c2-4440-8bc9-7e5759e58f9f\") " pod="openshift-image-registry/node-ca-hspjm" Apr 22 17:34:12.955202 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953729 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-host-run-k8s-cni-cncf-io\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.955924 ip-10-0-143-54 
kubenswrapper[2578]: I0422 17:34:12.953742 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p2rxl\" (UniqueName: \"kubernetes.io/projected/26bfba8d-71c2-4440-8bc9-7e5759e58f9f-kube-api-access-p2rxl\") pod \"node-ca-hspjm\" (UID: \"26bfba8d-71c2-4440-8bc9-7e5759e58f9f\") " pod="openshift-image-registry/node-ca-hspjm" Apr 22 17:34:12.955924 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953794 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-run-systemd\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.955924 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953819 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-ovn-node-metrics-cert\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.955924 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953845 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-lib-modules\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.955924 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953871 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-cnibin\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.955924 ip-10-0-143-54 kubenswrapper[2578]: I0422 
17:34:12.953895 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6c5f2f61-7b2a-45e4-b641-854237b19df4-multus-daemon-config\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.955924 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953946 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9db6f1f0-3473-4256-b588-7220e67210f2-cni-binary-copy\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:12.955924 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953970 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-cnibin\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.955924 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953975 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9db6f1f0-3473-4256-b588-7220e67210f2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:12.955924 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.953895 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-run-systemd\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.955924 
ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954025 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-etc-modprobe-d\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.955924 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954004 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6f56fd06-5b7e-4a95-b2ba-3604f018b678-konnectivity-ca\") pod \"konnectivity-agent-jb9vl\" (UID: \"6f56fd06-5b7e-4a95-b2ba-3604f018b678\") " pod="kube-system/konnectivity-agent-jb9vl" Apr 22 17:34:12.955924 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954133 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-etc-sysctl-d\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.955924 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954149 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-etc-modprobe-d\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.955924 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-ovnkube-script-lib\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.955924 
ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954204 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-host-var-lib-cni-bin\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.955924 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954243 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-etc-kubernetes\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.956546 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954270 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9db6f1f0-3473-4256-b588-7220e67210f2-cnibin\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:12.956546 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954286 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-etc-sysctl-d\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.956546 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954295 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-device-dir\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx" Apr 
22 17:34:12.956546 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954319 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-run\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.956546 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954328 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-etc-kubernetes\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.956546 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954340 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/66574196-7d35-49dd-9f6e-038a6df72a46-tmp\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.956546 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954354 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-host-var-lib-cni-bin\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.956546 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954365 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/25234a3f-5bb9-424d-9c97-e481fea97fb1-iptables-alerter-script\") pod \"iptables-alerter-wpwkz\" (UID: \"25234a3f-5bb9-424d-9c97-e481fea97fb1\") " pod="openshift-network-operator/iptables-alerter-wpwkz" Apr 22 17:34:12.956546 ip-10-0-143-54 
kubenswrapper[2578]: I0422 17:34:12.954387 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-device-dir\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx" Apr 22 17:34:12.956546 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954409 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-log-socket\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.956546 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954444 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/66574196-7d35-49dd-9f6e-038a6df72a46-run\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.956546 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954473 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-multus-cni-dir\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.956546 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954530 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9db6f1f0-3473-4256-b588-7220e67210f2-system-cni-dir\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:12.956546 
ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954536 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-log-socket\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.956546 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954581 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-etc-openvswitch\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.956546 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954575 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-54.ec2.internal" event={"ID":"1346330c188515bccf6ba5d0ec452e1d","Type":"ContainerStarted","Data":"bfeff12ceefa57ac3aceb909888f552dd4eac829f5d2317bf0c843d3e5bb940c"} Apr 22 17:34:12.956546 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954611 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-cni-netd\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.956546 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954693 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-multus-cni-dir\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.957029 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954731 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-etc-openvswitch\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.957029 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954769 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-cni-netd\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.957029 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954794 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-ovnkube-script-lib\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.957029 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954636 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-env-overrides\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.957029 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954869 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9db6f1f0-3473-4256-b588-7220e67210f2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:12.957029 
ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954923 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9db6f1f0-3473-4256-b588-7220e67210f2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:12.957029 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.954951 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6c5f2f61-7b2a-45e4-b641-854237b19df4-multus-daemon-config\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.957029 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.955191 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4ptb\" (UniqueName: \"kubernetes.io/projected/9db6f1f0-3473-4256-b588-7220e67210f2-kube-api-access-n4ptb\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:12.957029 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.955241 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/25234a3f-5bb9-424d-9c97-e481fea97fb1-iptables-alerter-script\") pod \"iptables-alerter-wpwkz\" (UID: \"25234a3f-5bb9-424d-9c97-e481fea97fb1\") " pod="openshift-network-operator/iptables-alerter-wpwkz" Apr 22 17:34:12.957029 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.955247 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4-tmp-dir\") pod 
\"node-resolver-cznmj\" (UID: \"5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4\") " pod="openshift-dns/node-resolver-cznmj" Apr 22 17:34:12.957029 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.955275 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-host-run-netns\") pod \"multus-tg2s7\" (UID: \"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.957029 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.955319 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6f56fd06-5b7e-4a95-b2ba-3604f018b678-agent-certs\") pod \"konnectivity-agent-jb9vl\" (UID: \"6f56fd06-5b7e-4a95-b2ba-3604f018b678\") " pod="kube-system/konnectivity-agent-jb9vl" Apr 22 17:34:12.957029 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.955313 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-env-overrides\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.957029 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.955350 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtnc8\" (UniqueName: \"kubernetes.io/projected/25234a3f-5bb9-424d-9c97-e481fea97fb1-kube-api-access-wtnc8\") pod \"iptables-alerter-wpwkz\" (UID: \"25234a3f-5bb9-424d-9c97-e481fea97fb1\") " pod="openshift-network-operator/iptables-alerter-wpwkz" Apr 22 17:34:12.957029 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.955388 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6c5f2f61-7b2a-45e4-b641-854237b19df4-host-run-netns\") pod \"multus-tg2s7\" (UID: 
\"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.957029 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.955397 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-kubelet\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.957029 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.955515 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4-tmp-dir\") pod \"node-resolver-cznmj\" (UID: \"5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4\") " pod="openshift-dns/node-resolver-cznmj" Apr 22 17:34:12.957521 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.955596 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-host-kubelet\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.957521 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.955966 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/66574196-7d35-49dd-9f6e-038a6df72a46-etc-tuned\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.957521 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.956153 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal" event={"ID":"76b662887d5a0620f946a7b7cdcbb27a","Type":"ContainerStarted","Data":"bf1e1120da067e81a48db35cab5e8ecdefb44594129a1e9481f8790340c4691e"} Apr 
22 17:34:12.957521 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.956609 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-ovn-node-metrics-cert\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.957521 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.956715 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/66574196-7d35-49dd-9f6e-038a6df72a46-tmp\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.957950 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.957935 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6f56fd06-5b7e-4a95-b2ba-3604f018b678-agent-certs\") pod \"konnectivity-agent-jb9vl\" (UID: \"6f56fd06-5b7e-4a95-b2ba-3604f018b678\") " pod="kube-system/konnectivity-agent-jb9vl" Apr 22 17:34:12.958980 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:12.958965 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:34:12.959021 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:12.958983 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:34:12.959021 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:12.958992 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6ds5g for pod openshift-network-diagnostics/network-check-target-69d82: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:12.959076 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:12.959040 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g podName:8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:13.45902909 +0000 UTC m=+2.075121341 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6ds5g" (UniqueName: "kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g") pod "network-check-target-69d82" (UID: "8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:12.961431 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.961392 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpzhl\" (UniqueName: \"kubernetes.io/projected/5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4-kube-api-access-qpzhl\") pod \"node-resolver-cznmj\" (UID: \"5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4\") " pod="openshift-dns/node-resolver-cznmj" Apr 22 17:34:12.961552 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.961535 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr8cv\" (UniqueName: \"kubernetes.io/projected/66574196-7d35-49dd-9f6e-038a6df72a46-kube-api-access-xr8cv\") pod \"tuned-6mrgr\" (UID: \"66574196-7d35-49dd-9f6e-038a6df72a46\") " pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:12.962454 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.962415 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx6fh\" (UniqueName: \"kubernetes.io/projected/6c5f2f61-7b2a-45e4-b641-854237b19df4-kube-api-access-zx6fh\") pod \"multus-tg2s7\" (UID: 
\"6c5f2f61-7b2a-45e4-b641-854237b19df4\") " pod="openshift-multus/multus-tg2s7" Apr 22 17:34:12.962596 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.962455 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5bg6\" (UniqueName: \"kubernetes.io/projected/fdae7c8c-332a-4c29-a187-6ca9d4968e2b-kube-api-access-g5bg6\") pod \"aws-ebs-csi-driver-node-prgnx\" (UID: \"fdae7c8c-332a-4c29-a187-6ca9d4968e2b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx" Apr 22 17:34:12.962797 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.962780 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnv7x\" (UniqueName: \"kubernetes.io/projected/3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307-kube-api-access-dnv7x\") pod \"ovnkube-node-2b8mb\" (UID: \"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:12.963918 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.963903 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2rxl\" (UniqueName: \"kubernetes.io/projected/26bfba8d-71c2-4440-8bc9-7e5759e58f9f-kube-api-access-p2rxl\") pod \"node-ca-hspjm\" (UID: \"26bfba8d-71c2-4440-8bc9-7e5759e58f9f\") " pod="openshift-image-registry/node-ca-hspjm" Apr 22 17:34:12.964177 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.964157 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4lvw\" (UniqueName: \"kubernetes.io/projected/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-kube-api-access-f4lvw\") pod \"network-metrics-daemon-fd4w6\" (UID: \"81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8\") " pod="openshift-multus/network-metrics-daemon-fd4w6" Apr 22 17:34:12.964501 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:12.964486 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtnc8\" (UniqueName: 
\"kubernetes.io/projected/25234a3f-5bb9-424d-9c97-e481fea97fb1-kube-api-access-wtnc8\") pod \"iptables-alerter-wpwkz\" (UID: \"25234a3f-5bb9-424d-9c97-e481fea97fb1\") " pod="openshift-network-operator/iptables-alerter-wpwkz" Apr 22 17:34:13.056409 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.056338 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9db6f1f0-3473-4256-b588-7220e67210f2-os-release\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:13.056409 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.056368 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9db6f1f0-3473-4256-b588-7220e67210f2-cni-binary-copy\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:13.056409 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.056384 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9db6f1f0-3473-4256-b588-7220e67210f2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:13.056409 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.056403 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9db6f1f0-3473-4256-b588-7220e67210f2-cnibin\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:13.056723 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.056444 
2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9db6f1f0-3473-4256-b588-7220e67210f2-system-cni-dir\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:13.056723 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.056492 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9db6f1f0-3473-4256-b588-7220e67210f2-system-cni-dir\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:13.056723 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.056507 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9db6f1f0-3473-4256-b588-7220e67210f2-os-release\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:13.056723 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.056518 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9db6f1f0-3473-4256-b588-7220e67210f2-cnibin\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:13.056723 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.056506 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9db6f1f0-3473-4256-b588-7220e67210f2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " 
pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:13.056723 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.056569 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9db6f1f0-3473-4256-b588-7220e67210f2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:13.056723 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.056607 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9db6f1f0-3473-4256-b588-7220e67210f2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:13.056723 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.056622 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4ptb\" (UniqueName: \"kubernetes.io/projected/9db6f1f0-3473-4256-b588-7220e67210f2-kube-api-access-n4ptb\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:13.056989 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.056961 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9db6f1f0-3473-4256-b588-7220e67210f2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:13.057025 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.056995 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9db6f1f0-3473-4256-b588-7220e67210f2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:13.057411 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.057396 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9db6f1f0-3473-4256-b588-7220e67210f2-cni-binary-copy\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:13.066079 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.066059 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4ptb\" (UniqueName: \"kubernetes.io/projected/9db6f1f0-3473-4256-b588-7220e67210f2-kube-api-access-n4ptb\") pod \"multus-additional-cni-plugins-z45jm\" (UID: \"9db6f1f0-3473-4256-b588-7220e67210f2\") " pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:13.170220 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.170184 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" Apr 22 17:34:13.175887 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.175862 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-hspjm" Apr 22 17:34:13.176519 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:13.176454 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66574196_7d35_49dd_9f6e_038a6df72a46.slice/crio-e4e33468e58dc5139b491283f9fde082e83b33893bf8a3801fb10710f31eaa82 WatchSource:0}: Error finding container e4e33468e58dc5139b491283f9fde082e83b33893bf8a3801fb10710f31eaa82: Status 404 returned error can't find the container with id e4e33468e58dc5139b491283f9fde082e83b33893bf8a3801fb10710f31eaa82 Apr 22 17:34:13.183521 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.183495 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tg2s7" Apr 22 17:34:13.185511 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:13.185482 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26bfba8d_71c2_4440_8bc9_7e5759e58f9f.slice/crio-ba1472046a2a883fabefee6091c3703def815bcee78cbed88b43650d83fca101 WatchSource:0}: Error finding container ba1472046a2a883fabefee6091c3703def815bcee78cbed88b43650d83fca101: Status 404 returned error can't find the container with id ba1472046a2a883fabefee6091c3703def815bcee78cbed88b43650d83fca101 Apr 22 17:34:13.187433 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.187400 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-wpwkz" Apr 22 17:34:13.190756 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:13.190732 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c5f2f61_7b2a_45e4_b641_854237b19df4.slice/crio-d7e85790ae5c501deb972223c953136969785e372dca6e5077e11b17786160c0 WatchSource:0}: Error finding container d7e85790ae5c501deb972223c953136969785e372dca6e5077e11b17786160c0: Status 404 returned error can't find the container with id d7e85790ae5c501deb972223c953136969785e372dca6e5077e11b17786160c0 Apr 22 17:34:13.191986 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.191932 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:13.196136 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:13.196114 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25234a3f_5bb9_424d_9c97_e481fea97fb1.slice/crio-8c550ede7caf8204f1c3d61a35c5579e26a581f55a5822582ce7020572ffbf0b WatchSource:0}: Error finding container 8c550ede7caf8204f1c3d61a35c5579e26a581f55a5822582ce7020572ffbf0b: Status 404 returned error can't find the container with id 8c550ede7caf8204f1c3d61a35c5579e26a581f55a5822582ce7020572ffbf0b Apr 22 17:34:13.197349 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.197331 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-jb9vl" Apr 22 17:34:13.201018 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:13.200838 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ca8f0dc_e6b3_42f9_bbfe_f1e0f1d31307.slice/crio-11dc32a44add6ce286a47e57492b930c2f8a28af40a5f8f99dce9cc3a4d0dafd WatchSource:0}: Error finding container 11dc32a44add6ce286a47e57492b930c2f8a28af40a5f8f99dce9cc3a4d0dafd: Status 404 returned error can't find the container with id 11dc32a44add6ce286a47e57492b930c2f8a28af40a5f8f99dce9cc3a4d0dafd Apr 22 17:34:13.202689 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.202662 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx" Apr 22 17:34:13.207981 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:13.207960 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f56fd06_5b7e_4a95_b2ba_3604f018b678.slice/crio-94afab72bff5f2cb6bba1be5adae1d9b74f8e232049e2d8a02e77513743d921e WatchSource:0}: Error finding container 94afab72bff5f2cb6bba1be5adae1d9b74f8e232049e2d8a02e77513743d921e: Status 404 returned error can't find the container with id 94afab72bff5f2cb6bba1be5adae1d9b74f8e232049e2d8a02e77513743d921e Apr 22 17:34:13.208834 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.208803 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-cznmj" Apr 22 17:34:13.212801 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:13.212774 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdae7c8c_332a_4c29_a187_6ca9d4968e2b.slice/crio-5e5214acad8629cce7e2f8439f78040def28c152b444d0529fa4a391cc239cfe WatchSource:0}: Error finding container 5e5214acad8629cce7e2f8439f78040def28c152b444d0529fa4a391cc239cfe: Status 404 returned error can't find the container with id 5e5214acad8629cce7e2f8439f78040def28c152b444d0529fa4a391cc239cfe Apr 22 17:34:13.212922 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.212855 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z45jm" Apr 22 17:34:13.220122 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:13.220080 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b0d2e46_ad7d_48ff_9531_6e09d8c07bc4.slice/crio-99a7541b8e4725eed968c25f6567a5a57f4daeac676f86032623e56f5423e0b6 WatchSource:0}: Error finding container 99a7541b8e4725eed968c25f6567a5a57f4daeac676f86032623e56f5423e0b6: Status 404 returned error can't find the container with id 99a7541b8e4725eed968c25f6567a5a57f4daeac676f86032623e56f5423e0b6 Apr 22 17:34:13.223976 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:13.223953 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9db6f1f0_3473_4256_b588_7220e67210f2.slice/crio-94f4988f8431c026a203ef6532adc0ca183bb3eb3f58568a8e7f16181c02da2c WatchSource:0}: Error finding container 94f4988f8431c026a203ef6532adc0ca183bb3eb3f58568a8e7f16181c02da2c: Status 404 returned error can't find the container with id 94f4988f8431c026a203ef6532adc0ca183bb3eb3f58568a8e7f16181c02da2c Apr 22 17:34:13.459230 ip-10-0-143-54 
kubenswrapper[2578]: I0422 17:34:13.459141 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ds5g\" (UniqueName: \"kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g\") pod \"network-check-target-69d82\" (UID: \"8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6\") " pod="openshift-network-diagnostics/network-check-target-69d82" Apr 22 17:34:13.459230 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.459198 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs\") pod \"network-metrics-daemon-fd4w6\" (UID: \"81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8\") " pod="openshift-multus/network-metrics-daemon-fd4w6" Apr 22 17:34:13.459445 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:13.459317 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:13.459445 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:13.459384 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs podName:81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:14.459363884 +0000 UTC m=+3.075456141 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs") pod "network-metrics-daemon-fd4w6" (UID: "81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:13.459785 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:13.459318 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:34:13.459909 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:13.459798 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:34:13.459909 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:13.459815 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6ds5g for pod openshift-network-diagnostics/network-check-target-69d82: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:13.459909 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:13.459873 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g podName:8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:14.459853163 +0000 UTC m=+3.075945427 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6ds5g" (UniqueName: "kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g") pod "network-check-target-69d82" (UID: "8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:13.585440 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.585393 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:34:13.668440 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.665068 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:34:13.880633 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.880433 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:29:12 +0000 UTC" deadline="2027-10-31 09:18:02.539792829 +0000 UTC" Apr 22 17:34:13.880633 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.880470 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13359h43m48.659327132s" Apr 22 17:34:13.953934 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.953302 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd4w6" Apr 22 17:34:13.953934 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:13.953462 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd4w6" podUID="81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8" Apr 22 17:34:13.975774 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.975733 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cznmj" event={"ID":"5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4","Type":"ContainerStarted","Data":"99a7541b8e4725eed968c25f6567a5a57f4daeac676f86032623e56f5423e0b6"} Apr 22 17:34:13.980657 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.980624 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wpwkz" event={"ID":"25234a3f-5bb9-424d-9c97-e481fea97fb1","Type":"ContainerStarted","Data":"8c550ede7caf8204f1c3d61a35c5579e26a581f55a5822582ce7020572ffbf0b"} Apr 22 17:34:13.992275 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.992245 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tg2s7" event={"ID":"6c5f2f61-7b2a-45e4-b641-854237b19df4","Type":"ContainerStarted","Data":"d7e85790ae5c501deb972223c953136969785e372dca6e5077e11b17786160c0"} Apr 22 17:34:13.995527 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:13.995498 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hspjm" event={"ID":"26bfba8d-71c2-4440-8bc9-7e5759e58f9f","Type":"ContainerStarted","Data":"ba1472046a2a883fabefee6091c3703def815bcee78cbed88b43650d83fca101"} Apr 22 17:34:14.005040 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:14.005009 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" event={"ID":"66574196-7d35-49dd-9f6e-038a6df72a46","Type":"ContainerStarted","Data":"e4e33468e58dc5139b491283f9fde082e83b33893bf8a3801fb10710f31eaa82"} Apr 22 17:34:14.010576 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:14.010549 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z45jm" 
event={"ID":"9db6f1f0-3473-4256-b588-7220e67210f2","Type":"ContainerStarted","Data":"94f4988f8431c026a203ef6532adc0ca183bb3eb3f58568a8e7f16181c02da2c"} Apr 22 17:34:14.015490 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:14.015463 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx" event={"ID":"fdae7c8c-332a-4c29-a187-6ca9d4968e2b","Type":"ContainerStarted","Data":"5e5214acad8629cce7e2f8439f78040def28c152b444d0529fa4a391cc239cfe"} Apr 22 17:34:14.020333 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:14.020307 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jb9vl" event={"ID":"6f56fd06-5b7e-4a95-b2ba-3604f018b678","Type":"ContainerStarted","Data":"94afab72bff5f2cb6bba1be5adae1d9b74f8e232049e2d8a02e77513743d921e"} Apr 22 17:34:14.024550 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:14.024500 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" event={"ID":"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307","Type":"ContainerStarted","Data":"11dc32a44add6ce286a47e57492b930c2f8a28af40a5f8f99dce9cc3a4d0dafd"} Apr 22 17:34:14.273096 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:14.272988 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:34:14.468699 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:14.468660 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs\") pod \"network-metrics-daemon-fd4w6\" (UID: \"81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8\") " pod="openshift-multus/network-metrics-daemon-fd4w6" Apr 22 17:34:14.468900 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:14.468753 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ds5g\" (UniqueName: 
\"kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g\") pod \"network-check-target-69d82\" (UID: \"8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6\") " pod="openshift-network-diagnostics/network-check-target-69d82" Apr 22 17:34:14.468964 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:14.468912 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:34:14.468964 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:14.468929 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:34:14.468964 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:14.468943 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6ds5g for pod openshift-network-diagnostics/network-check-target-69d82: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:14.469124 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:14.469000 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g podName:8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:16.468982213 +0000 UTC m=+5.085074466 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6ds5g" (UniqueName: "kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g") pod "network-check-target-69d82" (UID: "8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:14.469504 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:14.469454 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:14.469598 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:14.469509 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs podName:81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:16.469493794 +0000 UTC m=+5.085586054 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs") pod "network-metrics-daemon-fd4w6" (UID: "81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:14.881246 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:14.881200 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:29:12 +0000 UTC" deadline="2028-01-05 17:24:50.744026252 +0000 UTC" Apr 22 17:34:14.881246 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:14.881242 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14951h50m35.86278875s" Apr 22 17:34:14.951723 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:14.951688 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69d82" Apr 22 17:34:14.951919 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:14.951821 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69d82" podUID="8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6" Apr 22 17:34:15.953109 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:15.953077 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd4w6" Apr 22 17:34:15.953547 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:15.953224 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd4w6" podUID="81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8"
Apr 22 17:34:16.485820 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:16.485783 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs\") pod \"network-metrics-daemon-fd4w6\" (UID: \"81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8\") " pod="openshift-multus/network-metrics-daemon-fd4w6"
Apr 22 17:34:16.486021 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:16.485863 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ds5g\" (UniqueName: \"kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g\") pod \"network-check-target-69d82\" (UID: \"8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6\") " pod="openshift-network-diagnostics/network-check-target-69d82"
Apr 22 17:34:16.486021 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:16.485940 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:16.486021 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:16.485985 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:34:16.486021 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:16.486002 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:34:16.486021 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:16.486007 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs podName:81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:20.485986664 +0000 UTC m=+9.102078927 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs") pod "network-metrics-daemon-fd4w6" (UID: "81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:16.486021 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:16.486017 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6ds5g for pod openshift-network-diagnostics/network-check-target-69d82: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:16.486345 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:16.486048 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g podName:8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:20.486038981 +0000 UTC m=+9.102131234 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-6ds5g" (UniqueName: "kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g") pod "network-check-target-69d82" (UID: "8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:16.951333 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:16.951243 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69d82"
Apr 22 17:34:16.951519 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:16.951464 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69d82" podUID="8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6"
Apr 22 17:34:17.955069 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:17.955037 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd4w6"
Apr 22 17:34:17.955567 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:17.955174 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd4w6" podUID="81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8"
Apr 22 17:34:18.951691 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:18.951654 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69d82"
Apr 22 17:34:18.951880 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:18.951797 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69d82" podUID="8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6"
Apr 22 17:34:19.955415 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:19.955383 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd4w6"
Apr 22 17:34:19.955874 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:19.955544 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd4w6" podUID="81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8"
Apr 22 17:34:20.522558 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:20.522309 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs\") pod \"network-metrics-daemon-fd4w6\" (UID: \"81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8\") " pod="openshift-multus/network-metrics-daemon-fd4w6"
Apr 22 17:34:20.522558 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:20.522390 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ds5g\" (UniqueName: \"kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g\") pod \"network-check-target-69d82\" (UID: \"8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6\") " pod="openshift-network-diagnostics/network-check-target-69d82"
Apr 22 17:34:20.522558 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:20.522479 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:20.522558 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:20.522535 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:34:20.522558 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:20.522548 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs podName:81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:28.522527802 +0000 UTC m=+17.138620055 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs") pod "network-metrics-daemon-fd4w6" (UID: "81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:20.522558 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:20.522553 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:34:20.522558 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:20.522567 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6ds5g for pod openshift-network-diagnostics/network-check-target-69d82: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:20.522962 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:20.522613 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g podName:8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:28.522600081 +0000 UTC m=+17.138692347 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-6ds5g" (UniqueName: "kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g") pod "network-check-target-69d82" (UID: "8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:20.951642 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:20.951562 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69d82"
Apr 22 17:34:20.951801 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:20.951763 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69d82" podUID="8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6"
Apr 22 17:34:21.956717 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:21.956685 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd4w6"
Apr 22 17:34:21.957178 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:21.956817 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd4w6" podUID="81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8"
Apr 22 17:34:22.950794 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:22.950753 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69d82"
Apr 22 17:34:22.950969 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:22.950923 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69d82" podUID="8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6"
Apr 22 17:34:23.954359 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:23.954326 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd4w6"
Apr 22 17:34:23.954760 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:23.954462 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd4w6" podUID="81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8"
Apr 22 17:34:24.950853 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:24.950821 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69d82"
Apr 22 17:34:24.951105 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:24.950936 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69d82" podUID="8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6"
Apr 22 17:34:25.951188 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:25.951154 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd4w6"
Apr 22 17:34:25.951650 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:25.951283 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd4w6" podUID="81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8"
Apr 22 17:34:26.951090 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:26.951054 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69d82"
Apr 22 17:34:26.951289 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:26.951183 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69d82" podUID="8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6"
Apr 22 17:34:27.333253 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:27.333216 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-s4xq4"]
Apr 22 17:34:27.335662 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:27.335642 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s4xq4"
Apr 22 17:34:27.335766 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:27.335734 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s4xq4" podUID="8086e0dd-7b2c-4fb8-bb9c-e3554698418a"
Apr 22 17:34:27.372762 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:27.372724 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret\") pod \"global-pull-secret-syncer-s4xq4\" (UID: \"8086e0dd-7b2c-4fb8-bb9c-e3554698418a\") " pod="kube-system/global-pull-secret-syncer-s4xq4"
Apr 22 17:34:27.372920 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:27.372809 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-dbus\") pod \"global-pull-secret-syncer-s4xq4\" (UID: \"8086e0dd-7b2c-4fb8-bb9c-e3554698418a\") " pod="kube-system/global-pull-secret-syncer-s4xq4"
Apr 22 17:34:27.372920 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:27.372857 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-kubelet-config\") pod \"global-pull-secret-syncer-s4xq4\" (UID: \"8086e0dd-7b2c-4fb8-bb9c-e3554698418a\") " pod="kube-system/global-pull-secret-syncer-s4xq4"
Apr 22 17:34:27.474285 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:27.474246 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-dbus\") pod \"global-pull-secret-syncer-s4xq4\" (UID: \"8086e0dd-7b2c-4fb8-bb9c-e3554698418a\") " pod="kube-system/global-pull-secret-syncer-s4xq4"
Apr 22 17:34:27.474474 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:27.474308 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-kubelet-config\") pod \"global-pull-secret-syncer-s4xq4\" (UID: \"8086e0dd-7b2c-4fb8-bb9c-e3554698418a\") " pod="kube-system/global-pull-secret-syncer-s4xq4"
Apr 22 17:34:27.474474 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:27.474352 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret\") pod \"global-pull-secret-syncer-s4xq4\" (UID: \"8086e0dd-7b2c-4fb8-bb9c-e3554698418a\") " pod="kube-system/global-pull-secret-syncer-s4xq4"
Apr 22 17:34:27.474474 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:27.474455 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-dbus\") pod \"global-pull-secret-syncer-s4xq4\" (UID: \"8086e0dd-7b2c-4fb8-bb9c-e3554698418a\") " pod="kube-system/global-pull-secret-syncer-s4xq4"
Apr 22 17:34:27.474616 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:27.474486 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:34:27.474616 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:27.474456 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-kubelet-config\") pod \"global-pull-secret-syncer-s4xq4\" (UID: \"8086e0dd-7b2c-4fb8-bb9c-e3554698418a\") " pod="kube-system/global-pull-secret-syncer-s4xq4"
Apr 22 17:34:27.474616 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:27.474546 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret podName:8086e0dd-7b2c-4fb8-bb9c-e3554698418a nodeName:}" failed. No retries permitted until 2026-04-22 17:34:27.97452726 +0000 UTC m=+16.590619515 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret") pod "global-pull-secret-syncer-s4xq4" (UID: "8086e0dd-7b2c-4fb8-bb9c-e3554698418a") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:34:27.951885 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:27.951852 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd4w6"
Apr 22 17:34:27.952328 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:27.951981 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd4w6" podUID="81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8"
Apr 22 17:34:27.977802 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:27.977765 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret\") pod \"global-pull-secret-syncer-s4xq4\" (UID: \"8086e0dd-7b2c-4fb8-bb9c-e3554698418a\") " pod="kube-system/global-pull-secret-syncer-s4xq4"
Apr 22 17:34:27.977956 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:27.977935 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:34:27.978027 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:27.978015 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret podName:8086e0dd-7b2c-4fb8-bb9c-e3554698418a nodeName:}" failed. No retries permitted until 2026-04-22 17:34:28.9779938 +0000 UTC m=+17.594086066 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret") pod "global-pull-secret-syncer-s4xq4" (UID: "8086e0dd-7b2c-4fb8-bb9c-e3554698418a") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:34:28.582525 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:28.582487 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs\") pod \"network-metrics-daemon-fd4w6\" (UID: \"81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8\") " pod="openshift-multus/network-metrics-daemon-fd4w6"
Apr 22 17:34:28.582730 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:28.582578 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ds5g\" (UniqueName: \"kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g\") pod \"network-check-target-69d82\" (UID: \"8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6\") " pod="openshift-network-diagnostics/network-check-target-69d82"
Apr 22 17:34:28.582730 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:28.582640 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:28.582730 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:28.582716 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs podName:81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:44.582694383 +0000 UTC m=+33.198786633 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs") pod "network-metrics-daemon-fd4w6" (UID: "81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:28.582730 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:28.582720 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:34:28.582922 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:28.582738 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:34:28.582922 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:28.582751 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6ds5g for pod openshift-network-diagnostics/network-check-target-69d82: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:28.582922 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:28.582812 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g podName:8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:44.582795496 +0000 UTC m=+33.198887749 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-6ds5g" (UniqueName: "kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g") pod "network-check-target-69d82" (UID: "8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:28.951208 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:28.951127 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s4xq4"
Apr 22 17:34:28.951388 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:28.951128 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69d82"
Apr 22 17:34:28.951388 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:28.951266 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s4xq4" podUID="8086e0dd-7b2c-4fb8-bb9c-e3554698418a"
Apr 22 17:34:28.951388 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:28.951344 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69d82" podUID="8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6"
Apr 22 17:34:28.985491 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:28.985456 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret\") pod \"global-pull-secret-syncer-s4xq4\" (UID: \"8086e0dd-7b2c-4fb8-bb9c-e3554698418a\") " pod="kube-system/global-pull-secret-syncer-s4xq4"
Apr 22 17:34:28.985980 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:28.985586 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:34:28.985980 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:28.985654 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret podName:8086e0dd-7b2c-4fb8-bb9c-e3554698418a nodeName:}" failed. No retries permitted until 2026-04-22 17:34:30.985638559 +0000 UTC m=+19.601730817 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret") pod "global-pull-secret-syncer-s4xq4" (UID: "8086e0dd-7b2c-4fb8-bb9c-e3554698418a") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:34:29.951766 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:29.951729 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd4w6"
Apr 22 17:34:29.951997 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:29.951856 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd4w6" podUID="81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8"
Apr 22 17:34:30.951227 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:30.951195 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69d82"
Apr 22 17:34:30.951703 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:30.951195 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s4xq4"
Apr 22 17:34:30.951703 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:30.951326 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69d82" podUID="8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6"
Apr 22 17:34:30.951703 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:30.951408 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s4xq4" podUID="8086e0dd-7b2c-4fb8-bb9c-e3554698418a"
Apr 22 17:34:30.999925 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:30.999893 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret\") pod \"global-pull-secret-syncer-s4xq4\" (UID: \"8086e0dd-7b2c-4fb8-bb9c-e3554698418a\") " pod="kube-system/global-pull-secret-syncer-s4xq4"
Apr 22 17:34:31.000066 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:31.000045 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:34:31.000126 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:31.000113 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret podName:8086e0dd-7b2c-4fb8-bb9c-e3554698418a nodeName:}" failed. No retries permitted until 2026-04-22 17:34:35.000098146 +0000 UTC m=+23.616190402 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret") pod "global-pull-secret-syncer-s4xq4" (UID: "8086e0dd-7b2c-4fb8-bb9c-e3554698418a") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:34:31.069744 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:31.069630 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tg2s7" event={"ID":"6c5f2f61-7b2a-45e4-b641-854237b19df4","Type":"ContainerStarted","Data":"0c24f49fbc36e6c055c80573b86ad3338d87f25f2c0dd904f19c9cfe13639e66"}
Apr 22 17:34:31.071582 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:31.071538 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" event={"ID":"66574196-7d35-49dd-9f6e-038a6df72a46","Type":"ContainerStarted","Data":"c88270479908116eca2effa563c98f1b4322476c0b5db31645ce9d776ea79773"}
Apr 22 17:34:31.084555 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:31.084519 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-54.ec2.internal" event={"ID":"1346330c188515bccf6ba5d0ec452e1d","Type":"ContainerStarted","Data":"8782c429ca505f946591f87e24889ff4f22bdaa9f5804615fe6977f6c1e0eceb"}
Apr 22 17:34:31.096761 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:31.096737 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovn-acl-logging/0.log"
Apr 22 17:34:31.097162 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:31.097134 2578 generic.go:358] "Generic (PLEG): container finished" podID="3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307" containerID="fae8aee462199431dcd6c7fe238a52e578688765f8dd32fa69d72a1de41e3516" exitCode=1
Apr 22 17:34:31.097237 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:31.097182 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" event={"ID":"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307","Type":"ContainerStarted","Data":"761dd15e21c545dadda4ee1bf190190364bb7fa25af248ca2d33afc3164e42e2"}
Apr 22 17:34:31.097237 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:31.097209 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" event={"ID":"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307","Type":"ContainerStarted","Data":"9dede4eaca8c961d234386530d9521a83d2f73025f9ce74f4edc53793df4cdc6"}
Apr 22 17:34:31.097237 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:31.097223 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" event={"ID":"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307","Type":"ContainerDied","Data":"fae8aee462199431dcd6c7fe238a52e578688765f8dd32fa69d72a1de41e3516"}
Apr 22 17:34:31.097390 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:31.097241 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" event={"ID":"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307","Type":"ContainerStarted","Data":"aae6536e6bacdad20099987c41fca1d04fe7024c70f018b778a96f732367f7db"}
Apr 22 17:34:31.122886 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:31.122805 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-tg2s7" podStartSLOduration=1.535223037 podStartE2EDuration="19.122784476s" podCreationTimestamp="2026-04-22 17:34:12 +0000 UTC" firstStartedPulling="2026-04-22 17:34:13.193073773 +0000 UTC m=+1.809166030" lastFinishedPulling="2026-04-22 17:34:30.780635208 +0000 UTC m=+19.396727469" observedRunningTime="2026-04-22 17:34:31.122056629 +0000 UTC m=+19.738148925" watchObservedRunningTime="2026-04-22 17:34:31.122784476 +0000 UTC m=+19.738876750"
Apr 22 17:34:31.191910 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:31.191860 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6mrgr" podStartSLOduration=1.8248844640000002 podStartE2EDuration="19.19183917s" podCreationTimestamp="2026-04-22 17:34:12 +0000 UTC" firstStartedPulling="2026-04-22 17:34:13.180937591 +0000 UTC m=+1.797029844" lastFinishedPulling="2026-04-22 17:34:30.547892283 +0000 UTC m=+19.163984550" observedRunningTime="2026-04-22 17:34:31.153905854 +0000 UTC m=+19.769998125" watchObservedRunningTime="2026-04-22 17:34:31.19183917 +0000 UTC m=+19.807931443"
Apr 22 17:34:31.192539 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:31.192479 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-54.ec2.internal" podStartSLOduration=19.192469805 podStartE2EDuration="19.192469805s" podCreationTimestamp="2026-04-22 17:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:34:31.191914746 +0000 UTC m=+19.808007020" watchObservedRunningTime="2026-04-22 17:34:31.192469805 +0000 UTC m=+19.808562080"
Apr 22 17:34:31.951558 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:31.951379 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd4w6"
Apr 22 17:34:31.952169 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:31.951638 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd4w6" podUID="81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8"
Apr 22 17:34:32.100504 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.100466 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cznmj" event={"ID":"5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4","Type":"ContainerStarted","Data":"77fe281722ffbd531a350c11dc32370e1cc27381ff1b59781277f9c01cc787ba"}
Apr 22 17:34:32.102391 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.102360 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wpwkz" event={"ID":"25234a3f-5bb9-424d-9c97-e481fea97fb1","Type":"ContainerStarted","Data":"e1eb2a247aaf2bbeee266337cddfa5d1be3fde962fc42a9dc168deebc5814e02"}
Apr 22 17:34:32.103720 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.103691 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hspjm" event={"ID":"26bfba8d-71c2-4440-8bc9-7e5759e58f9f","Type":"ContainerStarted","Data":"4eec96fcbb8ca6fbf7d799be7102e11a63872fb0051cd4691f50457dfca7daa4"}
Apr 22 17:34:32.105120 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.105095 2578 generic.go:358] "Generic (PLEG): container finished" podID="9db6f1f0-3473-4256-b588-7220e67210f2" containerID="daab2086ca736f1abd272f5977f6b0f0f8534bf664ae2dac481a1de6505dcce9" exitCode=0
Apr 22 17:34:32.105217 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.105171 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z45jm" event={"ID":"9db6f1f0-3473-4256-b588-7220e67210f2","Type":"ContainerDied","Data":"daab2086ca736f1abd272f5977f6b0f0f8534bf664ae2dac481a1de6505dcce9"}
Apr 22 17:34:32.106607 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.106568 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx" event={"ID":"fdae7c8c-332a-4c29-a187-6ca9d4968e2b","Type":"ContainerStarted","Data":"3a35945ab7ff552e24006307c9af702d51687a427b2a208e6d843abb43f2bd5e"}
Apr 22 17:34:32.107985 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.107968 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jb9vl" event={"ID":"6f56fd06-5b7e-4a95-b2ba-3604f018b678","Type":"ContainerStarted","Data":"25ebf84d122c45c3886575094830c1fb88e1ae2006a8baec79f5d480b626373d"}
Apr 22 17:34:32.110467 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.110451 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovn-acl-logging/0.log"
Apr 22 17:34:32.110798 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.110778 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" event={"ID":"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307","Type":"ContainerStarted","Data":"80458b82d0bfd28004b2677ad600ae2bfc777b95f7388b9f2b1c998fd4877ae1"}
Apr 22 17:34:32.110798 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.110800 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" event={"ID":"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307","Type":"ContainerStarted","Data":"f07b5fb2f2aea265541ba6be1fb356ba01d94782aa5ee1c0d6bde74df896a374"}
Apr 22 17:34:32.112228 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.112208 2578 generic.go:358] "Generic (PLEG): container finished" podID="76b662887d5a0620f946a7b7cdcbb27a" containerID="868b32d69733c99ffeaa9a49ec2fe5f5d406be7f13b30d92e3d31ebde94cc50a" exitCode=0
Apr 22 17:34:32.112345 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.112309 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal"
event={"ID":"76b662887d5a0620f946a7b7cdcbb27a","Type":"ContainerDied","Data":"868b32d69733c99ffeaa9a49ec2fe5f5d406be7f13b30d92e3d31ebde94cc50a"} Apr 22 17:34:32.116479 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.116443 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cznmj" podStartSLOduration=2.606135974 podStartE2EDuration="20.116413523s" podCreationTimestamp="2026-04-22 17:34:12 +0000 UTC" firstStartedPulling="2026-04-22 17:34:13.221796115 +0000 UTC m=+1.837888378" lastFinishedPulling="2026-04-22 17:34:30.732073659 +0000 UTC m=+19.348165927" observedRunningTime="2026-04-22 17:34:32.115948011 +0000 UTC m=+20.732040283" watchObservedRunningTime="2026-04-22 17:34:32.116413523 +0000 UTC m=+20.732505795" Apr 22 17:34:32.129666 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.129631 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-wpwkz" podStartSLOduration=2.781401506 podStartE2EDuration="20.129620791s" podCreationTimestamp="2026-04-22 17:34:12 +0000 UTC" firstStartedPulling="2026-04-22 17:34:13.198028376 +0000 UTC m=+1.814120650" lastFinishedPulling="2026-04-22 17:34:30.546247671 +0000 UTC m=+19.162339935" observedRunningTime="2026-04-22 17:34:32.129202155 +0000 UTC m=+20.745294431" watchObservedRunningTime="2026-04-22 17:34:32.129620791 +0000 UTC m=+20.745713104" Apr 22 17:34:32.156857 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.156818 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hspjm" podStartSLOduration=6.343349689 podStartE2EDuration="20.156807632s" podCreationTimestamp="2026-04-22 17:34:12 +0000 UTC" firstStartedPulling="2026-04-22 17:34:13.187600345 +0000 UTC m=+1.803692612" lastFinishedPulling="2026-04-22 17:34:27.001058273 +0000 UTC m=+15.617150555" observedRunningTime="2026-04-22 17:34:32.156719534 +0000 UTC m=+20.772811805" 
watchObservedRunningTime="2026-04-22 17:34:32.156807632 +0000 UTC m=+20.772899909" Apr 22 17:34:32.170868 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.170821 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-jb9vl" podStartSLOduration=2.834774592 podStartE2EDuration="20.170809985s" podCreationTimestamp="2026-04-22 17:34:12 +0000 UTC" firstStartedPulling="2026-04-22 17:34:13.210060863 +0000 UTC m=+1.826153112" lastFinishedPulling="2026-04-22 17:34:30.546096241 +0000 UTC m=+19.162188505" observedRunningTime="2026-04-22 17:34:32.17029679 +0000 UTC m=+20.786389064" watchObservedRunningTime="2026-04-22 17:34:32.170809985 +0000 UTC m=+20.786902257" Apr 22 17:34:32.425549 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.425526 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 17:34:32.906838 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.906690 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T17:34:32.425543959Z","UUID":"f0b17880-11fa-42fe-954a-361b2fc47522","Handler":null,"Name":"","Endpoint":""} Apr 22 17:34:32.909223 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.909200 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 17:34:32.909223 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.909225 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 17:34:32.951489 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.951459 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69d82" Apr 22 17:34:32.951661 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:32.951459 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s4xq4" Apr 22 17:34:32.951661 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:32.951589 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69d82" podUID="8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6" Apr 22 17:34:32.952055 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:32.951671 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-s4xq4" podUID="8086e0dd-7b2c-4fb8-bb9c-e3554698418a" Apr 22 17:34:33.116590 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:33.116484 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal" event={"ID":"76b662887d5a0620f946a7b7cdcbb27a","Type":"ContainerStarted","Data":"c29ae72ff012a1d5fe2ebe7b78c0107e97e99517ce0c909003373ea8cfb4b416"} Apr 22 17:34:33.118665 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:33.118376 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx" event={"ID":"fdae7c8c-332a-4c29-a187-6ca9d4968e2b","Type":"ContainerStarted","Data":"6d30e26d0ce4b771d8ce900da7a3dcc5c74908b208d482b92667bf4b5256ebae"} Apr 22 17:34:33.132750 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:33.132702 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-54.ec2.internal" podStartSLOduration=21.132684928 podStartE2EDuration="21.132684928s" podCreationTimestamp="2026-04-22 17:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:34:33.13205248 +0000 UTC m=+21.748144788" watchObservedRunningTime="2026-04-22 17:34:33.132684928 +0000 UTC m=+21.748777201" Apr 22 17:34:33.951158 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:33.950968 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd4w6" Apr 22 17:34:33.951343 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:33.951286 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd4w6" podUID="81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8" Apr 22 17:34:34.122822 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:34.122780 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx" event={"ID":"fdae7c8c-332a-4c29-a187-6ca9d4968e2b","Type":"ContainerStarted","Data":"a365c57fe8fe980b4c02117bfb1c99bf7d23b7fbdb9cf05021d13db761c78047"} Apr 22 17:34:34.125822 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:34.125798 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovn-acl-logging/0.log" Apr 22 17:34:34.126187 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:34.126154 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" event={"ID":"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307","Type":"ContainerStarted","Data":"eca42f7ada90868e6c578d0f09bc08c907d7313d5e9ac6a08b0e34c48312951c"} Apr 22 17:34:34.141591 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:34.141549 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prgnx" podStartSLOduration=2.1627167800000002 podStartE2EDuration="22.141532599s" podCreationTimestamp="2026-04-22 17:34:12 +0000 UTC" firstStartedPulling="2026-04-22 17:34:13.214858491 +0000 UTC m=+1.830950746" lastFinishedPulling="2026-04-22 17:34:33.193674307 +0000 UTC m=+21.809766565" 
observedRunningTime="2026-04-22 17:34:34.141246984 +0000 UTC m=+22.757339255" watchObservedRunningTime="2026-04-22 17:34:34.141532599 +0000 UTC m=+22.757624875" Apr 22 17:34:34.950963 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:34.950926 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69d82" Apr 22 17:34:34.951186 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:34.950926 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s4xq4" Apr 22 17:34:34.951186 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:34.951076 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69d82" podUID="8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6" Apr 22 17:34:34.951186 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:34.951143 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-s4xq4" podUID="8086e0dd-7b2c-4fb8-bb9c-e3554698418a" Apr 22 17:34:35.026294 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:35.026259 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret\") pod \"global-pull-secret-syncer-s4xq4\" (UID: \"8086e0dd-7b2c-4fb8-bb9c-e3554698418a\") " pod="kube-system/global-pull-secret-syncer-s4xq4" Apr 22 17:34:35.026496 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:35.026446 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:35.026559 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:35.026525 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret podName:8086e0dd-7b2c-4fb8-bb9c-e3554698418a nodeName:}" failed. No retries permitted until 2026-04-22 17:34:43.026502639 +0000 UTC m=+31.642594903 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret") pod "global-pull-secret-syncer-s4xq4" (UID: "8086e0dd-7b2c-4fb8-bb9c-e3554698418a") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:35.951774 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:35.951736 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd4w6" Apr 22 17:34:35.952273 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:35.951882 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd4w6" podUID="81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8" Apr 22 17:34:36.787663 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:36.787465 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-jb9vl" Apr 22 17:34:36.788114 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:36.788095 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-jb9vl" Apr 22 17:34:36.951739 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:36.951707 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s4xq4" Apr 22 17:34:36.951907 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:36.951707 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69d82" Apr 22 17:34:36.951907 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:36.951808 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s4xq4" podUID="8086e0dd-7b2c-4fb8-bb9c-e3554698418a" Apr 22 17:34:36.951907 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:36.951874 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-69d82" podUID="8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6" Apr 22 17:34:37.134194 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:37.134108 2578 generic.go:358] "Generic (PLEG): container finished" podID="9db6f1f0-3473-4256-b588-7220e67210f2" containerID="a0d92521de881de38ca0a7dcdc3884a8bf3ffa582f7e0775b53218ae289bbd3c" exitCode=0 Apr 22 17:34:37.134194 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:37.134179 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z45jm" event={"ID":"9db6f1f0-3473-4256-b588-7220e67210f2","Type":"ContainerDied","Data":"a0d92521de881de38ca0a7dcdc3884a8bf3ffa582f7e0775b53218ae289bbd3c"} Apr 22 17:34:37.137195 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:37.137177 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovn-acl-logging/0.log" Apr 22 17:34:37.137560 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:37.137539 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" event={"ID":"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307","Type":"ContainerStarted","Data":"677d260d2f5ceaf8088f95aca5efc31b97e1d23730388bd29cde6e7a4bf26d62"} Apr 22 17:34:37.137958 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:37.137943 2578 scope.go:117] "RemoveContainer" containerID="fae8aee462199431dcd6c7fe238a52e578688765f8dd32fa69d72a1de41e3516" Apr 22 17:34:37.951319 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:37.951291 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd4w6" Apr 22 17:34:37.951490 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:37.951410 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd4w6" podUID="81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8" Apr 22 17:34:38.145456 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:38.143519 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovn-acl-logging/0.log" Apr 22 17:34:38.146732 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:38.146675 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" event={"ID":"3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307","Type":"ContainerStarted","Data":"de67f6052d1fb806cb843590a4c275e07d6d52bdeda99c179d36edeea0f7215f"} Apr 22 17:34:38.147279 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:38.147257 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 17:34:38.147624 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:38.147556 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:38.147624 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:38.147603 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:38.164624 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:38.164603 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:38.164698 ip-10-0-143-54 kubenswrapper[2578]: I0422 
17:34:38.164668 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" Apr 22 17:34:38.200174 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:38.200127 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" podStartSLOduration=8.83817926 podStartE2EDuration="26.200113941s" podCreationTimestamp="2026-04-22 17:34:12 +0000 UTC" firstStartedPulling="2026-04-22 17:34:13.203245277 +0000 UTC m=+1.819337532" lastFinishedPulling="2026-04-22 17:34:30.565179955 +0000 UTC m=+19.181272213" observedRunningTime="2026-04-22 17:34:38.199757824 +0000 UTC m=+26.815850096" watchObservedRunningTime="2026-04-22 17:34:38.200113941 +0000 UTC m=+26.816206213" Apr 22 17:34:38.378892 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:38.378863 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-69d82"] Apr 22 17:34:38.379032 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:38.379010 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69d82" Apr 22 17:34:38.379146 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:38.379121 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-69d82" podUID="8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6" Apr 22 17:34:38.382578 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:38.382403 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-s4xq4"] Apr 22 17:34:38.382827 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:38.382806 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s4xq4" Apr 22 17:34:38.382930 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:38.382913 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s4xq4" podUID="8086e0dd-7b2c-4fb8-bb9c-e3554698418a" Apr 22 17:34:38.383138 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:38.383121 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fd4w6"] Apr 22 17:34:38.383246 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:38.383232 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd4w6" Apr 22 17:34:38.383367 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:38.383341 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fd4w6" podUID="81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8" Apr 22 17:34:39.149976 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:39.149935 2578 generic.go:358] "Generic (PLEG): container finished" podID="9db6f1f0-3473-4256-b588-7220e67210f2" containerID="172a5251c199b8e8080b1f77c8bd81d8471d137a4779f2e9d2b326d6b7faf65a" exitCode=0 Apr 22 17:34:39.150444 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:39.150026 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z45jm" event={"ID":"9db6f1f0-3473-4256-b588-7220e67210f2","Type":"ContainerDied","Data":"172a5251c199b8e8080b1f77c8bd81d8471d137a4779f2e9d2b326d6b7faf65a"} Apr 22 17:34:39.150444 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:39.150273 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 17:34:39.950864 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:39.950801 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd4w6" Apr 22 17:34:39.950864 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:39.950838 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69d82" Apr 22 17:34:39.951033 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:39.950936 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd4w6" podUID="81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8" Apr 22 17:34:39.951033 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:39.950978 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-s4xq4" Apr 22 17:34:39.951137 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:39.951056 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s4xq4" podUID="8086e0dd-7b2c-4fb8-bb9c-e3554698418a" Apr 22 17:34:39.951303 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:39.951266 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69d82" podUID="8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6" Apr 22 17:34:40.153622 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:40.153590 2578 generic.go:358] "Generic (PLEG): container finished" podID="9db6f1f0-3473-4256-b588-7220e67210f2" containerID="f826057ed660cbc7a23d08c8ad0ff67947bf278ad3c101cc104c7dbfebfe707e" exitCode=0 Apr 22 17:34:40.154066 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:40.153679 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z45jm" event={"ID":"9db6f1f0-3473-4256-b588-7220e67210f2","Type":"ContainerDied","Data":"f826057ed660cbc7a23d08c8ad0ff67947bf278ad3c101cc104c7dbfebfe707e"} Apr 22 17:34:40.154066 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:40.153839 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 17:34:40.613621 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:40.613585 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:34:40.859735 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:40.859703 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-jb9vl"
Apr 22 17:34:40.859914 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:40.859863 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 17:34:40.860375 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:40.860346 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-jb9vl"
Apr 22 17:34:41.173798 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:41.173735 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb" podUID="3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 22 17:34:41.952402 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:41.952316 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd4w6"
Apr 22 17:34:41.952402 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:41.952388 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69d82"
Apr 22 17:34:41.952402 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:41.952387 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s4xq4"
Apr 22 17:34:41.952689 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:41.952477 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69d82" podUID="8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6"
Apr 22 17:34:41.952689 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:41.952590 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s4xq4" podUID="8086e0dd-7b2c-4fb8-bb9c-e3554698418a"
Apr 22 17:34:41.952689 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:41.952661 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fd4w6" podUID="81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8"
Apr 22 17:34:43.086292 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.086252 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret\") pod \"global-pull-secret-syncer-s4xq4\" (UID: \"8086e0dd-7b2c-4fb8-bb9c-e3554698418a\") " pod="kube-system/global-pull-secret-syncer-s4xq4"
Apr 22 17:34:43.086805 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:43.086417 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:34:43.086805 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:43.086510 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret podName:8086e0dd-7b2c-4fb8-bb9c-e3554698418a nodeName:}" failed. No retries permitted until 2026-04-22 17:34:59.08648671 +0000 UTC m=+47.702578993 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret") pod "global-pull-secret-syncer-s4xq4" (UID: "8086e0dd-7b2c-4fb8-bb9c-e3554698418a") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:34:43.701190 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.701109 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-54.ec2.internal" event="NodeReady"
Apr 22 17:34:43.701356 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.701286 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 17:34:43.767086 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.767053 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cbffr"]
Apr 22 17:34:43.805435 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.805378 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jnl2w"]
Apr 22 17:34:43.805602 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.805516 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cbffr"
Apr 22 17:34:43.808548 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.808524 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dsn9k\""
Apr 22 17:34:43.808756 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.808722 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 17:34:43.808843 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.808727 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 17:34:43.808843 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.808758 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 17:34:43.826573 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.826540 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cbffr"]
Apr 22 17:34:43.826573 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.826568 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jnl2w"]
Apr 22 17:34:43.826699 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.826679 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:34:43.830344 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.830323 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7g4b5\""
Apr 22 17:34:43.830474 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.830375 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 17:34:43.830554 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.830479 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 17:34:43.892039 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.892008 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert\") pod \"ingress-canary-cbffr\" (UID: \"06b47e11-143b-4f36-b4a4-c16daaed8856\") " pod="openshift-ingress-canary/ingress-canary-cbffr"
Apr 22 17:34:43.892039 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.892041 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxh7w\" (UniqueName: \"kubernetes.io/projected/06b47e11-143b-4f36-b4a4-c16daaed8856-kube-api-access-rxh7w\") pod \"ingress-canary-cbffr\" (UID: \"06b47e11-143b-4f36-b4a4-c16daaed8856\") " pod="openshift-ingress-canary/ingress-canary-cbffr"
Apr 22 17:34:43.892252 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.892079 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slfqh\" (UniqueName: \"kubernetes.io/projected/03e17850-8d7a-4344-ad3f-eeff8ff1097d-kube-api-access-slfqh\") pod \"dns-default-jnl2w\" (UID: \"03e17850-8d7a-4344-ad3f-eeff8ff1097d\") " pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:34:43.892252 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.892173 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03e17850-8d7a-4344-ad3f-eeff8ff1097d-config-volume\") pod \"dns-default-jnl2w\" (UID: \"03e17850-8d7a-4344-ad3f-eeff8ff1097d\") " pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:34:43.892252 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.892216 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/03e17850-8d7a-4344-ad3f-eeff8ff1097d-tmp-dir\") pod \"dns-default-jnl2w\" (UID: \"03e17850-8d7a-4344-ad3f-eeff8ff1097d\") " pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:34:43.892398 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.892265 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls\") pod \"dns-default-jnl2w\" (UID: \"03e17850-8d7a-4344-ad3f-eeff8ff1097d\") " pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:34:43.951637 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.951564 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69d82"
Apr 22 17:34:43.951637 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.951628 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd4w6"
Apr 22 17:34:43.951853 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.951828 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s4xq4"
Apr 22 17:34:43.955052 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.954878 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kkblr\""
Apr 22 17:34:43.955052 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.954883 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 17:34:43.955052 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.954927 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 17:34:43.955052 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.954945 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 17:34:43.955052 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.954927 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 17:34:43.955451 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.955413 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lgj26\""
Apr 22 17:34:43.993441 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.993393 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert\") pod \"ingress-canary-cbffr\" (UID: \"06b47e11-143b-4f36-b4a4-c16daaed8856\") " pod="openshift-ingress-canary/ingress-canary-cbffr"
Apr 22 17:34:43.993568 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.993450 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxh7w\" (UniqueName: \"kubernetes.io/projected/06b47e11-143b-4f36-b4a4-c16daaed8856-kube-api-access-rxh7w\") pod \"ingress-canary-cbffr\" (UID: \"06b47e11-143b-4f36-b4a4-c16daaed8856\") " pod="openshift-ingress-canary/ingress-canary-cbffr"
Apr 22 17:34:43.993568 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.993484 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-slfqh\" (UniqueName: \"kubernetes.io/projected/03e17850-8d7a-4344-ad3f-eeff8ff1097d-kube-api-access-slfqh\") pod \"dns-default-jnl2w\" (UID: \"03e17850-8d7a-4344-ad3f-eeff8ff1097d\") " pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:34:43.993568 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.993514 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03e17850-8d7a-4344-ad3f-eeff8ff1097d-config-volume\") pod \"dns-default-jnl2w\" (UID: \"03e17850-8d7a-4344-ad3f-eeff8ff1097d\") " pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:34:43.993568 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:43.993544 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:34:43.993568 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.993551 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/03e17850-8d7a-4344-ad3f-eeff8ff1097d-tmp-dir\") pod \"dns-default-jnl2w\" (UID: \"03e17850-8d7a-4344-ad3f-eeff8ff1097d\") " pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:34:43.993769 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.993590 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls\") pod \"dns-default-jnl2w\" (UID: \"03e17850-8d7a-4344-ad3f-eeff8ff1097d\") " pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:34:43.993769 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:43.993610 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert podName:06b47e11-143b-4f36-b4a4-c16daaed8856 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:44.493589917 +0000 UTC m=+33.109682194 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert") pod "ingress-canary-cbffr" (UID: "06b47e11-143b-4f36-b4a4-c16daaed8856") : secret "canary-serving-cert" not found
Apr 22 17:34:43.993769 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:43.993655 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:34:43.993769 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:43.993697 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls podName:03e17850-8d7a-4344-ad3f-eeff8ff1097d nodeName:}" failed. No retries permitted until 2026-04-22 17:34:44.493683131 +0000 UTC m=+33.109775394 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls") pod "dns-default-jnl2w" (UID: "03e17850-8d7a-4344-ad3f-eeff8ff1097d") : secret "dns-default-metrics-tls" not found
Apr 22 17:34:43.994165 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:43.994138 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/03e17850-8d7a-4344-ad3f-eeff8ff1097d-tmp-dir\") pod \"dns-default-jnl2w\" (UID: \"03e17850-8d7a-4344-ad3f-eeff8ff1097d\") " pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:34:44.003330 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:44.003306 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03e17850-8d7a-4344-ad3f-eeff8ff1097d-config-volume\") pod \"dns-default-jnl2w\" (UID: \"03e17850-8d7a-4344-ad3f-eeff8ff1097d\") " pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:34:44.008696 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:44.008672 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-slfqh\" (UniqueName: \"kubernetes.io/projected/03e17850-8d7a-4344-ad3f-eeff8ff1097d-kube-api-access-slfqh\") pod \"dns-default-jnl2w\" (UID: \"03e17850-8d7a-4344-ad3f-eeff8ff1097d\") " pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:34:44.008903 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:44.008883 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxh7w\" (UniqueName: \"kubernetes.io/projected/06b47e11-143b-4f36-b4a4-c16daaed8856-kube-api-access-rxh7w\") pod \"ingress-canary-cbffr\" (UID: \"06b47e11-143b-4f36-b4a4-c16daaed8856\") " pod="openshift-ingress-canary/ingress-canary-cbffr"
Apr 22 17:34:44.497727 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:44.497694 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls\") pod \"dns-default-jnl2w\" (UID: \"03e17850-8d7a-4344-ad3f-eeff8ff1097d\") " pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:34:44.498472 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:44.497778 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert\") pod \"ingress-canary-cbffr\" (UID: \"06b47e11-143b-4f36-b4a4-c16daaed8856\") " pod="openshift-ingress-canary/ingress-canary-cbffr"
Apr 22 17:34:44.498472 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:44.497884 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:34:44.498472 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:44.497940 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert podName:06b47e11-143b-4f36-b4a4-c16daaed8856 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:45.497926065 +0000 UTC m=+34.114018316 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert") pod "ingress-canary-cbffr" (UID: "06b47e11-143b-4f36-b4a4-c16daaed8856") : secret "canary-serving-cert" not found
Apr 22 17:34:44.498472 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:44.497882 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:34:44.498472 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:44.498036 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls podName:03e17850-8d7a-4344-ad3f-eeff8ff1097d nodeName:}" failed. No retries permitted until 2026-04-22 17:34:45.498018647 +0000 UTC m=+34.114110905 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls") pod "dns-default-jnl2w" (UID: "03e17850-8d7a-4344-ad3f-eeff8ff1097d") : secret "dns-default-metrics-tls" not found
Apr 22 17:34:44.598289 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:44.598253 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ds5g\" (UniqueName: \"kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g\") pod \"network-check-target-69d82\" (UID: \"8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6\") " pod="openshift-network-diagnostics/network-check-target-69d82"
Apr 22 17:34:44.598519 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:44.598332 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs\") pod \"network-metrics-daemon-fd4w6\" (UID: \"81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8\") " pod="openshift-multus/network-metrics-daemon-fd4w6"
Apr 22 17:34:44.598519 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:44.598457 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 17:34:44.598628 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:44.598539 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs podName:81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:16.598517825 +0000 UTC m=+65.214610075 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs") pod "network-metrics-daemon-fd4w6" (UID: "81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8") : secret "metrics-daemon-secret" not found
Apr 22 17:34:44.601193 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:44.601162 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ds5g\" (UniqueName: \"kubernetes.io/projected/8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6-kube-api-access-6ds5g\") pod \"network-check-target-69d82\" (UID: \"8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6\") " pod="openshift-network-diagnostics/network-check-target-69d82"
Apr 22 17:34:44.863784 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:44.863694 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69d82"
Apr 22 17:34:45.504953 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:45.504903 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert\") pod \"ingress-canary-cbffr\" (UID: \"06b47e11-143b-4f36-b4a4-c16daaed8856\") " pod="openshift-ingress-canary/ingress-canary-cbffr"
Apr 22 17:34:45.505453 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:45.504993 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls\") pod \"dns-default-jnl2w\" (UID: \"03e17850-8d7a-4344-ad3f-eeff8ff1097d\") " pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:34:45.505453 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:45.505078 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:34:45.505453 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:45.505140 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:34:45.505453 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:45.505183 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert podName:06b47e11-143b-4f36-b4a4-c16daaed8856 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:47.505160285 +0000 UTC m=+36.121252597 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert") pod "ingress-canary-cbffr" (UID: "06b47e11-143b-4f36-b4a4-c16daaed8856") : secret "canary-serving-cert" not found
Apr 22 17:34:45.505453 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:45.505208 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls podName:03e17850-8d7a-4344-ad3f-eeff8ff1097d nodeName:}" failed. No retries permitted until 2026-04-22 17:34:47.505196653 +0000 UTC m=+36.121288906 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls") pod "dns-default-jnl2w" (UID: "03e17850-8d7a-4344-ad3f-eeff8ff1097d") : secret "dns-default-metrics-tls" not found
Apr 22 17:34:45.851217 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:45.851044 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-69d82"]
Apr 22 17:34:45.854931 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:45.854901 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d8d4bc2_3b24_4f90_96c4_3b43bade1eb6.slice/crio-27dac2a66dc5a4e1277d3a627f3ab0ba773bef7fec529430f6bc332f79a26ee3 WatchSource:0}: Error finding container 27dac2a66dc5a4e1277d3a627f3ab0ba773bef7fec529430f6bc332f79a26ee3: Status 404 returned error can't find the container with id 27dac2a66dc5a4e1277d3a627f3ab0ba773bef7fec529430f6bc332f79a26ee3
Apr 22 17:34:46.167778 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:46.167746 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-69d82" event={"ID":"8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6","Type":"ContainerStarted","Data":"27dac2a66dc5a4e1277d3a627f3ab0ba773bef7fec529430f6bc332f79a26ee3"}
Apr 22 17:34:46.170097 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:46.170076 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z45jm" event={"ID":"9db6f1f0-3473-4256-b588-7220e67210f2","Type":"ContainerStarted","Data":"4427d2bd6ceb8b1a8872ac12f06bfed5c5fb0497c1598e8d8abaa90f51f61b60"}
Apr 22 17:34:47.175198 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:47.175163 2578 generic.go:358] "Generic (PLEG): container finished" podID="9db6f1f0-3473-4256-b588-7220e67210f2" containerID="4427d2bd6ceb8b1a8872ac12f06bfed5c5fb0497c1598e8d8abaa90f51f61b60" exitCode=0
Apr 22 17:34:47.175692 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:47.175219 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z45jm" event={"ID":"9db6f1f0-3473-4256-b588-7220e67210f2","Type":"ContainerDied","Data":"4427d2bd6ceb8b1a8872ac12f06bfed5c5fb0497c1598e8d8abaa90f51f61b60"}
Apr 22 17:34:47.522870 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:47.522828 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls\") pod \"dns-default-jnl2w\" (UID: \"03e17850-8d7a-4344-ad3f-eeff8ff1097d\") " pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:34:47.523059 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:47.523013 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:34:47.523130 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:47.523052 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert\") pod \"ingress-canary-cbffr\" (UID: \"06b47e11-143b-4f36-b4a4-c16daaed8856\") " pod="openshift-ingress-canary/ingress-canary-cbffr"
Apr 22 17:34:47.523130 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:47.523090 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls podName:03e17850-8d7a-4344-ad3f-eeff8ff1097d nodeName:}" failed. No retries permitted until 2026-04-22 17:34:51.523068861 +0000 UTC m=+40.139161129 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls") pod "dns-default-jnl2w" (UID: "03e17850-8d7a-4344-ad3f-eeff8ff1097d") : secret "dns-default-metrics-tls" not found
Apr 22 17:34:47.523231 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:47.523141 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:34:47.523231 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:47.523196 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert podName:06b47e11-143b-4f36-b4a4-c16daaed8856 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:51.523180984 +0000 UTC m=+40.139273242 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert") pod "ingress-canary-cbffr" (UID: "06b47e11-143b-4f36-b4a4-c16daaed8856") : secret "canary-serving-cert" not found
Apr 22 17:34:48.181700 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:48.181666 2578 generic.go:358] "Generic (PLEG): container finished" podID="9db6f1f0-3473-4256-b588-7220e67210f2" containerID="97543a7196e95f804f5ccda2050d6c0968cf88ec761f84b66b0f860ddde12788" exitCode=0
Apr 22 17:34:48.182157 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:48.181733 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z45jm" event={"ID":"9db6f1f0-3473-4256-b588-7220e67210f2","Type":"ContainerDied","Data":"97543a7196e95f804f5ccda2050d6c0968cf88ec761f84b66b0f860ddde12788"}
Apr 22 17:34:49.187003 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:49.186972 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z45jm" event={"ID":"9db6f1f0-3473-4256-b588-7220e67210f2","Type":"ContainerStarted","Data":"e20cc5c4ddc4b1d24e641d9e5e8b31d8ae384976d2a02695f926d46393f7c159"}
Apr 22 17:34:49.217096 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:49.217054 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-z45jm" podStartSLOduration=4.460948252 podStartE2EDuration="37.217041789s" podCreationTimestamp="2026-04-22 17:34:12 +0000 UTC" firstStartedPulling="2026-04-22 17:34:13.226452763 +0000 UTC m=+1.842545014" lastFinishedPulling="2026-04-22 17:34:45.9825463 +0000 UTC m=+34.598638551" observedRunningTime="2026-04-22 17:34:49.215504717 +0000 UTC m=+37.831596988" watchObservedRunningTime="2026-04-22 17:34:49.217041789 +0000 UTC m=+37.833134085"
Apr 22 17:34:50.190063 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:50.190022 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-69d82" event={"ID":"8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6","Type":"ContainerStarted","Data":"1bfc1181b7a323432267a911a45eb7d82b759e3d98b0e6b3c745cb49e4f1f4c6"}
Apr 22 17:34:50.190547 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:50.190436 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-69d82"
Apr 22 17:34:50.215610 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:50.215568 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-69d82" podStartSLOduration=35.081052483 podStartE2EDuration="38.215553701s" podCreationTimestamp="2026-04-22 17:34:12 +0000 UTC" firstStartedPulling="2026-04-22 17:34:45.960756346 +0000 UTC m=+34.576848597" lastFinishedPulling="2026-04-22 17:34:49.095257564 +0000 UTC m=+37.711349815" observedRunningTime="2026-04-22 17:34:50.214371327 +0000 UTC m=+38.830463623" watchObservedRunningTime="2026-04-22 17:34:50.215553701 +0000 UTC m=+38.831645972"
Apr 22 17:34:51.552486 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:51.552448 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls\") pod \"dns-default-jnl2w\" (UID: \"03e17850-8d7a-4344-ad3f-eeff8ff1097d\") " pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:34:51.552973 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:51.552610 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert\") pod \"ingress-canary-cbffr\" (UID: \"06b47e11-143b-4f36-b4a4-c16daaed8856\") " pod="openshift-ingress-canary/ingress-canary-cbffr"
Apr 22 17:34:51.552973 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:51.552615 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:34:51.552973 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:51.552705 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls podName:03e17850-8d7a-4344-ad3f-eeff8ff1097d nodeName:}" failed. No retries permitted until 2026-04-22 17:34:59.552681826 +0000 UTC m=+48.168774090 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls") pod "dns-default-jnl2w" (UID: "03e17850-8d7a-4344-ad3f-eeff8ff1097d") : secret "dns-default-metrics-tls" not found
Apr 22 17:34:51.552973 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:51.552715 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:34:51.552973 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:51.552758 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert podName:06b47e11-143b-4f36-b4a4-c16daaed8856 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:59.552743672 +0000 UTC m=+48.168835926 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert") pod "ingress-canary-cbffr" (UID: "06b47e11-143b-4f36-b4a4-c16daaed8856") : secret "canary-serving-cert" not found
Apr 22 17:34:59.108626 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:59.108586 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret\") pod \"global-pull-secret-syncer-s4xq4\" (UID: \"8086e0dd-7b2c-4fb8-bb9c-e3554698418a\") " pod="kube-system/global-pull-secret-syncer-s4xq4"
Apr 22 17:34:59.112391 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:59.112368 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8086e0dd-7b2c-4fb8-bb9c-e3554698418a-original-pull-secret\") pod \"global-pull-secret-syncer-s4xq4\" (UID: \"8086e0dd-7b2c-4fb8-bb9c-e3554698418a\") " pod="kube-system/global-pull-secret-syncer-s4xq4"
Apr 22 17:34:59.277310 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:59.277280 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s4xq4"
Apr 22 17:34:59.397921 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:59.397845 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-s4xq4"]
Apr 22 17:34:59.401071 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:34:59.401036 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8086e0dd_7b2c_4fb8_bb9c_e3554698418a.slice/crio-e44af58741da61e24387dd5215d8392a62b53cc04bbae3e5c23c367a2e5ac11b WatchSource:0}: Error finding container e44af58741da61e24387dd5215d8392a62b53cc04bbae3e5c23c367a2e5ac11b: Status 404 returned error can't find the container with id e44af58741da61e24387dd5215d8392a62b53cc04bbae3e5c23c367a2e5ac11b
Apr 22 17:34:59.612263 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:59.612225 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert\") pod \"ingress-canary-cbffr\" (UID: \"06b47e11-143b-4f36-b4a4-c16daaed8856\") " pod="openshift-ingress-canary/ingress-canary-cbffr"
Apr 22 17:34:59.612474 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:34:59.612274 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls\") pod \"dns-default-jnl2w\" (UID: \"03e17850-8d7a-4344-ad3f-eeff8ff1097d\") " pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:34:59.612474 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:59.612378 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:34:59.612474 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:59.612383 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:34:59.612474 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:59.612458 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls podName:03e17850-8d7a-4344-ad3f-eeff8ff1097d nodeName:}" failed. No retries permitted until 2026-04-22 17:35:15.612437689 +0000 UTC m=+64.228529952 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls") pod "dns-default-jnl2w" (UID: "03e17850-8d7a-4344-ad3f-eeff8ff1097d") : secret "dns-default-metrics-tls" not found
Apr 22 17:34:59.612474 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:34:59.612474 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert podName:06b47e11-143b-4f36-b4a4-c16daaed8856 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:15.612467566 +0000 UTC m=+64.228559816 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert") pod "ingress-canary-cbffr" (UID: "06b47e11-143b-4f36-b4a4-c16daaed8856") : secret "canary-serving-cert" not found
Apr 22 17:35:00.210575 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:35:00.210537 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-s4xq4" event={"ID":"8086e0dd-7b2c-4fb8-bb9c-e3554698418a","Type":"ContainerStarted","Data":"e44af58741da61e24387dd5215d8392a62b53cc04bbae3e5c23c367a2e5ac11b"}
Apr 22 17:35:03.218213 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:35:03.218179 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-s4xq4" event={"ID":"8086e0dd-7b2c-4fb8-bb9c-e3554698418a","Type":"ContainerStarted","Data":"96756cb7a28aa2b25bbaa9669b841ab136c31ba0c53c7c2d0996a90916d5b304"}
Apr 22 17:35:03.232527 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:35:03.232475 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-s4xq4" podStartSLOduration=32.576919666 podStartE2EDuration="36.232456716s" podCreationTimestamp="2026-04-22 17:34:27 +0000 UTC" firstStartedPulling="2026-04-22 17:34:59.402806553 +0000 UTC m=+48.018898803" lastFinishedPulling="2026-04-22 17:35:03.058343586 +0000 UTC m=+51.674435853" observedRunningTime="2026-04-22 17:35:03.232207928 +0000 UTC m=+51.848300200" watchObservedRunningTime="2026-04-22 17:35:03.232456716 +0000 UTC m=+51.848548989"
Apr 22 17:35:11.166007 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:35:11.165980 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2b8mb"
Apr 22 17:35:15.619413 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:35:15.619377 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName:
\"kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls\") pod \"dns-default-jnl2w\" (UID: \"03e17850-8d7a-4344-ad3f-eeff8ff1097d\") " pod="openshift-dns/dns-default-jnl2w" Apr 22 17:35:15.619817 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:35:15.619443 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert\") pod \"ingress-canary-cbffr\" (UID: \"06b47e11-143b-4f36-b4a4-c16daaed8856\") " pod="openshift-ingress-canary/ingress-canary-cbffr" Apr 22 17:35:15.619817 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:35:15.619530 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:35:15.619817 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:35:15.619534 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:35:15.619817 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:35:15.619581 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert podName:06b47e11-143b-4f36-b4a4-c16daaed8856 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:47.619568849 +0000 UTC m=+96.235661099 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert") pod "ingress-canary-cbffr" (UID: "06b47e11-143b-4f36-b4a4-c16daaed8856") : secret "canary-serving-cert" not found Apr 22 17:35:15.619817 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:35:15.619613 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls podName:03e17850-8d7a-4344-ad3f-eeff8ff1097d nodeName:}" failed. No retries permitted until 2026-04-22 17:35:47.619599352 +0000 UTC m=+96.235691605 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls") pod "dns-default-jnl2w" (UID: "03e17850-8d7a-4344-ad3f-eeff8ff1097d") : secret "dns-default-metrics-tls" not found Apr 22 17:35:16.626282 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:35:16.626244 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs\") pod \"network-metrics-daemon-fd4w6\" (UID: \"81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8\") " pod="openshift-multus/network-metrics-daemon-fd4w6" Apr 22 17:35:16.626792 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:35:16.626388 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 17:35:16.626792 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:35:16.626497 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs podName:81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:20.626475236 +0000 UTC m=+129.242567682 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs") pod "network-metrics-daemon-fd4w6" (UID: "81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8") : secret "metrics-daemon-secret" not found Apr 22 17:35:22.197409 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:35:22.197378 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-69d82" Apr 22 17:35:47.629190 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:35:47.629058 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert\") pod \"ingress-canary-cbffr\" (UID: \"06b47e11-143b-4f36-b4a4-c16daaed8856\") " pod="openshift-ingress-canary/ingress-canary-cbffr" Apr 22 17:35:47.629190 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:35:47.629103 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls\") pod \"dns-default-jnl2w\" (UID: \"03e17850-8d7a-4344-ad3f-eeff8ff1097d\") " pod="openshift-dns/dns-default-jnl2w" Apr 22 17:35:47.629190 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:35:47.629191 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:35:47.629679 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:35:47.629219 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:35:47.629679 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:35:47.629244 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls podName:03e17850-8d7a-4344-ad3f-eeff8ff1097d nodeName:}" failed. 
No retries permitted until 2026-04-22 17:36:51.629229365 +0000 UTC m=+160.245321615 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls") pod "dns-default-jnl2w" (UID: "03e17850-8d7a-4344-ad3f-eeff8ff1097d") : secret "dns-default-metrics-tls" not found Apr 22 17:35:47.629679 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:35:47.629296 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert podName:06b47e11-143b-4f36-b4a4-c16daaed8856 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:51.629277272 +0000 UTC m=+160.245369539 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert") pod "ingress-canary-cbffr" (UID: "06b47e11-143b-4f36-b4a4-c16daaed8856") : secret "canary-serving-cert" not found Apr 22 17:36:20.656891 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:20.656834 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs\") pod \"network-metrics-daemon-fd4w6\" (UID: \"81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8\") " pod="openshift-multus/network-metrics-daemon-fd4w6" Apr 22 17:36:20.657368 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:20.656980 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 17:36:20.657368 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:20.657071 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs podName:81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:38:22.657055022 +0000 UTC m=+251.273147273 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs") pod "network-metrics-daemon-fd4w6" (UID: "81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8") : secret "metrics-daemon-secret" not found Apr 22 17:36:23.186343 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.186307 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c2n2f"] Apr 22 17:36:23.188201 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.188185 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c2n2f" Apr 22 17:36:23.193392 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.193363 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-79c986ff86-dmsc9"] Apr 22 17:36:23.193532 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.193404 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 17:36:23.193532 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.193414 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:36:23.193725 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.193701 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-wrlzx\"" Apr 22 17:36:23.195227 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.195211 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:23.197363 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.197343 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c2n2f"] Apr 22 17:36:23.197727 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.197705 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-hpj2f\"" Apr 22 17:36:23.197894 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.197726 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 22 17:36:23.197894 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.197705 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 22 17:36:23.197894 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.197749 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 22 17:36:23.198068 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.198042 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 17:36:23.198068 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.198056 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 22 17:36:23.198170 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.198126 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 17:36:23.210503 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.210482 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-79c986ff86-dmsc9"] Apr 22 17:36:23.274484 
ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.274455 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfpb9\" (UniqueName: \"kubernetes.io/projected/c891add1-515f-45ef-bfb6-7d42a222721b-kube-api-access-jfpb9\") pod \"volume-data-source-validator-7c6cbb6c87-c2n2f\" (UID: \"c891add1-515f-45ef-bfb6-7d42a222721b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c2n2f" Apr 22 17:36:23.274626 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.274493 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:23.274626 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.274533 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-stats-auth\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:23.274626 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.274556 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhk5d\" (UniqueName: \"kubernetes.io/projected/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-kube-api-access-mhk5d\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:23.274626 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.274575 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:23.274792 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.274630 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-default-certificate\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:23.375243 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.375212 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhk5d\" (UniqueName: \"kubernetes.io/projected/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-kube-api-access-mhk5d\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:23.375243 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.375247 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:23.375452 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:23.375340 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:36:23.375452 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.375368 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-default-certificate\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:23.375452 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:23.375396 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs podName:aa98bb13-b13a-434c-8c1e-4bdce39c5b4a nodeName:}" failed. No retries permitted until 2026-04-22 17:36:23.875382453 +0000 UTC m=+132.491474703 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs") pod "router-default-79c986ff86-dmsc9" (UID: "aa98bb13-b13a-434c-8c1e-4bdce39c5b4a") : secret "router-metrics-certs-default" not found Apr 22 17:36:23.375613 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.375465 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfpb9\" (UniqueName: \"kubernetes.io/projected/c891add1-515f-45ef-bfb6-7d42a222721b-kube-api-access-jfpb9\") pod \"volume-data-source-validator-7c6cbb6c87-c2n2f\" (UID: \"c891add1-515f-45ef-bfb6-7d42a222721b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c2n2f" Apr 22 17:36:23.375613 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.375507 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:23.375613 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.375538 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" 
(UniqueName: \"kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-stats-auth\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:23.375751 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:23.375640 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle podName:aa98bb13-b13a-434c-8c1e-4bdce39c5b4a nodeName:}" failed. No retries permitted until 2026-04-22 17:36:23.875628026 +0000 UTC m=+132.491720280 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle") pod "router-default-79c986ff86-dmsc9" (UID: "aa98bb13-b13a-434c-8c1e-4bdce39c5b4a") : configmap references non-existent config key: service-ca.crt Apr 22 17:36:23.377707 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.377682 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-stats-auth\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:23.377805 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.377718 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-default-certificate\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:23.384178 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.384160 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfpb9\" (UniqueName: 
\"kubernetes.io/projected/c891add1-515f-45ef-bfb6-7d42a222721b-kube-api-access-jfpb9\") pod \"volume-data-source-validator-7c6cbb6c87-c2n2f\" (UID: \"c891add1-515f-45ef-bfb6-7d42a222721b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c2n2f" Apr 22 17:36:23.384368 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.384354 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhk5d\" (UniqueName: \"kubernetes.io/projected/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-kube-api-access-mhk5d\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:23.497374 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.497341 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c2n2f" Apr 22 17:36:23.609010 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.608976 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c2n2f"] Apr 22 17:36:23.612726 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:36:23.612698 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc891add1_515f_45ef_bfb6_7d42a222721b.slice/crio-7241c4f28f07f08aed543599a3d8f79db8e410429c3699afb8a1b5fb1e853e5a WatchSource:0}: Error finding container 7241c4f28f07f08aed543599a3d8f79db8e410429c3699afb8a1b5fb1e853e5a: Status 404 returned error can't find the container with id 7241c4f28f07f08aed543599a3d8f79db8e410429c3699afb8a1b5fb1e853e5a Apr 22 17:36:23.879108 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.879024 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:23.879108 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:23.879078 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:23.879282 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:23.879192 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:36:23.879282 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:23.879205 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle podName:aa98bb13-b13a-434c-8c1e-4bdce39c5b4a nodeName:}" failed. No retries permitted until 2026-04-22 17:36:24.879185315 +0000 UTC m=+133.495277564 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle") pod "router-default-79c986ff86-dmsc9" (UID: "aa98bb13-b13a-434c-8c1e-4bdce39c5b4a") : configmap references non-existent config key: service-ca.crt Apr 22 17:36:23.879282 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:23.879236 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs podName:aa98bb13-b13a-434c-8c1e-4bdce39c5b4a nodeName:}" failed. No retries permitted until 2026-04-22 17:36:24.879226054 +0000 UTC m=+133.495318316 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs") pod "router-default-79c986ff86-dmsc9" (UID: "aa98bb13-b13a-434c-8c1e-4bdce39c5b4a") : secret "router-metrics-certs-default" not found Apr 22 17:36:24.375983 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:24.375947 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c2n2f" event={"ID":"c891add1-515f-45ef-bfb6-7d42a222721b","Type":"ContainerStarted","Data":"7241c4f28f07f08aed543599a3d8f79db8e410429c3699afb8a1b5fb1e853e5a"} Apr 22 17:36:24.886322 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:24.886296 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:24.886500 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:24.886476 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:36:24.886582 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:24.886477 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:24.886582 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:24.886550 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs podName:aa98bb13-b13a-434c-8c1e-4bdce39c5b4a nodeName:}" failed. 
No retries permitted until 2026-04-22 17:36:26.886529194 +0000 UTC m=+135.502621457 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs") pod "router-default-79c986ff86-dmsc9" (UID: "aa98bb13-b13a-434c-8c1e-4bdce39c5b4a") : secret "router-metrics-certs-default" not found Apr 22 17:36:24.886582 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:24.886571 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle podName:aa98bb13-b13a-434c-8c1e-4bdce39c5b4a nodeName:}" failed. No retries permitted until 2026-04-22 17:36:26.886563764 +0000 UTC m=+135.502656014 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle") pod "router-default-79c986ff86-dmsc9" (UID: "aa98bb13-b13a-434c-8c1e-4bdce39c5b4a") : configmap references non-existent config key: service-ca.crt Apr 22 17:36:25.378603 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:25.378565 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c2n2f" event={"ID":"c891add1-515f-45ef-bfb6-7d42a222721b","Type":"ContainerStarted","Data":"90406363d0dcbc8add5fbde7a93e07d7441e6d56ba415421583fc051b6eb565e"} Apr 22 17:36:25.394991 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:25.394941 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-c2n2f" podStartSLOduration=1.1392097190000001 podStartE2EDuration="2.394925667s" podCreationTimestamp="2026-04-22 17:36:23 +0000 UTC" firstStartedPulling="2026-04-22 17:36:23.614453439 +0000 UTC m=+132.230545691" lastFinishedPulling="2026-04-22 17:36:24.870169374 +0000 UTC m=+133.486261639" 
observedRunningTime="2026-04-22 17:36:25.393533767 +0000 UTC m=+134.009626043" watchObservedRunningTime="2026-04-22 17:36:25.394925667 +0000 UTC m=+134.011017987"
Apr 22 17:36:26.899811 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:26.899757 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9"
Apr 22 17:36:26.899811 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:26.899830 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9"
Apr 22 17:36:26.900353 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:26.899942 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 17:36:26.900353 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:26.899957 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle podName:aa98bb13-b13a-434c-8c1e-4bdce39c5b4a nodeName:}" failed. No retries permitted until 2026-04-22 17:36:30.899934015 +0000 UTC m=+139.516026274 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle") pod "router-default-79c986ff86-dmsc9" (UID: "aa98bb13-b13a-434c-8c1e-4bdce39c5b4a") : configmap references non-existent config key: service-ca.crt
Apr 22 17:36:26.900353 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:26.899990 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs podName:aa98bb13-b13a-434c-8c1e-4bdce39c5b4a nodeName:}" failed. No retries permitted until 2026-04-22 17:36:30.899977778 +0000 UTC m=+139.516070035 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs") pod "router-default-79c986ff86-dmsc9" (UID: "aa98bb13-b13a-434c-8c1e-4bdce39c5b4a") : secret "router-metrics-certs-default" not found
Apr 22 17:36:28.160671 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:28.160635 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cznmj_5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4/dns-node-resolver/0.log"
Apr 22 17:36:29.160801 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:29.160766 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hspjm_26bfba8d-71c2-4440-8bc9-7e5759e58f9f/node-ca/0.log"
Apr 22 17:36:30.927950 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:30.927915 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9"
Apr 22 17:36:30.928335 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:30.927974 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9"
Apr 22 17:36:30.928335 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:30.928081 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 17:36:30.928335 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:30.928103 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle podName:aa98bb13-b13a-434c-8c1e-4bdce39c5b4a nodeName:}" failed. No retries permitted until 2026-04-22 17:36:38.928083835 +0000 UTC m=+147.544176089 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle") pod "router-default-79c986ff86-dmsc9" (UID: "aa98bb13-b13a-434c-8c1e-4bdce39c5b4a") : configmap references non-existent config key: service-ca.crt
Apr 22 17:36:30.928335 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:30.928126 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs podName:aa98bb13-b13a-434c-8c1e-4bdce39c5b4a nodeName:}" failed. No retries permitted until 2026-04-22 17:36:38.928116595 +0000 UTC m=+147.544208845 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs") pod "router-default-79c986ff86-dmsc9" (UID: "aa98bb13-b13a-434c-8c1e-4bdce39c5b4a") : secret "router-metrics-certs-default" not found
Apr 22 17:36:33.185572 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.185534 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4gpk5"]
Apr 22 17:36:33.187478 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.187457 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4gpk5"
Apr 22 17:36:33.204622 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.204599 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:36:33.204622 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.204617 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 22 17:36:33.204751 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.204599 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-8lxp2\""
Apr 22 17:36:33.208751 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.208729 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 22 17:36:33.208849 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.208768 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 22 17:36:33.209060 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.209045 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4gpk5"]
Apr 22 17:36:33.245192 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.245168 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqbtw\" (UniqueName: \"kubernetes.io/projected/de483a7f-5ee9-4932-9834-cd4e6a512d00-kube-api-access-bqbtw\") pod \"kube-storage-version-migrator-operator-6769c5d45-4gpk5\" (UID: \"de483a7f-5ee9-4932-9834-cd4e6a512d00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4gpk5"
Apr 22 17:36:33.245327 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.245259 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de483a7f-5ee9-4932-9834-cd4e6a512d00-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-4gpk5\" (UID: \"de483a7f-5ee9-4932-9834-cd4e6a512d00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4gpk5"
Apr 22 17:36:33.245327 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.245312 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de483a7f-5ee9-4932-9834-cd4e6a512d00-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-4gpk5\" (UID: \"de483a7f-5ee9-4932-9834-cd4e6a512d00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4gpk5"
Apr 22 17:36:33.291718 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.291694 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhd77"]
Apr 22 17:36:33.293558 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.293543 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhd77"
Apr 22 17:36:33.296244 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.296222 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 22 17:36:33.296354 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.296310 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 22 17:36:33.296354 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.296332 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 22 17:36:33.296354 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.296349 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:36:33.296546 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.296413 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-lptn7\""
Apr 22 17:36:33.306338 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.306318 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhd77"]
Apr 22 17:36:33.346156 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.346121 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee226fb-9b7f-463e-b203-8630203ba5f9-config\") pod \"service-ca-operator-d6fc45fc5-bhd77\" (UID: \"fee226fb-9b7f-463e-b203-8630203ba5f9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhd77"
Apr 22 17:36:33.346330 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.346176 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fee226fb-9b7f-463e-b203-8630203ba5f9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-bhd77\" (UID: \"fee226fb-9b7f-463e-b203-8630203ba5f9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhd77"
Apr 22 17:36:33.346330 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.346199 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmdm2\" (UniqueName: \"kubernetes.io/projected/fee226fb-9b7f-463e-b203-8630203ba5f9-kube-api-access-mmdm2\") pod \"service-ca-operator-d6fc45fc5-bhd77\" (UID: \"fee226fb-9b7f-463e-b203-8630203ba5f9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhd77"
Apr 22 17:36:33.346330 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.346265 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqbtw\" (UniqueName: \"kubernetes.io/projected/de483a7f-5ee9-4932-9834-cd4e6a512d00-kube-api-access-bqbtw\") pod \"kube-storage-version-migrator-operator-6769c5d45-4gpk5\" (UID: \"de483a7f-5ee9-4932-9834-cd4e6a512d00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4gpk5"
Apr 22 17:36:33.346330 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.346304 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de483a7f-5ee9-4932-9834-cd4e6a512d00-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-4gpk5\" (UID: \"de483a7f-5ee9-4932-9834-cd4e6a512d00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4gpk5"
Apr 22 17:36:33.346727 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.346339 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de483a7f-5ee9-4932-9834-cd4e6a512d00-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-4gpk5\" (UID: \"de483a7f-5ee9-4932-9834-cd4e6a512d00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4gpk5"
Apr 22 17:36:33.346885 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.346867 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de483a7f-5ee9-4932-9834-cd4e6a512d00-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-4gpk5\" (UID: \"de483a7f-5ee9-4932-9834-cd4e6a512d00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4gpk5"
Apr 22 17:36:33.348581 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.348564 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de483a7f-5ee9-4932-9834-cd4e6a512d00-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-4gpk5\" (UID: \"de483a7f-5ee9-4932-9834-cd4e6a512d00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4gpk5"
Apr 22 17:36:33.354436 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.354399 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqbtw\" (UniqueName: \"kubernetes.io/projected/de483a7f-5ee9-4932-9834-cd4e6a512d00-kube-api-access-bqbtw\") pod \"kube-storage-version-migrator-operator-6769c5d45-4gpk5\" (UID: \"de483a7f-5ee9-4932-9834-cd4e6a512d00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4gpk5"
Apr 22 17:36:33.447668 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.447566 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee226fb-9b7f-463e-b203-8630203ba5f9-config\") pod \"service-ca-operator-d6fc45fc5-bhd77\" (UID: \"fee226fb-9b7f-463e-b203-8630203ba5f9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhd77"
Apr 22 17:36:33.447668 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.447622 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fee226fb-9b7f-463e-b203-8630203ba5f9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-bhd77\" (UID: \"fee226fb-9b7f-463e-b203-8630203ba5f9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhd77"
Apr 22 17:36:33.447668 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.447648 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmdm2\" (UniqueName: \"kubernetes.io/projected/fee226fb-9b7f-463e-b203-8630203ba5f9-kube-api-access-mmdm2\") pod \"service-ca-operator-d6fc45fc5-bhd77\" (UID: \"fee226fb-9b7f-463e-b203-8630203ba5f9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhd77"
Apr 22 17:36:33.448120 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.448099 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee226fb-9b7f-463e-b203-8630203ba5f9-config\") pod \"service-ca-operator-d6fc45fc5-bhd77\" (UID: \"fee226fb-9b7f-463e-b203-8630203ba5f9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhd77"
Apr 22 17:36:33.449911 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.449891 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fee226fb-9b7f-463e-b203-8630203ba5f9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-bhd77\" (UID: \"fee226fb-9b7f-463e-b203-8630203ba5f9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhd77"
Apr 22 17:36:33.456436 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.456401 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmdm2\" (UniqueName: \"kubernetes.io/projected/fee226fb-9b7f-463e-b203-8630203ba5f9-kube-api-access-mmdm2\") pod \"service-ca-operator-d6fc45fc5-bhd77\" (UID: \"fee226fb-9b7f-463e-b203-8630203ba5f9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhd77"
Apr 22 17:36:33.496049 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.496018 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4gpk5"
Apr 22 17:36:33.601959 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.601927 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhd77"
Apr 22 17:36:33.608061 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.608021 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4gpk5"]
Apr 22 17:36:33.610366 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:36:33.610342 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde483a7f_5ee9_4932_9834_cd4e6a512d00.slice/crio-9538559621d8a283c9839fac7cda02d097e0b75a3dcdb33223e207d909ae427d WatchSource:0}: Error finding container 9538559621d8a283c9839fac7cda02d097e0b75a3dcdb33223e207d909ae427d: Status 404 returned error can't find the container with id 9538559621d8a283c9839fac7cda02d097e0b75a3dcdb33223e207d909ae427d
Apr 22 17:36:33.717708 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:33.717636 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhd77"]
Apr 22 17:36:33.720238 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:36:33.720212 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfee226fb_9b7f_463e_b203_8630203ba5f9.slice/crio-dc84977106f481cc68d712f16071e53d5f09f4ee2b1f4e224fe747660d495a96 WatchSource:0}: Error finding container dc84977106f481cc68d712f16071e53d5f09f4ee2b1f4e224fe747660d495a96: Status 404 returned error can't find the container with id dc84977106f481cc68d712f16071e53d5f09f4ee2b1f4e224fe747660d495a96
Apr 22 17:36:34.400145 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:34.400092 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4gpk5" event={"ID":"de483a7f-5ee9-4932-9834-cd4e6a512d00","Type":"ContainerStarted","Data":"9538559621d8a283c9839fac7cda02d097e0b75a3dcdb33223e207d909ae427d"}
Apr 22 17:36:34.401602 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:34.401572 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhd77" event={"ID":"fee226fb-9b7f-463e-b203-8630203ba5f9","Type":"ContainerStarted","Data":"dc84977106f481cc68d712f16071e53d5f09f4ee2b1f4e224fe747660d495a96"}
Apr 22 17:36:36.407663 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:36.407566 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhd77" event={"ID":"fee226fb-9b7f-463e-b203-8630203ba5f9","Type":"ContainerStarted","Data":"6081b474052a3b5b8eec27a7b219d032cdaf0829fb79f4f4d9345000a6408df9"}
Apr 22 17:36:36.408932 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:36.408898 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4gpk5" event={"ID":"de483a7f-5ee9-4932-9834-cd4e6a512d00","Type":"ContainerStarted","Data":"c22a3608bbc2115e532ccd64a6fe4a830e3a7eca159c9462b3de9270fa0373f7"}
Apr 22 17:36:36.433937 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:36.433893 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhd77" podStartSLOduration=1.084923641 podStartE2EDuration="3.433880251s" podCreationTimestamp="2026-04-22 17:36:33 +0000 UTC" firstStartedPulling="2026-04-22 17:36:33.722085912 +0000 UTC m=+142.338178162" lastFinishedPulling="2026-04-22 17:36:36.071042518 +0000 UTC m=+144.687134772" observedRunningTime="2026-04-22 17:36:36.433159198 +0000 UTC m=+145.049251473" watchObservedRunningTime="2026-04-22 17:36:36.433880251 +0000 UTC m=+145.049972559"
Apr 22 17:36:36.452291 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:36.452242 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4gpk5" podStartSLOduration=0.995040071 podStartE2EDuration="3.452230268s" podCreationTimestamp="2026-04-22 17:36:33 +0000 UTC" firstStartedPulling="2026-04-22 17:36:33.612281743 +0000 UTC m=+142.228373992" lastFinishedPulling="2026-04-22 17:36:36.069471926 +0000 UTC m=+144.685564189" observedRunningTime="2026-04-22 17:36:36.451516581 +0000 UTC m=+145.067608855" watchObservedRunningTime="2026-04-22 17:36:36.452230268 +0000 UTC m=+145.068322540"
Apr 22 17:36:38.996775 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:38.996734 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9"
Apr 22 17:36:38.997212 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:38.996831 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9"
Apr 22 17:36:38.997212 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:38.996902 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 17:36:38.997212 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:38.996972 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle podName:aa98bb13-b13a-434c-8c1e-4bdce39c5b4a nodeName:}" failed. No retries permitted until 2026-04-22 17:36:54.996952391 +0000 UTC m=+163.613044644 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle") pod "router-default-79c986ff86-dmsc9" (UID: "aa98bb13-b13a-434c-8c1e-4bdce39c5b4a") : configmap references non-existent config key: service-ca.crt
Apr 22 17:36:38.997212 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:38.996995 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs podName:aa98bb13-b13a-434c-8c1e-4bdce39c5b4a nodeName:}" failed. No retries permitted until 2026-04-22 17:36:54.996984939 +0000 UTC m=+163.613077191 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs") pod "router-default-79c986ff86-dmsc9" (UID: "aa98bb13-b13a-434c-8c1e-4bdce39c5b4a") : secret "router-metrics-certs-default" not found
Apr 22 17:36:39.625566 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.625533 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-b68fk"]
Apr 22 17:36:39.628860 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.628835 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-b68fk"
Apr 22 17:36:39.631034 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.631015 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 17:36:39.632155 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.632139 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 17:36:39.632239 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.632171 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 17:36:39.632239 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.632197 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 17:36:39.632324 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.632171 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-c22fx\""
Apr 22 17:36:39.635740 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.635637 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-b68fk"]
Apr 22 17:36:39.701994 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.701950 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/51a32cd1-e242-4714-a778-917c371dfecb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b68fk\" (UID: \"51a32cd1-e242-4714-a778-917c371dfecb\") " pod="openshift-insights/insights-runtime-extractor-b68fk"
Apr 22 17:36:39.702201 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.702062 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/51a32cd1-e242-4714-a778-917c371dfecb-crio-socket\") pod \"insights-runtime-extractor-b68fk\" (UID: \"51a32cd1-e242-4714-a778-917c371dfecb\") " pod="openshift-insights/insights-runtime-extractor-b68fk"
Apr 22 17:36:39.702201 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.702092 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/51a32cd1-e242-4714-a778-917c371dfecb-data-volume\") pod \"insights-runtime-extractor-b68fk\" (UID: \"51a32cd1-e242-4714-a778-917c371dfecb\") " pod="openshift-insights/insights-runtime-extractor-b68fk"
Apr 22 17:36:39.702201 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.702136 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/51a32cd1-e242-4714-a778-917c371dfecb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b68fk\" (UID: \"51a32cd1-e242-4714-a778-917c371dfecb\") " pod="openshift-insights/insights-runtime-extractor-b68fk"
Apr 22 17:36:39.702201 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.702180 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc9dl\" (UniqueName: \"kubernetes.io/projected/51a32cd1-e242-4714-a778-917c371dfecb-kube-api-access-pc9dl\") pod \"insights-runtime-extractor-b68fk\" (UID: \"51a32cd1-e242-4714-a778-917c371dfecb\") " pod="openshift-insights/insights-runtime-extractor-b68fk"
Apr 22 17:36:39.802593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.802564 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/51a32cd1-e242-4714-a778-917c371dfecb-crio-socket\") pod \"insights-runtime-extractor-b68fk\" (UID: \"51a32cd1-e242-4714-a778-917c371dfecb\") " pod="openshift-insights/insights-runtime-extractor-b68fk"
Apr 22 17:36:39.802693 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.802597 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/51a32cd1-e242-4714-a778-917c371dfecb-data-volume\") pod \"insights-runtime-extractor-b68fk\" (UID: \"51a32cd1-e242-4714-a778-917c371dfecb\") " pod="openshift-insights/insights-runtime-extractor-b68fk"
Apr 22 17:36:39.802693 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.802636 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/51a32cd1-e242-4714-a778-917c371dfecb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b68fk\" (UID: \"51a32cd1-e242-4714-a778-917c371dfecb\") " pod="openshift-insights/insights-runtime-extractor-b68fk"
Apr 22 17:36:39.802693 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.802663 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pc9dl\" (UniqueName: \"kubernetes.io/projected/51a32cd1-e242-4714-a778-917c371dfecb-kube-api-access-pc9dl\") pod \"insights-runtime-extractor-b68fk\" (UID: \"51a32cd1-e242-4714-a778-917c371dfecb\") " pod="openshift-insights/insights-runtime-extractor-b68fk"
Apr 22 17:36:39.802835 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.802701 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/51a32cd1-e242-4714-a778-917c371dfecb-crio-socket\") pod \"insights-runtime-extractor-b68fk\" (UID: \"51a32cd1-e242-4714-a778-917c371dfecb\") " pod="openshift-insights/insights-runtime-extractor-b68fk"
Apr 22 17:36:39.802835 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.802722 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/51a32cd1-e242-4714-a778-917c371dfecb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b68fk\" (UID: \"51a32cd1-e242-4714-a778-917c371dfecb\") " pod="openshift-insights/insights-runtime-extractor-b68fk"
Apr 22 17:36:39.802835 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:39.802774 2578 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 22 17:36:39.802967 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:39.802860 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51a32cd1-e242-4714-a778-917c371dfecb-insights-runtime-extractor-tls podName:51a32cd1-e242-4714-a778-917c371dfecb nodeName:}" failed. No retries permitted until 2026-04-22 17:36:40.302836234 +0000 UTC m=+148.918928498 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/51a32cd1-e242-4714-a778-917c371dfecb-insights-runtime-extractor-tls") pod "insights-runtime-extractor-b68fk" (UID: "51a32cd1-e242-4714-a778-917c371dfecb") : secret "insights-runtime-extractor-tls" not found
Apr 22 17:36:39.802967 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.802934 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/51a32cd1-e242-4714-a778-917c371dfecb-data-volume\") pod \"insights-runtime-extractor-b68fk\" (UID: \"51a32cd1-e242-4714-a778-917c371dfecb\") " pod="openshift-insights/insights-runtime-extractor-b68fk"
Apr 22 17:36:39.803222 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.803205 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/51a32cd1-e242-4714-a778-917c371dfecb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b68fk\" (UID: \"51a32cd1-e242-4714-a778-917c371dfecb\") " pod="openshift-insights/insights-runtime-extractor-b68fk"
Apr 22 17:36:39.811057 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:39.811038 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc9dl\" (UniqueName: \"kubernetes.io/projected/51a32cd1-e242-4714-a778-917c371dfecb-kube-api-access-pc9dl\") pod \"insights-runtime-extractor-b68fk\" (UID: \"51a32cd1-e242-4714-a778-917c371dfecb\") " pod="openshift-insights/insights-runtime-extractor-b68fk"
Apr 22 17:36:40.295495 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:40.295464 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-8m6xx"]
Apr 22 17:36:40.297995 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:40.297975 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-8m6xx"
Apr 22 17:36:40.300473 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:40.300449 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 22 17:36:40.300595 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:40.300535 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 22 17:36:40.300659 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:40.300603 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 22 17:36:40.300659 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:40.300609 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 22 17:36:40.301498 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:40.301485 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-sst45\""
Apr 22 17:36:40.306300 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:40.306274 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-8m6xx"]
Apr 22 17:36:40.306451 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:40.306408 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/51a32cd1-e242-4714-a778-917c371dfecb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b68fk\" (UID: \"51a32cd1-e242-4714-a778-917c371dfecb\") " pod="openshift-insights/insights-runtime-extractor-b68fk"
Apr 22 17:36:40.306568 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:40.306552 2578 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 22 17:36:40.306635 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:40.306625 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51a32cd1-e242-4714-a778-917c371dfecb-insights-runtime-extractor-tls podName:51a32cd1-e242-4714-a778-917c371dfecb nodeName:}" failed. No retries permitted until 2026-04-22 17:36:41.306605388 +0000 UTC m=+149.922697654 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/51a32cd1-e242-4714-a778-917c371dfecb-insights-runtime-extractor-tls") pod "insights-runtime-extractor-b68fk" (UID: "51a32cd1-e242-4714-a778-917c371dfecb") : secret "insights-runtime-extractor-tls" not found
Apr 22 17:36:40.407371 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:40.407330 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ee17db36-b7b3-4ebd-9b52-f9d69ff447ca-signing-key\") pod \"service-ca-865cb79987-8m6xx\" (UID: \"ee17db36-b7b3-4ebd-9b52-f9d69ff447ca\") " pod="openshift-service-ca/service-ca-865cb79987-8m6xx"
Apr 22 17:36:40.407574 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:40.407388 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pcmw\" (UniqueName: \"kubernetes.io/projected/ee17db36-b7b3-4ebd-9b52-f9d69ff447ca-kube-api-access-4pcmw\") pod \"service-ca-865cb79987-8m6xx\" (UID: \"ee17db36-b7b3-4ebd-9b52-f9d69ff447ca\") " pod="openshift-service-ca/service-ca-865cb79987-8m6xx"
Apr 22 17:36:40.407574 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:40.407505 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ee17db36-b7b3-4ebd-9b52-f9d69ff447ca-signing-cabundle\") pod \"service-ca-865cb79987-8m6xx\" (UID: \"ee17db36-b7b3-4ebd-9b52-f9d69ff447ca\") " pod="openshift-service-ca/service-ca-865cb79987-8m6xx"
Apr 22 17:36:40.508401 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:40.508358 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ee17db36-b7b3-4ebd-9b52-f9d69ff447ca-signing-key\") pod \"service-ca-865cb79987-8m6xx\" (UID: \"ee17db36-b7b3-4ebd-9b52-f9d69ff447ca\") " pod="openshift-service-ca/service-ca-865cb79987-8m6xx"
Apr 22 17:36:40.508613 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:40.508437 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pcmw\" (UniqueName: \"kubernetes.io/projected/ee17db36-b7b3-4ebd-9b52-f9d69ff447ca-kube-api-access-4pcmw\") pod \"service-ca-865cb79987-8m6xx\" (UID: \"ee17db36-b7b3-4ebd-9b52-f9d69ff447ca\") " pod="openshift-service-ca/service-ca-865cb79987-8m6xx"
Apr 22 17:36:40.508667 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:40.508640 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ee17db36-b7b3-4ebd-9b52-f9d69ff447ca-signing-cabundle\") pod \"service-ca-865cb79987-8m6xx\" (UID: \"ee17db36-b7b3-4ebd-9b52-f9d69ff447ca\") " pod="openshift-service-ca/service-ca-865cb79987-8m6xx"
Apr 22 17:36:40.509207 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:40.509189 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ee17db36-b7b3-4ebd-9b52-f9d69ff447ca-signing-cabundle\") pod \"service-ca-865cb79987-8m6xx\" (UID: \"ee17db36-b7b3-4ebd-9b52-f9d69ff447ca\") " pod="openshift-service-ca/service-ca-865cb79987-8m6xx"
Apr 22 17:36:40.510742 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:40.510720 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ee17db36-b7b3-4ebd-9b52-f9d69ff447ca-signing-key\") pod \"service-ca-865cb79987-8m6xx\" (UID: \"ee17db36-b7b3-4ebd-9b52-f9d69ff447ca\") " pod="openshift-service-ca/service-ca-865cb79987-8m6xx"
Apr 22 17:36:40.517186 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:40.517165 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pcmw\" (UniqueName: \"kubernetes.io/projected/ee17db36-b7b3-4ebd-9b52-f9d69ff447ca-kube-api-access-4pcmw\")
pod \"service-ca-865cb79987-8m6xx\" (UID: \"ee17db36-b7b3-4ebd-9b52-f9d69ff447ca\") " pod="openshift-service-ca/service-ca-865cb79987-8m6xx" Apr 22 17:36:40.611121 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:40.611042 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-8m6xx" Apr 22 17:36:40.725353 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:40.725312 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-8m6xx"] Apr 22 17:36:40.730157 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:36:40.730131 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee17db36_b7b3_4ebd_9b52_f9d69ff447ca.slice/crio-0afbf0d07877e74f26d0049c72c65a91e96e1635dea44c413af10ecd01b1e221 WatchSource:0}: Error finding container 0afbf0d07877e74f26d0049c72c65a91e96e1635dea44c413af10ecd01b1e221: Status 404 returned error can't find the container with id 0afbf0d07877e74f26d0049c72c65a91e96e1635dea44c413af10ecd01b1e221 Apr 22 17:36:41.315376 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:41.315341 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/51a32cd1-e242-4714-a778-917c371dfecb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b68fk\" (UID: \"51a32cd1-e242-4714-a778-917c371dfecb\") " pod="openshift-insights/insights-runtime-extractor-b68fk" Apr 22 17:36:41.315778 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:41.315498 2578 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 17:36:41.315778 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:41.315567 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/51a32cd1-e242-4714-a778-917c371dfecb-insights-runtime-extractor-tls podName:51a32cd1-e242-4714-a778-917c371dfecb nodeName:}" failed. No retries permitted until 2026-04-22 17:36:43.315549848 +0000 UTC m=+151.931642102 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/51a32cd1-e242-4714-a778-917c371dfecb-insights-runtime-extractor-tls") pod "insights-runtime-extractor-b68fk" (UID: "51a32cd1-e242-4714-a778-917c371dfecb") : secret "insights-runtime-extractor-tls" not found Apr 22 17:36:41.424390 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:41.424355 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-8m6xx" event={"ID":"ee17db36-b7b3-4ebd-9b52-f9d69ff447ca","Type":"ContainerStarted","Data":"83b5b61b9af94061089e2d1c41ec0c5fed3437339d2f00eaa5cceffe1520e149"} Apr 22 17:36:41.424390 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:41.424391 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-8m6xx" event={"ID":"ee17db36-b7b3-4ebd-9b52-f9d69ff447ca","Type":"ContainerStarted","Data":"0afbf0d07877e74f26d0049c72c65a91e96e1635dea44c413af10ecd01b1e221"} Apr 22 17:36:41.443738 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:41.443691 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-8m6xx" podStartSLOduration=1.443675365 podStartE2EDuration="1.443675365s" podCreationTimestamp="2026-04-22 17:36:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:36:41.442908627 +0000 UTC m=+150.059000918" watchObservedRunningTime="2026-04-22 17:36:41.443675365 +0000 UTC m=+150.059767700" Apr 22 17:36:43.333989 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:43.333957 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/51a32cd1-e242-4714-a778-917c371dfecb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b68fk\" (UID: \"51a32cd1-e242-4714-a778-917c371dfecb\") " pod="openshift-insights/insights-runtime-extractor-b68fk" Apr 22 17:36:43.334372 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:43.334093 2578 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 17:36:43.334372 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:43.334160 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51a32cd1-e242-4714-a778-917c371dfecb-insights-runtime-extractor-tls podName:51a32cd1-e242-4714-a778-917c371dfecb nodeName:}" failed. No retries permitted until 2026-04-22 17:36:47.334144983 +0000 UTC m=+155.950237238 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/51a32cd1-e242-4714-a778-917c371dfecb-insights-runtime-extractor-tls") pod "insights-runtime-extractor-b68fk" (UID: "51a32cd1-e242-4714-a778-917c371dfecb") : secret "insights-runtime-extractor-tls" not found Apr 22 17:36:46.816387 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:46.816347 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-cbffr" podUID="06b47e11-143b-4f36-b4a4-c16daaed8856" Apr 22 17:36:46.837666 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:46.837631 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-jnl2w" podUID="03e17850-8d7a-4344-ad3f-eeff8ff1097d" Apr 22 17:36:46.971025 
ip-10-0-143-54 kubenswrapper[2578]: E0422 17:36:46.970976 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-fd4w6" podUID="81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8" Apr 22 17:36:47.367555 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:47.367520 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/51a32cd1-e242-4714-a778-917c371dfecb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b68fk\" (UID: \"51a32cd1-e242-4714-a778-917c371dfecb\") " pod="openshift-insights/insights-runtime-extractor-b68fk" Apr 22 17:36:47.369836 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:47.369810 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/51a32cd1-e242-4714-a778-917c371dfecb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b68fk\" (UID: \"51a32cd1-e242-4714-a778-917c371dfecb\") " pod="openshift-insights/insights-runtime-extractor-b68fk" Apr 22 17:36:47.438087 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:47.438052 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-b68fk" Apr 22 17:36:47.439749 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:47.439729 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cbffr" Apr 22 17:36:47.560351 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:47.560198 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-b68fk"] Apr 22 17:36:47.563251 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:36:47.563223 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51a32cd1_e242_4714_a778_917c371dfecb.slice/crio-5b65364a38fb7b26c9e8d9225826b2c2b3751a81dfa17dc8c26c85138fc5f5d2 WatchSource:0}: Error finding container 5b65364a38fb7b26c9e8d9225826b2c2b3751a81dfa17dc8c26c85138fc5f5d2: Status 404 returned error can't find the container with id 5b65364a38fb7b26c9e8d9225826b2c2b3751a81dfa17dc8c26c85138fc5f5d2 Apr 22 17:36:48.445734 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:48.445698 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b68fk" event={"ID":"51a32cd1-e242-4714-a778-917c371dfecb","Type":"ContainerStarted","Data":"573df8601677c5300226848c1e0b1c8e64c286f31e8aa75583d0c0752b4ce40a"} Apr 22 17:36:48.445734 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:48.445734 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b68fk" event={"ID":"51a32cd1-e242-4714-a778-917c371dfecb","Type":"ContainerStarted","Data":"05b1a92caba21def2707607121a91f40109537d75f7c743aadfab5f05a1cc9af"} Apr 22 17:36:48.446101 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:48.445747 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b68fk" event={"ID":"51a32cd1-e242-4714-a778-917c371dfecb","Type":"ContainerStarted","Data":"5b65364a38fb7b26c9e8d9225826b2c2b3751a81dfa17dc8c26c85138fc5f5d2"} Apr 22 17:36:50.452266 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:50.452224 2578 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-insights/insights-runtime-extractor-b68fk" event={"ID":"51a32cd1-e242-4714-a778-917c371dfecb","Type":"ContainerStarted","Data":"8456ccbd27652b54855921777f8d8332e4a51503ea4ea71a5627508c3d225629"} Apr 22 17:36:50.469759 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:50.469713 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-b68fk" podStartSLOduration=9.23106234 podStartE2EDuration="11.469699482s" podCreationTimestamp="2026-04-22 17:36:39 +0000 UTC" firstStartedPulling="2026-04-22 17:36:47.65217082 +0000 UTC m=+156.268263071" lastFinishedPulling="2026-04-22 17:36:49.890807962 +0000 UTC m=+158.506900213" observedRunningTime="2026-04-22 17:36:50.468321516 +0000 UTC m=+159.084413789" watchObservedRunningTime="2026-04-22 17:36:50.469699482 +0000 UTC m=+159.085791755" Apr 22 17:36:51.703109 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:51.703053 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert\") pod \"ingress-canary-cbffr\" (UID: \"06b47e11-143b-4f36-b4a4-c16daaed8856\") " pod="openshift-ingress-canary/ingress-canary-cbffr" Apr 22 17:36:51.703109 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:51.703128 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls\") pod \"dns-default-jnl2w\" (UID: \"03e17850-8d7a-4344-ad3f-eeff8ff1097d\") " pod="openshift-dns/dns-default-jnl2w" Apr 22 17:36:51.705460 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:51.705404 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03e17850-8d7a-4344-ad3f-eeff8ff1097d-metrics-tls\") pod \"dns-default-jnl2w\" (UID: \"03e17850-8d7a-4344-ad3f-eeff8ff1097d\") " 
pod="openshift-dns/dns-default-jnl2w" Apr 22 17:36:51.705577 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:51.705414 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b47e11-143b-4f36-b4a4-c16daaed8856-cert\") pod \"ingress-canary-cbffr\" (UID: \"06b47e11-143b-4f36-b4a4-c16daaed8856\") " pod="openshift-ingress-canary/ingress-canary-cbffr" Apr 22 17:36:51.943321 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:51.943293 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dsn9k\"" Apr 22 17:36:51.951494 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:51.951472 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cbffr" Apr 22 17:36:52.063091 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:52.063056 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cbffr"] Apr 22 17:36:52.065656 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:36:52.065631 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06b47e11_143b_4f36_b4a4_c16daaed8856.slice/crio-a3223c3e8d83756ba25635f75346fc7abc524b17b2c6b4743ad4dc1b0ca39217 WatchSource:0}: Error finding container a3223c3e8d83756ba25635f75346fc7abc524b17b2c6b4743ad4dc1b0ca39217: Status 404 returned error can't find the container with id a3223c3e8d83756ba25635f75346fc7abc524b17b2c6b4743ad4dc1b0ca39217 Apr 22 17:36:52.457809 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:52.457772 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cbffr" event={"ID":"06b47e11-143b-4f36-b4a4-c16daaed8856","Type":"ContainerStarted","Data":"a3223c3e8d83756ba25635f75346fc7abc524b17b2c6b4743ad4dc1b0ca39217"} Apr 22 17:36:54.465073 ip-10-0-143-54 kubenswrapper[2578]: I0422 
17:36:54.465036 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cbffr" event={"ID":"06b47e11-143b-4f36-b4a4-c16daaed8856","Type":"ContainerStarted","Data":"377d8843fef5ea052f1ca7201ec6dad92d3a913c115e1bbeafffddc45d2d1601"} Apr 22 17:36:54.482554 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:54.482505 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cbffr" podStartSLOduration=130.00556716 podStartE2EDuration="2m11.482490984s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:36:52.067561152 +0000 UTC m=+160.683653401" lastFinishedPulling="2026-04-22 17:36:53.54448497 +0000 UTC m=+162.160577225" observedRunningTime="2026-04-22 17:36:54.481745989 +0000 UTC m=+163.097838271" watchObservedRunningTime="2026-04-22 17:36:54.482490984 +0000 UTC m=+163.098583255" Apr 22 17:36:55.028212 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:55.028178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:55.028395 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:55.028225 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:55.029305 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:55.029283 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-service-ca-bundle\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:55.030630 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:55.030610 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa98bb13-b13a-434c-8c1e-4bdce39c5b4a-metrics-certs\") pod \"router-default-79c986ff86-dmsc9\" (UID: \"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a\") " pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:55.304979 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:55.304876 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:55.423977 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:55.423943 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-79c986ff86-dmsc9"] Apr 22 17:36:55.427472 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:36:55.427446 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa98bb13_b13a_434c_8c1e_4bdce39c5b4a.slice/crio-b8de2d56048df29304a09efcea303d8931e489deb10ebf7eba1c1c0ce6d592fd WatchSource:0}: Error finding container b8de2d56048df29304a09efcea303d8931e489deb10ebf7eba1c1c0ce6d592fd: Status 404 returned error can't find the container with id b8de2d56048df29304a09efcea303d8931e489deb10ebf7eba1c1c0ce6d592fd Apr 22 17:36:55.472066 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:55.472027 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79c986ff86-dmsc9" event={"ID":"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a","Type":"ContainerStarted","Data":"b8de2d56048df29304a09efcea303d8931e489deb10ebf7eba1c1c0ce6d592fd"} Apr 22 17:36:56.475180 ip-10-0-143-54 
kubenswrapper[2578]: I0422 17:36:56.475138 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79c986ff86-dmsc9" event={"ID":"aa98bb13-b13a-434c-8c1e-4bdce39c5b4a","Type":"ContainerStarted","Data":"61c4b0331b43f6e66bdd26f96d74638b71b11e64fba9ac96c5ca70507cbe842c"} Apr 22 17:36:56.495648 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:56.495605 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-79c986ff86-dmsc9" podStartSLOduration=33.495593021 podStartE2EDuration="33.495593021s" podCreationTimestamp="2026-04-22 17:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:36:56.494117233 +0000 UTC m=+165.110209516" watchObservedRunningTime="2026-04-22 17:36:56.495593021 +0000 UTC m=+165.111685292" Apr 22 17:36:57.306001 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:57.305965 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:57.308763 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:57.308738 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:57.477560 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:57.477529 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:57.478795 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:57.478777 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-79c986ff86-dmsc9" Apr 22 17:36:57.954366 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:57.954342 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd4w6" Apr 22 17:36:59.746935 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.746909 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-875dc5ff8-svc6l"] Apr 22 17:36:59.749987 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.749966 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9pr5w"] Apr 22 17:36:59.750138 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.750118 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-875dc5ff8-svc6l" Apr 22 17:36:59.752662 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.752644 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9pr5w" Apr 22 17:36:59.752896 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.752879 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rmqdq\"" Apr 22 17:36:59.752982 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.752882 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 17:36:59.752982 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.752940 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 17:36:59.753086 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.752980 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 17:36:59.755414 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.755395 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-7zg5t\"" Apr 22 17:36:59.755700 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.755681 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 17:36:59.758236 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.758218 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 17:36:59.760134 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.760113 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9pr5w"] Apr 22 17:36:59.766484 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.763091 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-ca-trust-extracted\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l" Apr 22 17:36:59.766484 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.763141 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-registry-tls\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l" Apr 22 17:36:59.766484 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.763177 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-trusted-ca\") pod \"image-registry-875dc5ff8-svc6l\" (UID: 
\"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l" Apr 22 17:36:59.766484 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.763229 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-bound-sa-token\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l" Apr 22 17:36:59.766484 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.763285 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-image-registry-private-configuration\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l" Apr 22 17:36:59.766484 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.763321 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-registry-certificates\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l" Apr 22 17:36:59.766484 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.763450 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ca1b1673-b38d-41ee-ab4e-cb1e70bd3168-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9pr5w\" (UID: \"ca1b1673-b38d-41ee-ab4e-cb1e70bd3168\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9pr5w" Apr 22 
17:36:59.766484 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.763485 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-installation-pull-secrets\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:36:59.766484 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.763702 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvf5w\" (UniqueName: \"kubernetes.io/projected/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-kube-api-access-rvf5w\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:36:59.766484 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.765154 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-875dc5ff8-svc6l"]
Apr 22 17:36:59.864816 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.864785 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-image-registry-private-configuration\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:36:59.864943 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.864823 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-registry-certificates\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:36:59.864943 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.864858 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ca1b1673-b38d-41ee-ab4e-cb1e70bd3168-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9pr5w\" (UID: \"ca1b1673-b38d-41ee-ab4e-cb1e70bd3168\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9pr5w"
Apr 22 17:36:59.865053 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.865032 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-installation-pull-secrets\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:36:59.865110 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.865098 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvf5w\" (UniqueName: \"kubernetes.io/projected/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-kube-api-access-rvf5w\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:36:59.865162 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.865136 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-ca-trust-extracted\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:36:59.865215 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.865163 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-registry-tls\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:36:59.865215 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.865186 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-trusted-ca\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:36:59.865324 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.865237 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-bound-sa-token\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:36:59.865653 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.865627 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-ca-trust-extracted\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:36:59.866383 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.865821 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-registry-certificates\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:36:59.867003 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.866974 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-trusted-ca\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:36:59.867512 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.867457 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-installation-pull-secrets\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:36:59.867766 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.867739 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ca1b1673-b38d-41ee-ab4e-cb1e70bd3168-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9pr5w\" (UID: \"ca1b1673-b38d-41ee-ab4e-cb1e70bd3168\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9pr5w"
Apr 22 17:36:59.867940 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.867919 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-image-registry-private-configuration\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:36:59.868046 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.868031 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-registry-tls\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:36:59.875167 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.875145 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvf5w\" (UniqueName: \"kubernetes.io/projected/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-kube-api-access-rvf5w\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:36:59.875467 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:36:59.875451 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/549cc9d6-4255-4c2c-afbf-754b2d91dcb4-bound-sa-token\") pod \"image-registry-875dc5ff8-svc6l\" (UID: \"549cc9d6-4255-4c2c-afbf-754b2d91dcb4\") " pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:37:00.062077 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:00.061988 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:37:00.072282 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:00.072257 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9pr5w"
Apr 22 17:37:00.195189 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:00.195010 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-875dc5ff8-svc6l"]
Apr 22 17:37:00.197651 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:37:00.197628 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod549cc9d6_4255_4c2c_afbf_754b2d91dcb4.slice/crio-baf9078309d7bbd8f2ba96c8cdaa072c5db766d28e64d8053ab351e005bfbede WatchSource:0}: Error finding container baf9078309d7bbd8f2ba96c8cdaa072c5db766d28e64d8053ab351e005bfbede: Status 404 returned error can't find the container with id baf9078309d7bbd8f2ba96c8cdaa072c5db766d28e64d8053ab351e005bfbede
Apr 22 17:37:00.206479 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:00.206459 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9pr5w"]
Apr 22 17:37:00.210906 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:37:00.210881 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca1b1673_b38d_41ee_ab4e_cb1e70bd3168.slice/crio-a4cbaaa84a0effdf3e95e7e18f7952793817f5288200ad13dbae736a3eca5bfe WatchSource:0}: Error finding container a4cbaaa84a0effdf3e95e7e18f7952793817f5288200ad13dbae736a3eca5bfe: Status 404 returned error can't find the container with id a4cbaaa84a0effdf3e95e7e18f7952793817f5288200ad13dbae736a3eca5bfe
Apr 22 17:37:00.484808 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:00.484771 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9pr5w" event={"ID":"ca1b1673-b38d-41ee-ab4e-cb1e70bd3168","Type":"ContainerStarted","Data":"a4cbaaa84a0effdf3e95e7e18f7952793817f5288200ad13dbae736a3eca5bfe"}
Apr 22 17:37:00.486033 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:00.485997 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-875dc5ff8-svc6l" event={"ID":"549cc9d6-4255-4c2c-afbf-754b2d91dcb4","Type":"ContainerStarted","Data":"51d688d4a03d348fbfebaf9296da674c5875609baf8182c435089a9095446629"}
Apr 22 17:37:00.486033 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:00.486028 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-875dc5ff8-svc6l" event={"ID":"549cc9d6-4255-4c2c-afbf-754b2d91dcb4","Type":"ContainerStarted","Data":"baf9078309d7bbd8f2ba96c8cdaa072c5db766d28e64d8053ab351e005bfbede"}
Apr 22 17:37:00.486195 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:00.486136 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:37:00.505178 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:00.505130 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-875dc5ff8-svc6l" podStartSLOduration=1.505115398 podStartE2EDuration="1.505115398s" podCreationTimestamp="2026-04-22 17:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:37:00.504655877 +0000 UTC m=+169.120748148" watchObservedRunningTime="2026-04-22 17:37:00.505115398 +0000 UTC m=+169.121207671"
Apr 22 17:37:01.489744 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.489709 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9pr5w" event={"ID":"ca1b1673-b38d-41ee-ab4e-cb1e70bd3168","Type":"ContainerStarted","Data":"4800a49f59492973d2c717367d40f2b866ce7c0b23cc09fe5b6b6fb92921f612"}
Apr 22 17:37:01.490198 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.490072 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9pr5w"
Apr 22 17:37:01.495588 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.495563 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9pr5w"
Apr 22 17:37:01.504994 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.504952 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9pr5w" podStartSLOduration=1.456748545 podStartE2EDuration="2.504939807s" podCreationTimestamp="2026-04-22 17:36:59 +0000 UTC" firstStartedPulling="2026-04-22 17:37:00.212602649 +0000 UTC m=+168.828694899" lastFinishedPulling="2026-04-22 17:37:01.260793911 +0000 UTC m=+169.876886161" observedRunningTime="2026-04-22 17:37:01.503738701 +0000 UTC m=+170.119830985" watchObservedRunningTime="2026-04-22 17:37:01.504939807 +0000 UTC m=+170.121032079"
Apr 22 17:37:01.679660 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.679576 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ql4f8"]
Apr 22 17:37:01.682715 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.682693 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-ql4f8"
Apr 22 17:37:01.685358 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.685333 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 22 17:37:01.685506 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.685446 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 17:37:01.685506 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.685461 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 17:37:01.685623 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.685586 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-7qhml\""
Apr 22 17:37:01.685682 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.685668 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 17:37:01.685738 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.685669 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 22 17:37:01.690934 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.690914 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ql4f8"]
Apr 22 17:37:01.778083 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.778059 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a7f668ec-6531-4017-aa81-e044998d431f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ql4f8\" (UID: \"a7f668ec-6531-4017-aa81-e044998d431f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ql4f8"
Apr 22 17:37:01.778226 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.778089 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxx4z\" (UniqueName: \"kubernetes.io/projected/a7f668ec-6531-4017-aa81-e044998d431f-kube-api-access-kxx4z\") pod \"prometheus-operator-5676c8c784-ql4f8\" (UID: \"a7f668ec-6531-4017-aa81-e044998d431f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ql4f8"
Apr 22 17:37:01.778226 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.778123 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a7f668ec-6531-4017-aa81-e044998d431f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ql4f8\" (UID: \"a7f668ec-6531-4017-aa81-e044998d431f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ql4f8"
Apr 22 17:37:01.778226 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.778205 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7f668ec-6531-4017-aa81-e044998d431f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ql4f8\" (UID: \"a7f668ec-6531-4017-aa81-e044998d431f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ql4f8"
Apr 22 17:37:01.878695 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.878661 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7f668ec-6531-4017-aa81-e044998d431f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ql4f8\" (UID: \"a7f668ec-6531-4017-aa81-e044998d431f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ql4f8"
Apr 22 17:37:01.878856 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.878741 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a7f668ec-6531-4017-aa81-e044998d431f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ql4f8\" (UID: \"a7f668ec-6531-4017-aa81-e044998d431f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ql4f8"
Apr 22 17:37:01.878856 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.878773 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxx4z\" (UniqueName: \"kubernetes.io/projected/a7f668ec-6531-4017-aa81-e044998d431f-kube-api-access-kxx4z\") pod \"prometheus-operator-5676c8c784-ql4f8\" (UID: \"a7f668ec-6531-4017-aa81-e044998d431f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ql4f8"
Apr 22 17:37:01.878856 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:37:01.878808 2578 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 22 17:37:01.879006 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.878810 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a7f668ec-6531-4017-aa81-e044998d431f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ql4f8\" (UID: \"a7f668ec-6531-4017-aa81-e044998d431f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ql4f8"
Apr 22 17:37:01.879006 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:37:01.878878 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f668ec-6531-4017-aa81-e044998d431f-prometheus-operator-tls podName:a7f668ec-6531-4017-aa81-e044998d431f nodeName:}" failed. No retries permitted until 2026-04-22 17:37:02.37885864 +0000 UTC m=+170.994950890 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/a7f668ec-6531-4017-aa81-e044998d431f-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-ql4f8" (UID: "a7f668ec-6531-4017-aa81-e044998d431f") : secret "prometheus-operator-tls" not found
Apr 22 17:37:01.879460 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.879414 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a7f668ec-6531-4017-aa81-e044998d431f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ql4f8\" (UID: \"a7f668ec-6531-4017-aa81-e044998d431f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ql4f8"
Apr 22 17:37:01.881236 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.881220 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a7f668ec-6531-4017-aa81-e044998d431f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ql4f8\" (UID: \"a7f668ec-6531-4017-aa81-e044998d431f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ql4f8"
Apr 22 17:37:01.888054 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.888031 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxx4z\" (UniqueName: \"kubernetes.io/projected/a7f668ec-6531-4017-aa81-e044998d431f-kube-api-access-kxx4z\") pod \"prometheus-operator-5676c8c784-ql4f8\" (UID: \"a7f668ec-6531-4017-aa81-e044998d431f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ql4f8"
Apr 22 17:37:01.955197 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.955131 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:37:01.957931 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.957910 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7g4b5\""
Apr 22 17:37:01.965966 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:01.965952 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:37:02.080441 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:02.080392 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jnl2w"]
Apr 22 17:37:02.084099 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:37:02.084071 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03e17850_8d7a_4344_ad3f_eeff8ff1097d.slice/crio-9c660f8d7223a7a312ce1877054bb8519fa49eeb426378cf8edae7a2f2b54841 WatchSource:0}: Error finding container 9c660f8d7223a7a312ce1877054bb8519fa49eeb426378cf8edae7a2f2b54841: Status 404 returned error can't find the container with id 9c660f8d7223a7a312ce1877054bb8519fa49eeb426378cf8edae7a2f2b54841
Apr 22 17:37:02.383492 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:02.383458 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7f668ec-6531-4017-aa81-e044998d431f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ql4f8\" (UID: \"a7f668ec-6531-4017-aa81-e044998d431f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ql4f8"
Apr 22 17:37:02.385960 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:02.385937 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7f668ec-6531-4017-aa81-e044998d431f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ql4f8\" (UID: \"a7f668ec-6531-4017-aa81-e044998d431f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ql4f8"
Apr 22 17:37:02.494481 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:02.494447 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jnl2w" event={"ID":"03e17850-8d7a-4344-ad3f-eeff8ff1097d","Type":"ContainerStarted","Data":"9c660f8d7223a7a312ce1877054bb8519fa49eeb426378cf8edae7a2f2b54841"}
Apr 22 17:37:02.592209 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:02.592178 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-ql4f8"
Apr 22 17:37:02.702879 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:02.702848 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ql4f8"]
Apr 22 17:37:02.705975 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:37:02.705949 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7f668ec_6531_4017_aa81_e044998d431f.slice/crio-1d07aae0c962dd24099b5e428f5fac6a9342535127ece825a4a0a843dbe41565 WatchSource:0}: Error finding container 1d07aae0c962dd24099b5e428f5fac6a9342535127ece825a4a0a843dbe41565: Status 404 returned error can't find the container with id 1d07aae0c962dd24099b5e428f5fac6a9342535127ece825a4a0a843dbe41565
Apr 22 17:37:03.498657 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:03.498617 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ql4f8" event={"ID":"a7f668ec-6531-4017-aa81-e044998d431f","Type":"ContainerStarted","Data":"1d07aae0c962dd24099b5e428f5fac6a9342535127ece825a4a0a843dbe41565"}
Apr 22 17:37:04.503570 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:04.503533 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ql4f8" event={"ID":"a7f668ec-6531-4017-aa81-e044998d431f","Type":"ContainerStarted","Data":"93eb83b8a65df4d2e715f6415ce9e88abe038e5e29ba101bfca427f34d9eecf9"}
Apr 22 17:37:04.504013 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:04.503578 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ql4f8" event={"ID":"a7f668ec-6531-4017-aa81-e044998d431f","Type":"ContainerStarted","Data":"4f6722c6fdfc86eb75e80c6da36db68051c3e8b10d8f258bcefcfeddd06c3f9a"}
Apr 22 17:37:04.505176 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:04.505153 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jnl2w" event={"ID":"03e17850-8d7a-4344-ad3f-eeff8ff1097d","Type":"ContainerStarted","Data":"2a99975fc65906ee76a244bc053eaaf446a74e993bba22edb64d293057cb51ae"}
Apr 22 17:37:04.505240 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:04.505182 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jnl2w" event={"ID":"03e17850-8d7a-4344-ad3f-eeff8ff1097d","Type":"ContainerStarted","Data":"e95424e5da431be5bdc43477882b28a16802bd56d90c1f8abd436211ca3f3e6e"}
Apr 22 17:37:04.505314 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:04.505295 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:37:04.524551 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:04.524501 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-ql4f8" podStartSLOduration=2.116500114 podStartE2EDuration="3.52448698s" podCreationTimestamp="2026-04-22 17:37:01 +0000 UTC" firstStartedPulling="2026-04-22 17:37:02.707773521 +0000 UTC m=+171.323865772" lastFinishedPulling="2026-04-22 17:37:04.11576037 +0000 UTC m=+172.731852638" observedRunningTime="2026-04-22 17:37:04.522773666 +0000 UTC m=+173.138865938" watchObservedRunningTime="2026-04-22 17:37:04.52448698 +0000 UTC m=+173.140579252"
Apr 22 17:37:04.539524 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:04.539481 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jnl2w" podStartSLOduration=139.510908509 podStartE2EDuration="2m21.539470244s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:37:02.085987213 +0000 UTC m=+170.702079462" lastFinishedPulling="2026-04-22 17:37:04.114548944 +0000 UTC m=+172.730641197" observedRunningTime="2026-04-22 17:37:04.539047926 +0000 UTC m=+173.155140198" watchObservedRunningTime="2026-04-22 17:37:04.539470244 +0000 UTC m=+173.155562512"
Apr 22 17:37:05.957985 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:05.957949 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-cf2sn"]
Apr 22 17:37:05.961761 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:05.961739 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cf2sn"
Apr 22 17:37:05.964490 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:05.964460 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 17:37:05.964637 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:05.964500 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 17:37:05.964712 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:05.964460 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-5mbzq\""
Apr 22 17:37:05.964894 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:05.964879 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 17:37:06.014440 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.014393 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aaa1eb03-977d-435e-99be-1036b84441ec-metrics-client-ca\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn"
Apr 22 17:37:06.014607 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.014465 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/aaa1eb03-977d-435e-99be-1036b84441ec-node-exporter-tls\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn"
Apr 22 17:37:06.014607 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.014491 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/aaa1eb03-977d-435e-99be-1036b84441ec-node-exporter-accelerators-collector-config\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn"
Apr 22 17:37:06.014607 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.014517 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/aaa1eb03-977d-435e-99be-1036b84441ec-node-exporter-textfile\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn"
Apr 22 17:37:06.014785 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.014611 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/aaa1eb03-977d-435e-99be-1036b84441ec-node-exporter-wtmp\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn"
Apr 22 17:37:06.014785 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.014637 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aaa1eb03-977d-435e-99be-1036b84441ec-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn"
Apr 22 17:37:06.014785 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.014658 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/aaa1eb03-977d-435e-99be-1036b84441ec-root\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn"
Apr 22 17:37:06.014785 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.014680 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aaa1eb03-977d-435e-99be-1036b84441ec-sys\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn"
Apr 22 17:37:06.014929 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.014782 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx4zz\" (UniqueName: \"kubernetes.io/projected/aaa1eb03-977d-435e-99be-1036b84441ec-kube-api-access-bx4zz\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn"
Apr 22 17:37:06.016924 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.016895 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-kkhkg"]
Apr 22 17:37:06.020553 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.020528 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg"
Apr 22 17:37:06.023270 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.023244 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 22 17:37:06.023543 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.023526 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 22 17:37:06.023849 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.023681 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-2mdjv\""
Apr 22 17:37:06.023849 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.023737 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 22 17:37:06.033511 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.033483 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-kkhkg"]
Apr 22 17:37:06.116026 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.115989 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aaa1eb03-977d-435e-99be-1036b84441ec-metrics-client-ca\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn"
Apr 22 17:37:06.116203 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.116032 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4641e6a-9871-4f14-bec6-666f704d5f1d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-kkhkg\" (UID: \"f4641e6a-9871-4f14-bec6-666f704d5f1d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg"
Apr 22 17:37:06.116203 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.116077 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/aaa1eb03-977d-435e-99be-1036b84441ec-node-exporter-tls\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn"
Apr 22 17:37:06.116203 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.116103 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/aaa1eb03-977d-435e-99be-1036b84441ec-node-exporter-accelerators-collector-config\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn"
Apr 22 17:37:06.116203 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.116131 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/aaa1eb03-977d-435e-99be-1036b84441ec-node-exporter-textfile\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn"
Apr 22 17:37:06.116203 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.116177 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4641e6a-9871-4f14-bec6-666f704d5f1d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-kkhkg\" (UID: \"f4641e6a-9871-4f14-bec6-666f704d5f1d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg"
Apr 22 17:37:06.116493 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.116235 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/aaa1eb03-977d-435e-99be-1036b84441ec-node-exporter-wtmp\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn"
Apr 22 17:37:06.116493 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.116257 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aaa1eb03-977d-435e-99be-1036b84441ec-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn"
Apr 22 17:37:06.116493 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.116278 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/aaa1eb03-977d-435e-99be-1036b84441ec-root\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn"
Apr 22 17:37:06.116493 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.116308 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f4641e6a-9871-4f14-bec6-666f704d5f1d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-kkhkg\" (UID: \"f4641e6a-9871-4f14-bec6-666f704d5f1d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg"
Apr 22 17:37:06.116493 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.116380 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName:
\"kubernetes.io/host-path/aaa1eb03-977d-435e-99be-1036b84441ec-root\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn" Apr 22 17:37:06.116493 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.116482 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4skj9\" (UniqueName: \"kubernetes.io/projected/f4641e6a-9871-4f14-bec6-666f704d5f1d-kube-api-access-4skj9\") pod \"kube-state-metrics-69db897b98-kkhkg\" (UID: \"f4641e6a-9871-4f14-bec6-666f704d5f1d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" Apr 22 17:37:06.116788 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.116530 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/aaa1eb03-977d-435e-99be-1036b84441ec-node-exporter-wtmp\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn" Apr 22 17:37:06.116788 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.116532 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aaa1eb03-977d-435e-99be-1036b84441ec-sys\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn" Apr 22 17:37:06.116788 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.116567 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aaa1eb03-977d-435e-99be-1036b84441ec-sys\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn" Apr 22 17:37:06.116788 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.116574 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f4641e6a-9871-4f14-bec6-666f704d5f1d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-kkhkg\" (UID: \"f4641e6a-9871-4f14-bec6-666f704d5f1d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" Apr 22 17:37:06.116788 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.116601 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/aaa1eb03-977d-435e-99be-1036b84441ec-node-exporter-textfile\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn" Apr 22 17:37:06.116788 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.116671 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bx4zz\" (UniqueName: \"kubernetes.io/projected/aaa1eb03-977d-435e-99be-1036b84441ec-kube-api-access-bx4zz\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn" Apr 22 17:37:06.116788 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.116714 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f4641e6a-9871-4f14-bec6-666f704d5f1d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-kkhkg\" (UID: \"f4641e6a-9871-4f14-bec6-666f704d5f1d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" Apr 22 17:37:06.117128 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.116840 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/aaa1eb03-977d-435e-99be-1036b84441ec-node-exporter-accelerators-collector-config\") pod \"node-exporter-cf2sn\" (UID: 
\"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn" Apr 22 17:37:06.117335 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.117316 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aaa1eb03-977d-435e-99be-1036b84441ec-metrics-client-ca\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn" Apr 22 17:37:06.118845 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.118824 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/aaa1eb03-977d-435e-99be-1036b84441ec-node-exporter-tls\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn" Apr 22 17:37:06.118979 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.118863 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aaa1eb03-977d-435e-99be-1036b84441ec-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn" Apr 22 17:37:06.139024 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.138996 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx4zz\" (UniqueName: \"kubernetes.io/projected/aaa1eb03-977d-435e-99be-1036b84441ec-kube-api-access-bx4zz\") pod \"node-exporter-cf2sn\" (UID: \"aaa1eb03-977d-435e-99be-1036b84441ec\") " pod="openshift-monitoring/node-exporter-cf2sn" Apr 22 17:37:06.217443 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.217348 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/f4641e6a-9871-4f14-bec6-666f704d5f1d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-kkhkg\" (UID: \"f4641e6a-9871-4f14-bec6-666f704d5f1d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" Apr 22 17:37:06.217443 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.217390 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4641e6a-9871-4f14-bec6-666f704d5f1d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-kkhkg\" (UID: \"f4641e6a-9871-4f14-bec6-666f704d5f1d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" Apr 22 17:37:06.217443 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.217439 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4641e6a-9871-4f14-bec6-666f704d5f1d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-kkhkg\" (UID: \"f4641e6a-9871-4f14-bec6-666f704d5f1d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" Apr 22 17:37:06.217719 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:37:06.217522 2578 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 22 17:37:06.217719 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:37:06.217591 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4641e6a-9871-4f14-bec6-666f704d5f1d-kube-state-metrics-tls podName:f4641e6a-9871-4f14-bec6-666f704d5f1d nodeName:}" failed. No retries permitted until 2026-04-22 17:37:06.717568247 +0000 UTC m=+175.333660497 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f4641e6a-9871-4f14-bec6-666f704d5f1d-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-kkhkg" (UID: "f4641e6a-9871-4f14-bec6-666f704d5f1d") : secret "kube-state-metrics-tls" not found Apr 22 17:37:06.217719 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.217675 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f4641e6a-9871-4f14-bec6-666f704d5f1d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-kkhkg\" (UID: \"f4641e6a-9871-4f14-bec6-666f704d5f1d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" Apr 22 17:37:06.217867 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.217711 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4skj9\" (UniqueName: \"kubernetes.io/projected/f4641e6a-9871-4f14-bec6-666f704d5f1d-kube-api-access-4skj9\") pod \"kube-state-metrics-69db897b98-kkhkg\" (UID: \"f4641e6a-9871-4f14-bec6-666f704d5f1d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" Apr 22 17:37:06.217867 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.217754 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f4641e6a-9871-4f14-bec6-666f704d5f1d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-kkhkg\" (UID: \"f4641e6a-9871-4f14-bec6-666f704d5f1d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" Apr 22 17:37:06.218055 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.218034 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f4641e6a-9871-4f14-bec6-666f704d5f1d-volume-directive-shadow\") pod 
\"kube-state-metrics-69db897b98-kkhkg\" (UID: \"f4641e6a-9871-4f14-bec6-666f704d5f1d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" Apr 22 17:37:06.218218 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.218202 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4641e6a-9871-4f14-bec6-666f704d5f1d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-kkhkg\" (UID: \"f4641e6a-9871-4f14-bec6-666f704d5f1d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" Apr 22 17:37:06.218856 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.218840 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f4641e6a-9871-4f14-bec6-666f704d5f1d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-kkhkg\" (UID: \"f4641e6a-9871-4f14-bec6-666f704d5f1d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" Apr 22 17:37:06.220088 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.220068 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4641e6a-9871-4f14-bec6-666f704d5f1d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-kkhkg\" (UID: \"f4641e6a-9871-4f14-bec6-666f704d5f1d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" Apr 22 17:37:06.229634 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.229616 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4skj9\" (UniqueName: \"kubernetes.io/projected/f4641e6a-9871-4f14-bec6-666f704d5f1d-kube-api-access-4skj9\") pod \"kube-state-metrics-69db897b98-kkhkg\" (UID: \"f4641e6a-9871-4f14-bec6-666f704d5f1d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" Apr 22 
17:37:06.272024 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.272002 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cf2sn" Apr 22 17:37:06.279933 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:37:06.279907 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaa1eb03_977d_435e_99be_1036b84441ec.slice/crio-307fd6b50badabc924fe4beed94b01305ef86e1e99af9582f970d9bac0c1ceb7 WatchSource:0}: Error finding container 307fd6b50badabc924fe4beed94b01305ef86e1e99af9582f970d9bac0c1ceb7: Status 404 returned error can't find the container with id 307fd6b50badabc924fe4beed94b01305ef86e1e99af9582f970d9bac0c1ceb7 Apr 22 17:37:06.513798 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.513755 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cf2sn" event={"ID":"aaa1eb03-977d-435e-99be-1036b84441ec","Type":"ContainerStarted","Data":"307fd6b50badabc924fe4beed94b01305ef86e1e99af9582f970d9bac0c1ceb7"} Apr 22 17:37:06.722681 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.722643 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f4641e6a-9871-4f14-bec6-666f704d5f1d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-kkhkg\" (UID: \"f4641e6a-9871-4f14-bec6-666f704d5f1d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" Apr 22 17:37:06.725551 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.725527 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f4641e6a-9871-4f14-bec6-666f704d5f1d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-kkhkg\" (UID: \"f4641e6a-9871-4f14-bec6-666f704d5f1d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" Apr 22 17:37:06.931192 
ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:06.931111 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" Apr 22 17:37:07.091632 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:07.091601 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-kkhkg"] Apr 22 17:37:07.094472 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:37:07.094441 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4641e6a_9871_4f14_bec6_666f704d5f1d.slice/crio-45a2d4c186b4d926d67f832bb6e83f6fc75b5d48cd5da79abef2c1ddf6df3a3f WatchSource:0}: Error finding container 45a2d4c186b4d926d67f832bb6e83f6fc75b5d48cd5da79abef2c1ddf6df3a3f: Status 404 returned error can't find the container with id 45a2d4c186b4d926d67f832bb6e83f6fc75b5d48cd5da79abef2c1ddf6df3a3f Apr 22 17:37:07.517679 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:07.517638 2578 generic.go:358] "Generic (PLEG): container finished" podID="aaa1eb03-977d-435e-99be-1036b84441ec" containerID="954f6f8697c912ce55f2a2316340e4073e9f9e5155cd3f3400f8b7ae97fe88da" exitCode=0 Apr 22 17:37:07.517862 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:07.517731 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cf2sn" event={"ID":"aaa1eb03-977d-435e-99be-1036b84441ec","Type":"ContainerDied","Data":"954f6f8697c912ce55f2a2316340e4073e9f9e5155cd3f3400f8b7ae97fe88da"} Apr 22 17:37:07.518792 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:07.518767 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" event={"ID":"f4641e6a-9871-4f14-bec6-666f704d5f1d","Type":"ContainerStarted","Data":"45a2d4c186b4d926d67f832bb6e83f6fc75b5d48cd5da79abef2c1ddf6df3a3f"} Apr 22 17:37:08.532001 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:08.531962 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" event={"ID":"f4641e6a-9871-4f14-bec6-666f704d5f1d","Type":"ContainerStarted","Data":"108ecbaffde58d2ddca1992a8b3f83ae39b195c22cd0f7a4bb2b6de366ed43d9"} Apr 22 17:37:08.532001 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:08.532010 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" event={"ID":"f4641e6a-9871-4f14-bec6-666f704d5f1d","Type":"ContainerStarted","Data":"1a0bc809b2808a2ba93fc7a73bd5b52984509b860804b4d74b689ac52d9ded40"} Apr 22 17:37:08.532473 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:08.532026 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" event={"ID":"f4641e6a-9871-4f14-bec6-666f704d5f1d","Type":"ContainerStarted","Data":"fd8903eeec811c8ca902a49b053201a9bbc7963bbb09ea3f4077cca2005ed2ee"} Apr 22 17:37:08.533927 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:08.533901 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cf2sn" event={"ID":"aaa1eb03-977d-435e-99be-1036b84441ec","Type":"ContainerStarted","Data":"7167b309322cdca1a6dc1d2291e67333bbb0f21ce6fa9b1eb47812a8407c8d1b"} Apr 22 17:37:08.534041 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:08.533932 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cf2sn" event={"ID":"aaa1eb03-977d-435e-99be-1036b84441ec","Type":"ContainerStarted","Data":"21864c3a7b3b34e2b66450fb85c18cf0b535af8bfa79514dc2bbe1bfb23ed65d"} Apr 22 17:37:08.550594 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:08.550546 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-kkhkg" podStartSLOduration=2.332309444 podStartE2EDuration="3.550533031s" podCreationTimestamp="2026-04-22 17:37:05 +0000 UTC" 
firstStartedPulling="2026-04-22 17:37:07.096316091 +0000 UTC m=+175.712408340" lastFinishedPulling="2026-04-22 17:37:08.314539677 +0000 UTC m=+176.930631927" observedRunningTime="2026-04-22 17:37:08.550133979 +0000 UTC m=+177.166226251" watchObservedRunningTime="2026-04-22 17:37:08.550533031 +0000 UTC m=+177.166625303" Apr 22 17:37:08.575997 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:08.575900 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-cf2sn" podStartSLOduration=2.844570602 podStartE2EDuration="3.575884814s" podCreationTimestamp="2026-04-22 17:37:05 +0000 UTC" firstStartedPulling="2026-04-22 17:37:06.281701352 +0000 UTC m=+174.897793603" lastFinishedPulling="2026-04-22 17:37:07.013015561 +0000 UTC m=+175.629107815" observedRunningTime="2026-04-22 17:37:08.574467253 +0000 UTC m=+177.190559522" watchObservedRunningTime="2026-04-22 17:37:08.575884814 +0000 UTC m=+177.191977100" Apr 22 17:37:10.238257 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.238216 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-59fbcf48df-f4l4b"] Apr 22 17:37:10.241728 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.241705 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.244221 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.244197 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 17:37:10.245287 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.245262 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 17:37:10.245404 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.245356 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-14q728uc0e9or\"" Apr 22 17:37:10.245478 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.245458 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 17:37:10.245533 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.245506 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-7smm9\"" Apr 22 17:37:10.245533 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.245519 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 17:37:10.251306 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.251279 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-59fbcf48df-f4l4b"] Apr 22 17:37:10.353174 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.353129 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/05b049bd-566e-4b9b-b453-ee120360ea21-audit-log\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " 
pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.353346 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.353197 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/05b049bd-566e-4b9b-b453-ee120360ea21-metrics-server-audit-profiles\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.353521 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.353484 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05b049bd-566e-4b9b-b453-ee120360ea21-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.353615 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.353592 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05b049bd-566e-4b9b-b453-ee120360ea21-client-ca-bundle\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.353667 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.353630 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/05b049bd-566e-4b9b-b453-ee120360ea21-secret-metrics-server-tls\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.353778 ip-10-0-143-54 kubenswrapper[2578]: I0422 
17:37:10.353671 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/05b049bd-566e-4b9b-b453-ee120360ea21-secret-metrics-server-client-certs\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.353778 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.353714 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grfr2\" (UniqueName: \"kubernetes.io/projected/05b049bd-566e-4b9b-b453-ee120360ea21-kube-api-access-grfr2\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.454483 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.454446 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grfr2\" (UniqueName: \"kubernetes.io/projected/05b049bd-566e-4b9b-b453-ee120360ea21-kube-api-access-grfr2\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.454666 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.454496 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/05b049bd-566e-4b9b-b453-ee120360ea21-audit-log\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.454666 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.454530 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/05b049bd-566e-4b9b-b453-ee120360ea21-metrics-server-audit-profiles\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.454666 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.454558 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05b049bd-566e-4b9b-b453-ee120360ea21-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.454666 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.454598 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05b049bd-566e-4b9b-b453-ee120360ea21-client-ca-bundle\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.454666 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.454618 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/05b049bd-566e-4b9b-b453-ee120360ea21-secret-metrics-server-tls\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.454666 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.454642 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/05b049bd-566e-4b9b-b453-ee120360ea21-secret-metrics-server-client-certs\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " 
pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.455018 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.454993 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/05b049bd-566e-4b9b-b453-ee120360ea21-audit-log\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.455846 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.455824 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05b049bd-566e-4b9b-b453-ee120360ea21-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.456116 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.456095 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/05b049bd-566e-4b9b-b453-ee120360ea21-metrics-server-audit-profiles\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.457775 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.457758 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05b049bd-566e-4b9b-b453-ee120360ea21-client-ca-bundle\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.457775 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.457763 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/05b049bd-566e-4b9b-b453-ee120360ea21-secret-metrics-server-client-certs\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.457939 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.457808 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/05b049bd-566e-4b9b-b453-ee120360ea21-secret-metrics-server-tls\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.463225 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.463197 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grfr2\" (UniqueName: \"kubernetes.io/projected/05b049bd-566e-4b9b-b453-ee120360ea21-kube-api-access-grfr2\") pod \"metrics-server-59fbcf48df-f4l4b\" (UID: \"05b049bd-566e-4b9b-b453-ee120360ea21\") " pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.551856 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.551786 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" Apr 22 17:37:10.671039 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.670964 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-59fbcf48df-f4l4b"] Apr 22 17:37:10.673801 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:37:10.673772 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05b049bd_566e_4b9b_b453_ee120360ea21.slice/crio-9f88e3aea698a2702330e0f896037e06a8c5988aff8026522d2b0a338c941565 WatchSource:0}: Error finding container 9f88e3aea698a2702330e0f896037e06a8c5988aff8026522d2b0a338c941565: Status 404 returned error can't find the container with id 9f88e3aea698a2702330e0f896037e06a8c5988aff8026522d2b0a338c941565 Apr 22 17:37:10.794577 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.794537 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-tcwpq"] Apr 22 17:37:10.838403 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.838317 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-tcwpq"] Apr 22 17:37:10.838590 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.838474 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tcwpq" Apr 22 17:37:10.840999 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.840973 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 17:37:10.841138 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.841044 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-kh6wk\"" Apr 22 17:37:10.858154 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.858128 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/67c1a3d6-0553-48aa-997e-5b95f3a097b7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-tcwpq\" (UID: \"67c1a3d6-0553-48aa-997e-5b95f3a097b7\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tcwpq" Apr 22 17:37:10.959216 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.959174 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/67c1a3d6-0553-48aa-997e-5b95f3a097b7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-tcwpq\" (UID: \"67c1a3d6-0553-48aa-997e-5b95f3a097b7\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tcwpq" Apr 22 17:37:10.961703 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:10.961674 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/67c1a3d6-0553-48aa-997e-5b95f3a097b7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-tcwpq\" (UID: \"67c1a3d6-0553-48aa-997e-5b95f3a097b7\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tcwpq" Apr 22 17:37:11.148396 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:11.148309 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tcwpq" Apr 22 17:37:11.265248 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:11.265062 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-tcwpq"] Apr 22 17:37:11.268250 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:37:11.268218 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67c1a3d6_0553_48aa_997e_5b95f3a097b7.slice/crio-84a385bfc4a6c79453a513824ae093f13fa13cfa6bc672263c79a4fd0c290355 WatchSource:0}: Error finding container 84a385bfc4a6c79453a513824ae093f13fa13cfa6bc672263c79a4fd0c290355: Status 404 returned error can't find the container with id 84a385bfc4a6c79453a513824ae093f13fa13cfa6bc672263c79a4fd0c290355 Apr 22 17:37:11.542496 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:11.542456 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tcwpq" event={"ID":"67c1a3d6-0553-48aa-997e-5b95f3a097b7","Type":"ContainerStarted","Data":"84a385bfc4a6c79453a513824ae093f13fa13cfa6bc672263c79a4fd0c290355"} Apr 22 17:37:11.543716 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:11.543675 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" event={"ID":"05b049bd-566e-4b9b-b453-ee120360ea21","Type":"ContainerStarted","Data":"9f88e3aea698a2702330e0f896037e06a8c5988aff8026522d2b0a338c941565"} Apr 22 17:37:12.306655 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.306621 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:37:12.310695 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.310658 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.313325 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.313298 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 17:37:12.313618 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.313597 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 17:37:12.313796 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.313599 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 17:37:12.313872 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.313827 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 17:37:12.313924 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.313599 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 17:37:12.313924 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.313778 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bsu8iqce299l4\"" Apr 22 17:37:12.314589 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.314186 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-g4dp2\"" Apr 22 17:37:12.314589 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.314282 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 17:37:12.314589 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.314383 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 17:37:12.317917 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.317892 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 17:37:12.318499 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.318125 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 17:37:12.318499 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.318341 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 17:37:12.319441 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.319391 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 17:37:12.321353 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.321330 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 17:37:12.327331 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.327305 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:37:12.372356 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.372323 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw5m5\" (UniqueName: \"kubernetes.io/projected/82a81d18-8d61-4e72-b806-41d09ced8f7b-kube-api-access-sw5m5\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.372575 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.372395 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.372575 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.372449 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.372575 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.372540 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.372733 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.372582 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.372733 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.372612 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/82a81d18-8d61-4e72-b806-41d09ced8f7b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.372733 
ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.372670 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.372733 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.372702 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-config\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.372874 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.372784 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.372874 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.372828 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.372874 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.372859 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-web-config\") pod \"prometheus-k8s-0\" (UID: 
\"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.372991 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.372886 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.372991 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.372929 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.372991 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.372962 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.373105 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.373005 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/82a81d18-8d61-4e72-b806-41d09ced8f7b-config-out\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.373105 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.373039 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/82a81d18-8d61-4e72-b806-41d09ced8f7b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.373175 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.373123 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.373175 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.373163 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.475744 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.475702 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.475935 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.475759 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.475935 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.475819 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sw5m5\" (UniqueName: \"kubernetes.io/projected/82a81d18-8d61-4e72-b806-41d09ced8f7b-kube-api-access-sw5m5\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.475935 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.475878 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.476274 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.476244 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.476368 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.476314 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.477016 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.476982 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.477137 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.477042 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.477137 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.477078 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/82a81d18-8d61-4e72-b806-41d09ced8f7b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.477137 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.477126 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.477312 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.477160 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-config\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.477312 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.477199 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.477312 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.477226 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.477312 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.477255 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-web-config\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.477312 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.477281 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.477579 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.477312 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.478390 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.478257 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.479209 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.479181 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.480262 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.479382 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.480262 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.479457 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.480262 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.479496 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 
17:37:12.480262 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.479529 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/82a81d18-8d61-4e72-b806-41d09ced8f7b-config-out\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.480262 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.479554 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/82a81d18-8d61-4e72-b806-41d09ced8f7b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.480262 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.479952 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/82a81d18-8d61-4e72-b806-41d09ced8f7b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.480652 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.480402 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-config\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.480889 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.480866 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:37:12.481168 ip-10-0-143-54 
kubenswrapper[2578]: I0422 17:37:12.481139 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:37:12.481253 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.481155 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:37:12.484233 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.481954 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:37:12.484233 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.482286 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/82a81d18-8d61-4e72-b806-41d09ced8f7b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:37:12.484233 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.482412 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-web-config\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:37:12.484233 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.483471 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:37:12.484233 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.484053 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:37:12.485809 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.485767 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:37:12.485968 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.485946 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw5m5\" (UniqueName: \"kubernetes.io/projected/82a81d18-8d61-4e72-b806-41d09ced8f7b-kube-api-access-sw5m5\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:37:12.486066 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.486044 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/82a81d18-8d61-4e72-b806-41d09ced8f7b-config-out\") pod \"prometheus-k8s-0\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:37:12.623446 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.623350 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:37:12.864451 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:12.864400 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 17:37:13.552943 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:13.552907 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"82a81d18-8d61-4e72-b806-41d09ced8f7b","Type":"ContainerStarted","Data":"07eb4bbac0eaa3701f04de2eabf1a9e4b31016886923e2a09d86898ebc33473e"}
Apr 22 17:37:13.554521 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:13.554489 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" event={"ID":"05b049bd-566e-4b9b-b453-ee120360ea21","Type":"ContainerStarted","Data":"9e6910328fbfb48c9c2500e1a19e995af8fea1906b72946cf9c9f1f0b3f3b614"}
Apr 22 17:37:13.556004 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:13.555978 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tcwpq" event={"ID":"67c1a3d6-0553-48aa-997e-5b95f3a097b7","Type":"ContainerStarted","Data":"66579fa52cbce2c6be00d402fc06d8bea06325b7a2ee6301cfbfbf386ebcfd01"}
Apr 22 17:37:13.556258 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:13.556233 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tcwpq"
Apr 22 17:37:13.561633 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:13.561602 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tcwpq"
Apr 22 17:37:13.573356 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:13.573301 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b" podStartSLOduration=1.539230533 podStartE2EDuration="3.573288659s" podCreationTimestamp="2026-04-22 17:37:10 +0000 UTC" firstStartedPulling="2026-04-22 17:37:10.676199523 +0000 UTC m=+179.292291773" lastFinishedPulling="2026-04-22 17:37:12.710257634 +0000 UTC m=+181.326349899" observedRunningTime="2026-04-22 17:37:13.57197947 +0000 UTC m=+182.188071740" watchObservedRunningTime="2026-04-22 17:37:13.573288659 +0000 UTC m=+182.189380931"
Apr 22 17:37:13.588546 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:13.588486 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tcwpq" podStartSLOduration=2.098161629 podStartE2EDuration="3.588471031s" podCreationTimestamp="2026-04-22 17:37:10 +0000 UTC" firstStartedPulling="2026-04-22 17:37:11.270681154 +0000 UTC m=+179.886773406" lastFinishedPulling="2026-04-22 17:37:12.760990541 +0000 UTC m=+181.377082808" observedRunningTime="2026-04-22 17:37:13.587228257 +0000 UTC m=+182.203320529" watchObservedRunningTime="2026-04-22 17:37:13.588471031 +0000 UTC m=+182.204563308"
Apr 22 17:37:14.510782 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:14.510748 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jnl2w"
Apr 22 17:37:14.562099 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:14.562062 2578 generic.go:358] "Generic (PLEG): container finished" podID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerID="635d9139a0bddc52b6e1c0236a345430d93d98e61e2cf219d88d9fa4bf66be13" exitCode=0
Apr 22 17:37:14.562562 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:14.562142 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"82a81d18-8d61-4e72-b806-41d09ced8f7b","Type":"ContainerDied","Data":"635d9139a0bddc52b6e1c0236a345430d93d98e61e2cf219d88d9fa4bf66be13"}
Apr 22 17:37:17.574821 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:17.574733 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"82a81d18-8d61-4e72-b806-41d09ced8f7b","Type":"ContainerStarted","Data":"005c3c37f57bf4f1edd1d7f81976a0c67534afa815518f9bae585c2d595f73a5"}
Apr 22 17:37:17.574821 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:17.574776 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"82a81d18-8d61-4e72-b806-41d09ced8f7b","Type":"ContainerStarted","Data":"7b0508ea662ab2d545a0231c5c795204aa6209e9c8868dc0dbbf6a7ef3cf16fe"}
Apr 22 17:37:19.583956 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:19.583919 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"82a81d18-8d61-4e72-b806-41d09ced8f7b","Type":"ContainerStarted","Data":"d9ec68d2823af230da9bd4af6d3e0ccd29a6380673449afd72ec5b792a9fd0f1"}
Apr 22 17:37:19.583956 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:19.583955 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"82a81d18-8d61-4e72-b806-41d09ced8f7b","Type":"ContainerStarted","Data":"baa09ddc68920fd30589506f5df23ed530371d6b341fb24f171f837e7354f9cd"}
Apr 22 17:37:19.583956 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:19.583965 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"82a81d18-8d61-4e72-b806-41d09ced8f7b","Type":"ContainerStarted","Data":"829d8ee46d60a7c16bccff336e30765cd00c57c573488b315876e96467fb8896"}
Apr 22 17:37:19.584395 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:19.583973 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"82a81d18-8d61-4e72-b806-41d09ced8f7b","Type":"ContainerStarted","Data":"d5c12295d522d36f17e3b542aaa03387aba30262c3dda4bee7758e31da07e8bb"}
Apr 22 17:37:19.616179 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:19.616133 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.3762297860000001 podStartE2EDuration="7.616118105s" podCreationTimestamp="2026-04-22 17:37:12 +0000 UTC" firstStartedPulling="2026-04-22 17:37:12.86974709 +0000 UTC m=+181.485839343" lastFinishedPulling="2026-04-22 17:37:19.109635412 +0000 UTC m=+187.725727662" observedRunningTime="2026-04-22 17:37:19.613520584 +0000 UTC m=+188.229612871" watchObservedRunningTime="2026-04-22 17:37:19.616118105 +0000 UTC m=+188.232210442"
Apr 22 17:37:20.066095 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:20.066063 2578 patch_prober.go:28] interesting pod/image-registry-875dc5ff8-svc6l container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 17:37:20.066260 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:20.066116 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-875dc5ff8-svc6l" podUID="549cc9d6-4255-4c2c-afbf-754b2d91dcb4" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 17:37:21.493557 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:21.493523 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-875dc5ff8-svc6l"
Apr 22 17:37:22.623941 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:22.623910 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:37:30.552329 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:30.552280 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b"
Apr 22 17:37:30.552329 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:30.552344 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b"
Apr 22 17:37:50.557784 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:50.557755 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b"
Apr 22 17:37:50.561886 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:50.561854 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-59fbcf48df-f4l4b"
Apr 22 17:37:52.676309 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:52.676272 2578 generic.go:358] "Generic (PLEG): container finished" podID="fee226fb-9b7f-463e-b203-8630203ba5f9" containerID="6081b474052a3b5b8eec27a7b219d032cdaf0829fb79f4f4d9345000a6408df9" exitCode=0
Apr 22 17:37:52.676795 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:52.676319 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhd77" event={"ID":"fee226fb-9b7f-463e-b203-8630203ba5f9","Type":"ContainerDied","Data":"6081b474052a3b5b8eec27a7b219d032cdaf0829fb79f4f4d9345000a6408df9"}
Apr 22 17:37:52.676795 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:52.676634 2578 scope.go:117] "RemoveContainer" containerID="6081b474052a3b5b8eec27a7b219d032cdaf0829fb79f4f4d9345000a6408df9"
Apr 22 17:37:53.680220 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:37:53.680186 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-bhd77" event={"ID":"fee226fb-9b7f-463e-b203-8630203ba5f9","Type":"ContainerStarted","Data":"bfcb79157fbefa3b57317ef3e3b7ee1a72b3d8911f3d0cb7f2c43324e0178e2a"}
Apr 22 17:38:02.707290 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:02.707255 2578 generic.go:358] "Generic (PLEG): container finished" podID="de483a7f-5ee9-4932-9834-cd4e6a512d00" containerID="c22a3608bbc2115e532ccd64a6fe4a830e3a7eca159c9462b3de9270fa0373f7" exitCode=0
Apr 22 17:38:02.707686 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:02.707306 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4gpk5" event={"ID":"de483a7f-5ee9-4932-9834-cd4e6a512d00","Type":"ContainerDied","Data":"c22a3608bbc2115e532ccd64a6fe4a830e3a7eca159c9462b3de9270fa0373f7"}
Apr 22 17:38:02.707686 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:02.707644 2578 scope.go:117] "RemoveContainer" containerID="c22a3608bbc2115e532ccd64a6fe4a830e3a7eca159c9462b3de9270fa0373f7"
Apr 22 17:38:03.711557 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:03.711520 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4gpk5" event={"ID":"de483a7f-5ee9-4932-9834-cd4e6a512d00","Type":"ContainerStarted","Data":"f1d208771c8c2040cfc1eead560b4d323742d47d981102c24766272a8e752dc9"}
Apr 22 17:38:12.624291 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:12.624251 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:38:12.640109 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:12.640080 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:38:12.751532 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:12.751504 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:38:22.713951 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:22.713904 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs\") pod \"network-metrics-daemon-fd4w6\" (UID: \"81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8\") " pod="openshift-multus/network-metrics-daemon-fd4w6"
Apr 22 17:38:22.716106 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:22.716085 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8-metrics-certs\") pod \"network-metrics-daemon-fd4w6\" (UID: \"81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8\") " pod="openshift-multus/network-metrics-daemon-fd4w6"
Apr 22 17:38:22.857954 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:22.857920 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kkblr\""
Apr 22 17:38:22.866098 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:22.866069 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fd4w6"
Apr 22 17:38:22.986564 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:22.986539 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fd4w6"]
Apr 22 17:38:22.988855 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:38:22.988827 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81a52de8_56c0_4e61_b0af_a6b6fa4bbfe8.slice/crio-1e396aa9f31dff904b0d9ef0f0c914fc341276406a7499226609fbe2a67b227d WatchSource:0}: Error finding container 1e396aa9f31dff904b0d9ef0f0c914fc341276406a7499226609fbe2a67b227d: Status 404 returned error can't find the container with id 1e396aa9f31dff904b0d9ef0f0c914fc341276406a7499226609fbe2a67b227d
Apr 22 17:38:23.771182 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:23.771131 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fd4w6" event={"ID":"81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8","Type":"ContainerStarted","Data":"1e396aa9f31dff904b0d9ef0f0c914fc341276406a7499226609fbe2a67b227d"}
Apr 22 17:38:24.776538 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:24.776501 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fd4w6" event={"ID":"81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8","Type":"ContainerStarted","Data":"0ef2d943bb339d12a72fd96739c119d20ff48e4622bc3a0e79859a54876209a7"}
Apr 22 17:38:24.776923 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:24.776546 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fd4w6" event={"ID":"81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8","Type":"ContainerStarted","Data":"a1cf036aca10cf8761f3fc9bd167bddbf83c927fe5b2c0d3a6f392d5d3df7804"}
Apr 22 17:38:24.793851 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:24.793791 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fd4w6" podStartSLOduration=251.897755574 podStartE2EDuration="4m12.793770949s" podCreationTimestamp="2026-04-22 17:34:12 +0000 UTC" firstStartedPulling="2026-04-22 17:38:22.990734919 +0000 UTC m=+251.606827184" lastFinishedPulling="2026-04-22 17:38:23.886750302 +0000 UTC m=+252.502842559" observedRunningTime="2026-04-22 17:38:24.791687261 +0000 UTC m=+253.407779534" watchObservedRunningTime="2026-04-22 17:38:24.793770949 +0000 UTC m=+253.409863273"
Apr 22 17:38:30.643844 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.643799 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 17:38:30.644825 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.644688 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="prometheus" containerID="cri-o://7b0508ea662ab2d545a0231c5c795204aa6209e9c8868dc0dbbf6a7ef3cf16fe" gracePeriod=600
Apr 22 17:38:30.644825 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.644688 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="kube-rbac-proxy" containerID="cri-o://baa09ddc68920fd30589506f5df23ed530371d6b341fb24f171f837e7354f9cd" gracePeriod=600
Apr 22 17:38:30.644825 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.644765 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="kube-rbac-proxy-thanos" containerID="cri-o://d9ec68d2823af230da9bd4af6d3e0ccd29a6380673449afd72ec5b792a9fd0f1" gracePeriod=600
Apr 22 17:38:30.645069 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.644853 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="kube-rbac-proxy-web" containerID="cri-o://829d8ee46d60a7c16bccff336e30765cd00c57c573488b315876e96467fb8896" gracePeriod=600
Apr 22 17:38:30.645069 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.644890 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="thanos-sidecar" containerID="cri-o://d5c12295d522d36f17e3b542aaa03387aba30262c3dda4bee7758e31da07e8bb" gracePeriod=600
Apr 22 17:38:30.645069 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.644765 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="config-reloader" containerID="cri-o://005c3c37f57bf4f1edd1d7f81976a0c67534afa815518f9bae585c2d595f73a5" gracePeriod=600
Apr 22 17:38:30.798685 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.798654 2578 generic.go:358] "Generic (PLEG): container finished" podID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerID="d9ec68d2823af230da9bd4af6d3e0ccd29a6380673449afd72ec5b792a9fd0f1" exitCode=0
Apr 22 17:38:30.798685 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.798679 2578 generic.go:358] "Generic (PLEG): container finished" podID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerID="baa09ddc68920fd30589506f5df23ed530371d6b341fb24f171f837e7354f9cd" exitCode=0
Apr 22 17:38:30.798685 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.798685 2578 generic.go:358] "Generic (PLEG): container finished" podID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerID="829d8ee46d60a7c16bccff336e30765cd00c57c573488b315876e96467fb8896" exitCode=0
Apr 22 17:38:30.798685 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.798691 2578 generic.go:358] "Generic (PLEG): container finished" podID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerID="d5c12295d522d36f17e3b542aaa03387aba30262c3dda4bee7758e31da07e8bb" exitCode=0
Apr 22 17:38:30.798685 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.798696 2578 generic.go:358] "Generic (PLEG): container finished" podID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerID="005c3c37f57bf4f1edd1d7f81976a0c67534afa815518f9bae585c2d595f73a5" exitCode=0
Apr 22 17:38:30.798685 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.798701 2578 generic.go:358] "Generic (PLEG): container finished" podID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerID="7b0508ea662ab2d545a0231c5c795204aa6209e9c8868dc0dbbf6a7ef3cf16fe" exitCode=0
Apr 22 17:38:30.799000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.798763 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"82a81d18-8d61-4e72-b806-41d09ced8f7b","Type":"ContainerDied","Data":"d9ec68d2823af230da9bd4af6d3e0ccd29a6380673449afd72ec5b792a9fd0f1"}
Apr 22 17:38:30.799000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.798787 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"82a81d18-8d61-4e72-b806-41d09ced8f7b","Type":"ContainerDied","Data":"baa09ddc68920fd30589506f5df23ed530371d6b341fb24f171f837e7354f9cd"}
Apr 22 17:38:30.799000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.798798 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"82a81d18-8d61-4e72-b806-41d09ced8f7b","Type":"ContainerDied","Data":"829d8ee46d60a7c16bccff336e30765cd00c57c573488b315876e96467fb8896"}
Apr 22 17:38:30.799000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.798806 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"82a81d18-8d61-4e72-b806-41d09ced8f7b","Type":"ContainerDied","Data":"d5c12295d522d36f17e3b542aaa03387aba30262c3dda4bee7758e31da07e8bb"}
Apr 22 17:38:30.799000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.798816 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"82a81d18-8d61-4e72-b806-41d09ced8f7b","Type":"ContainerDied","Data":"005c3c37f57bf4f1edd1d7f81976a0c67534afa815518f9bae585c2d595f73a5"}
Apr 22 17:38:30.799000 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.798825 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"82a81d18-8d61-4e72-b806-41d09ced8f7b","Type":"ContainerDied","Data":"7b0508ea662ab2d545a0231c5c795204aa6209e9c8868dc0dbbf6a7ef3cf16fe"}
Apr 22 17:38:30.884639 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.884605 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:38:30.986534 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.986503 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"82a81d18-8d61-4e72-b806-41d09ced8f7b\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") "
Apr 22 17:38:30.986534 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.986544 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"82a81d18-8d61-4e72-b806-41d09ced8f7b\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") "
Apr 22 17:38:30.986778 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.986574 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-prometheus-trusted-ca-bundle\") pod \"82a81d18-8d61-4e72-b806-41d09ced8f7b\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") "
Apr 22 17:38:30.986778 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.986605 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-configmap-serving-certs-ca-bundle\") pod \"82a81d18-8d61-4e72-b806-41d09ced8f7b\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") "
Apr 22 17:38:30.986778 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.986632 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/82a81d18-8d61-4e72-b806-41d09ced8f7b-tls-assets\") pod \"82a81d18-8d61-4e72-b806-41d09ced8f7b\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") "
Apr 22 17:38:30.986778 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.986661 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-thanos-prometheus-http-client-file\") pod \"82a81d18-8d61-4e72-b806-41d09ced8f7b\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") "
Apr 22 17:38:30.986984 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.986900 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-grpc-tls\") pod \"82a81d18-8d61-4e72-b806-41d09ced8f7b\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") "
Apr 22 17:38:30.986984 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.986958 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-prometheus-k8s-rulefiles-0\") pod \"82a81d18-8d61-4e72-b806-41d09ced8f7b\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") "
Apr 22 17:38:30.987098 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.987001 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-config\") pod \"82a81d18-8d61-4e72-b806-41d09ced8f7b\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") "
Apr 22 17:38:30.987098 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.987040 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/82a81d18-8d61-4e72-b806-41d09ced8f7b-prometheus-k8s-db\") pod \"82a81d18-8d61-4e72-b806-41d09ced8f7b\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") "
Apr 22 17:38:30.987098 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.987060 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "82a81d18-8d61-4e72-b806-41d09ced8f7b" (UID: "82a81d18-8d61-4e72-b806-41d09ced8f7b"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:38:30.987098 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.987086 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-kube-rbac-proxy\") pod \"82a81d18-8d61-4e72-b806-41d09ced8f7b\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") "
Apr 22 17:38:30.987288 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.987066 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "82a81d18-8d61-4e72-b806-41d09ced8f7b" (UID: "82a81d18-8d61-4e72-b806-41d09ced8f7b"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:38:30.988222 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.987795 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-metrics-client-certs\") pod \"82a81d18-8d61-4e72-b806-41d09ced8f7b\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") "
Apr 22 17:38:30.988222 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.987847 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-configmap-metrics-client-ca\") pod \"82a81d18-8d61-4e72-b806-41d09ced8f7b\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") "
Apr 22 17:38:30.988222 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.987890 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-prometheus-k8s-tls\") pod \"82a81d18-8d61-4e72-b806-41d09ced8f7b\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") "
Apr 22 17:38:30.988222 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.987939 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-configmap-kubelet-serving-ca-bundle\") pod \"82a81d18-8d61-4e72-b806-41d09ced8f7b\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") "
Apr 22 17:38:30.988222 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.987987 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-web-config\") pod \"82a81d18-8d61-4e72-b806-41d09ced8f7b\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") "
Apr 22 17:38:30.988222 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.988022 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw5m5\" (UniqueName: \"kubernetes.io/projected/82a81d18-8d61-4e72-b806-41d09ced8f7b-kube-api-access-sw5m5\") pod \"82a81d18-8d61-4e72-b806-41d09ced8f7b\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") "
Apr 22 17:38:30.988222 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.988074 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/82a81d18-8d61-4e72-b806-41d09ced8f7b-config-out\") pod \"82a81d18-8d61-4e72-b806-41d09ced8f7b\" (UID: \"82a81d18-8d61-4e72-b806-41d09ced8f7b\") "
Apr 22 17:38:30.989480 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.989450 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82a81d18-8d61-4e72-b806-41d09ced8f7b-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "82a81d18-8d61-4e72-b806-41d09ced8f7b" (UID: "82a81d18-8d61-4e72-b806-41d09ced8f7b"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:38:30.989612 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.989562 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "82a81d18-8d61-4e72-b806-41d09ced8f7b" (UID: "82a81d18-8d61-4e72-b806-41d09ced8f7b"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:38:30.990008 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.989979 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "82a81d18-8d61-4e72-b806-41d09ced8f7b" (UID: "82a81d18-8d61-4e72-b806-41d09ced8f7b"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:38:30.990101 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.990033 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82a81d18-8d61-4e72-b806-41d09ced8f7b-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "82a81d18-8d61-4e72-b806-41d09ced8f7b" (UID: "82a81d18-8d61-4e72-b806-41d09ced8f7b"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:38:30.990101 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.990069 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "82a81d18-8d61-4e72-b806-41d09ced8f7b" (UID: "82a81d18-8d61-4e72-b806-41d09ced8f7b"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:38:30.990208 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.990165 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "82a81d18-8d61-4e72-b806-41d09ced8f7b" (UID: "82a81d18-8d61-4e72-b806-41d09ced8f7b"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:38:30.990705 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.990642 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:38:30.990705 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.990674 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-prometheus-trusted-ca-bundle\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:38:30.990705 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.990701 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:38:30.990895 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.990717 2578 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/82a81d18-8d61-4e72-b806-41d09ced8f7b-tls-assets\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:38:30.990895 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.990733 2578 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-thanos-prometheus-http-client-file\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:38:30.990895 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.990729 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-config" (OuterVolumeSpecName: "config") pod "82a81d18-8d61-4e72-b806-41d09ced8f7b" (UID: "82a81d18-8d61-4e72-b806-41d09ced8f7b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:38:30.990895 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.990737 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "82a81d18-8d61-4e72-b806-41d09ced8f7b" (UID: "82a81d18-8d61-4e72-b806-41d09ced8f7b"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:38:30.990895 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.990753 2578 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-grpc-tls\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:38:30.990895 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.990771 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/82a81d18-8d61-4e72-b806-41d09ced8f7b-prometheus-k8s-db\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:38:30.990895 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.990786 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-configmap-metrics-client-ca\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:38:30.991764 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.991618 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod
"82a81d18-8d61-4e72-b806-41d09ced8f7b" (UID: "82a81d18-8d61-4e72-b806-41d09ced8f7b"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:30.994544 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.992038 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "82a81d18-8d61-4e72-b806-41d09ced8f7b" (UID: "82a81d18-8d61-4e72-b806-41d09ced8f7b"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:30.994544 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.992905 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82a81d18-8d61-4e72-b806-41d09ced8f7b-kube-api-access-sw5m5" (OuterVolumeSpecName: "kube-api-access-sw5m5") pod "82a81d18-8d61-4e72-b806-41d09ced8f7b" (UID: "82a81d18-8d61-4e72-b806-41d09ced8f7b"). InnerVolumeSpecName "kube-api-access-sw5m5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:38:30.994823 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.994794 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "82a81d18-8d61-4e72-b806-41d09ced8f7b" (UID: "82a81d18-8d61-4e72-b806-41d09ced8f7b"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:38:30.997036 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.997009 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "82a81d18-8d61-4e72-b806-41d09ced8f7b" (UID: "82a81d18-8d61-4e72-b806-41d09ced8f7b"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:30.998833 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.998661 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "82a81d18-8d61-4e72-b806-41d09ced8f7b" (UID: "82a81d18-8d61-4e72-b806-41d09ced8f7b"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:31.000012 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:30.999990 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82a81d18-8d61-4e72-b806-41d09ced8f7b-config-out" (OuterVolumeSpecName: "config-out") pod "82a81d18-8d61-4e72-b806-41d09ced8f7b" (UID: "82a81d18-8d61-4e72-b806-41d09ced8f7b"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:38:31.005192 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.005172 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-web-config" (OuterVolumeSpecName: "web-config") pod "82a81d18-8d61-4e72-b806-41d09ced8f7b" (UID: "82a81d18-8d61-4e72-b806-41d09ced8f7b"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:31.092154 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.092124 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-prometheus-k8s-tls\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:38:31.092154 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.092153 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:38:31.092323 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.092165 2578 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-web-config\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:38:31.092323 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.092177 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sw5m5\" (UniqueName: \"kubernetes.io/projected/82a81d18-8d61-4e72-b806-41d09ced8f7b-kube-api-access-sw5m5\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:38:31.092323 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.092187 2578 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/82a81d18-8d61-4e72-b806-41d09ced8f7b-config-out\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:38:31.092323 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.092196 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-prometheus-k8s-kube-rbac-proxy-web\") on node 
\"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:38:31.092323 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.092206 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/82a81d18-8d61-4e72-b806-41d09ced8f7b-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:38:31.092323 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.092215 2578 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-config\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:38:31.092323 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.092223 2578 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-kube-rbac-proxy\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:38:31.092323 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.092232 2578 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/82a81d18-8d61-4e72-b806-41d09ced8f7b-secret-metrics-client-certs\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:38:31.805065 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.805026 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"82a81d18-8d61-4e72-b806-41d09ced8f7b","Type":"ContainerDied","Data":"07eb4bbac0eaa3701f04de2eabf1a9e4b31016886923e2a09d86898ebc33473e"} Apr 22 17:38:31.805533 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.805096 2578 scope.go:117] "RemoveContainer" containerID="d9ec68d2823af230da9bd4af6d3e0ccd29a6380673449afd72ec5b792a9fd0f1" Apr 22 17:38:31.805533 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.805181 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:31.813377 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.813356 2578 scope.go:117] "RemoveContainer" containerID="baa09ddc68920fd30589506f5df23ed530371d6b341fb24f171f837e7354f9cd" Apr 22 17:38:31.822182 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.822158 2578 scope.go:117] "RemoveContainer" containerID="829d8ee46d60a7c16bccff336e30765cd00c57c573488b315876e96467fb8896" Apr 22 17:38:31.829119 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.829098 2578 scope.go:117] "RemoveContainer" containerID="d5c12295d522d36f17e3b542aaa03387aba30262c3dda4bee7758e31da07e8bb" Apr 22 17:38:31.829731 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.829707 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:38:31.833300 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.833278 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:38:31.835918 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.835903 2578 scope.go:117] "RemoveContainer" containerID="005c3c37f57bf4f1edd1d7f81976a0c67534afa815518f9bae585c2d595f73a5" Apr 22 17:38:31.842011 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.841995 2578 scope.go:117] "RemoveContainer" containerID="7b0508ea662ab2d545a0231c5c795204aa6209e9c8868dc0dbbf6a7ef3cf16fe" Apr 22 17:38:31.848476 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.848457 2578 scope.go:117] "RemoveContainer" containerID="635d9139a0bddc52b6e1c0236a345430d93d98e61e2cf219d88d9fa4bf66be13" Apr 22 17:38:31.859621 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.859599 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:38:31.859903 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.859889 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" 
containerName="init-config-reloader" Apr 22 17:38:31.859903 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.859903 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="init-config-reloader" Apr 22 17:38:31.860036 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.859912 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="kube-rbac-proxy-thanos" Apr 22 17:38:31.860036 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.859918 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="kube-rbac-proxy-thanos" Apr 22 17:38:31.860036 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.859927 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="config-reloader" Apr 22 17:38:31.860036 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.859932 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="config-reloader" Apr 22 17:38:31.860036 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.859942 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="thanos-sidecar" Apr 22 17:38:31.860036 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.859946 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="thanos-sidecar" Apr 22 17:38:31.860036 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.859959 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="prometheus" Apr 22 17:38:31.860036 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.859964 2578 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="prometheus" Apr 22 17:38:31.860036 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.859970 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="kube-rbac-proxy" Apr 22 17:38:31.860036 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.859975 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="kube-rbac-proxy" Apr 22 17:38:31.860036 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.859981 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="kube-rbac-proxy-web" Apr 22 17:38:31.860036 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.859986 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="kube-rbac-proxy-web" Apr 22 17:38:31.860036 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.860027 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="prometheus" Apr 22 17:38:31.860036 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.860036 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="config-reloader" Apr 22 17:38:31.860036 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.860043 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="thanos-sidecar" Apr 22 17:38:31.860584 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.860050 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="kube-rbac-proxy-web" Apr 22 17:38:31.860584 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.860056 2578 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="kube-rbac-proxy" Apr 22 17:38:31.860584 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.860061 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" containerName="kube-rbac-proxy-thanos" Apr 22 17:38:31.865312 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.865294 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:31.867952 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.867934 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 17:38:31.868184 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.868167 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 17:38:31.868276 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.868259 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 17:38:31.868368 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.868325 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 17:38:31.868505 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.868382 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-g4dp2\"" Apr 22 17:38:31.868505 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.868442 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 17:38:31.868643 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.868505 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 17:38:31.868643 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.868542 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 17:38:31.868741 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.868640 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bsu8iqce299l4\"" Apr 22 17:38:31.868859 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.868843 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 17:38:31.869017 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.869001 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 17:38:31.869086 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.869001 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 17:38:31.871092 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.871073 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 17:38:31.873136 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.873116 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 17:38:31.882716 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.882673 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:38:31.898908 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.898883 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:31.899008 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.898930 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/117ccd15-9480-4395-b6fd-e329836c6891-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:31.899008 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.898988 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:31.899075 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.899024 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/117ccd15-9480-4395-b6fd-e329836c6891-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:31.899075 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.899063 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/117ccd15-9480-4395-b6fd-e329836c6891-config-out\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:31.899136 
ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.899089 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/117ccd15-9480-4395-b6fd-e329836c6891-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:31.899169 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.899140 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:31.899210 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.899170 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:31.899247 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.899230 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/117ccd15-9480-4395-b6fd-e329836c6891-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:31.899297 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.899259 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/117ccd15-9480-4395-b6fd-e329836c6891-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:31.899336 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.899325 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/117ccd15-9480-4395-b6fd-e329836c6891-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:31.899380 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.899364 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:31.899415 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.899382 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmzxp\" (UniqueName: \"kubernetes.io/projected/117ccd15-9480-4395-b6fd-e329836c6891-kube-api-access-xmzxp\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:31.899415 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.899409 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:31.899494 ip-10-0-143-54 kubenswrapper[2578]: I0422 
17:38:31.899464 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-web-config\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:31.899539 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.899497 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:31.899539 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.899518 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-config\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:31.899539 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.899532 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/117ccd15-9480-4395-b6fd-e329836c6891-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:31.955803 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:31.955723 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82a81d18-8d61-4e72-b806-41d09ced8f7b" path="/var/lib/kubelet/pods/82a81d18-8d61-4e72-b806-41d09ced8f7b/volumes" Apr 22 17:38:32.000772 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.000744 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.000772 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.000771 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/117ccd15-9480-4395-b6fd-e329836c6891-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.000943 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.000794 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.000943 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.000812 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/117ccd15-9480-4395-b6fd-e329836c6891-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.001044 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.000959 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/117ccd15-9480-4395-b6fd-e329836c6891-config-out\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.001044 
ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.001003 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/117ccd15-9480-4395-b6fd-e329836c6891-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.001140 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.001042 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.001140 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.001073 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.001140 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.001115 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/117ccd15-9480-4395-b6fd-e329836c6891-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.001288 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.001140 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/117ccd15-9480-4395-b6fd-e329836c6891-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.001288 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.001188 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/117ccd15-9480-4395-b6fd-e329836c6891-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.001288 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.001205 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/117ccd15-9480-4395-b6fd-e329836c6891-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.001288 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.001227 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.001288 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.001255 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmzxp\" (UniqueName: \"kubernetes.io/projected/117ccd15-9480-4395-b6fd-e329836c6891-kube-api-access-xmzxp\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.001288 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.001282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-secret-metrics-client-certs\") 
pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.001598 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.001310 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-web-config\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.001598 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.001373 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.001598 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.001399 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-config\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.001598 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.001450 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/117ccd15-9480-4395-b6fd-e329836c6891-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.001805 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.001630 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/117ccd15-9480-4395-b6fd-e329836c6891-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.002351 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.002325 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/117ccd15-9480-4395-b6fd-e329836c6891-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.003590 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.003275 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/117ccd15-9480-4395-b6fd-e329836c6891-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.003590 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.003529 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/117ccd15-9480-4395-b6fd-e329836c6891-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.004294 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.003889 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/117ccd15-9480-4395-b6fd-e329836c6891-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.005403 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.005363 2578 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/117ccd15-9480-4395-b6fd-e329836c6891-config-out\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.005584 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.005561 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.005646 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.005564 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/117ccd15-9480-4395-b6fd-e329836c6891-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.005747 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.005729 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.005792 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.005763 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.005951 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.005932 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.006032 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.005994 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.006570 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.006550 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.006647 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.006579 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.006711 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.006692 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-config\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.007192 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.007177 
2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/117ccd15-9480-4395-b6fd-e329836c6891-web-config\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.012666 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.012649 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmzxp\" (UniqueName: \"kubernetes.io/projected/117ccd15-9480-4395-b6fd-e329836c6891-kube-api-access-xmzxp\") pod \"prometheus-k8s-0\" (UID: \"117ccd15-9480-4395-b6fd-e329836c6891\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.175747 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.175717 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:38:32.300235 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.300212 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:38:32.303036 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:38:32.303012 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod117ccd15_9480_4395_b6fd_e329836c6891.slice/crio-0c1c85b70aab925912fdeb87fe5c5f160be277869db31a37267a8aa025ef55ea WatchSource:0}: Error finding container 0c1c85b70aab925912fdeb87fe5c5f160be277869db31a37267a8aa025ef55ea: Status 404 returned error can't find the container with id 0c1c85b70aab925912fdeb87fe5c5f160be277869db31a37267a8aa025ef55ea Apr 22 17:38:32.809491 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.809460 2578 generic.go:358] "Generic (PLEG): container finished" podID="117ccd15-9480-4395-b6fd-e329836c6891" containerID="73b61c472b9e361c7ec1067a06f3fa54a5d120c68e6b55bc8489f262472d5155" exitCode=0 Apr 22 17:38:32.809927 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.809557 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"117ccd15-9480-4395-b6fd-e329836c6891","Type":"ContainerDied","Data":"73b61c472b9e361c7ec1067a06f3fa54a5d120c68e6b55bc8489f262472d5155"} Apr 22 17:38:32.809927 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:32.809598 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"117ccd15-9480-4395-b6fd-e329836c6891","Type":"ContainerStarted","Data":"0c1c85b70aab925912fdeb87fe5c5f160be277869db31a37267a8aa025ef55ea"} Apr 22 17:38:33.815708 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:33.815667 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"117ccd15-9480-4395-b6fd-e329836c6891","Type":"ContainerStarted","Data":"0270c50d50840a11801f027bf0a9466c4fcde44f7677d3404109ea38d1dedc02"} Apr 22 17:38:33.815708 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:33.815709 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"117ccd15-9480-4395-b6fd-e329836c6891","Type":"ContainerStarted","Data":"52720e343157214997520bc4af582395657ad8a1d093c49fc0fb2d7fb06366fa"} Apr 22 17:38:33.816199 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:33.815721 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"117ccd15-9480-4395-b6fd-e329836c6891","Type":"ContainerStarted","Data":"e0882392cbfdf404790da29e453aa0ddd1ffca4a6d3142b0f259032e8dc7fbf5"} Apr 22 17:38:33.816199 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:33.815751 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"117ccd15-9480-4395-b6fd-e329836c6891","Type":"ContainerStarted","Data":"5c71b97e3b090f4d6fe0f73c6bd74d9a00d7111954ff40515659e23e87574ae7"} Apr 22 17:38:33.816199 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:33.815761 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"117ccd15-9480-4395-b6fd-e329836c6891","Type":"ContainerStarted","Data":"8281718e5b8308a2873856830b7ec6d35f6aa69b8da5f54362e70e8d65b14d21"} Apr 22 17:38:33.816199 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:33.815774 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"117ccd15-9480-4395-b6fd-e329836c6891","Type":"ContainerStarted","Data":"608c21d841c68173043d6c857a7a13021b1a1d4d3640383de0ca2ab4cb72d8f6"} Apr 22 17:38:33.851249 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:33.851190 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.851174285 podStartE2EDuration="2.851174285s" podCreationTimestamp="2026-04-22 17:38:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:38:33.84913197 +0000 UTC m=+262.465224268" watchObservedRunningTime="2026-04-22 17:38:33.851174285 +0000 UTC m=+262.467266556" Apr 22 17:38:37.176020 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:38:37.175984 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:11.864182 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:39:11.864136 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovn-acl-logging/0.log" Apr 22 17:39:11.865037 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:39:11.864928 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovn-acl-logging/0.log" Apr 22 17:39:11.873683 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:39:11.873666 2578 kubelet.go:1628] "Image garbage collection 
succeeded" Apr 22 17:39:32.175995 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:39:32.175957 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:32.191142 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:39:32.191118 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:39:33.006508 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:39:33.006479 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:43:20.354791 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.354707 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-959c974c-b9kdg"] Apr 22 17:43:20.358215 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.358193 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-959c974c-b9kdg" Apr 22 17:43:20.360986 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.360965 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 22 17:43:20.361168 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.361150 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 22 17:43:20.362135 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.362118 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 22 17:43:20.362346 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.362329 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 22 17:43:20.362438 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.362348 2578 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:43:20.362602 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.362587 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-9qxl4\"" Apr 22 17:43:20.376191 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.376169 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-959c974c-b9kdg"] Apr 22 17:43:20.409449 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.409400 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/47424af5-704f-4afe-9c94-1c3dcbd3c65d-manager-config\") pod \"lws-controller-manager-959c974c-b9kdg\" (UID: \"47424af5-704f-4afe-9c94-1c3dcbd3c65d\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-b9kdg" Apr 22 17:43:20.409557 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.409476 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47424af5-704f-4afe-9c94-1c3dcbd3c65d-cert\") pod \"lws-controller-manager-959c974c-b9kdg\" (UID: \"47424af5-704f-4afe-9c94-1c3dcbd3c65d\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-b9kdg" Apr 22 17:43:20.409557 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.409532 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9fb7\" (UniqueName: \"kubernetes.io/projected/47424af5-704f-4afe-9c94-1c3dcbd3c65d-kube-api-access-g9fb7\") pod \"lws-controller-manager-959c974c-b9kdg\" (UID: \"47424af5-704f-4afe-9c94-1c3dcbd3c65d\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-b9kdg" Apr 22 17:43:20.409646 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.409584 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/47424af5-704f-4afe-9c94-1c3dcbd3c65d-metrics-cert\") pod \"lws-controller-manager-959c974c-b9kdg\" (UID: \"47424af5-704f-4afe-9c94-1c3dcbd3c65d\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-b9kdg" Apr 22 17:43:20.510499 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.510469 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/47424af5-704f-4afe-9c94-1c3dcbd3c65d-metrics-cert\") pod \"lws-controller-manager-959c974c-b9kdg\" (UID: \"47424af5-704f-4afe-9c94-1c3dcbd3c65d\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-b9kdg" Apr 22 17:43:20.510662 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.510563 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/47424af5-704f-4afe-9c94-1c3dcbd3c65d-manager-config\") pod \"lws-controller-manager-959c974c-b9kdg\" (UID: \"47424af5-704f-4afe-9c94-1c3dcbd3c65d\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-b9kdg" Apr 22 17:43:20.510662 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.510581 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47424af5-704f-4afe-9c94-1c3dcbd3c65d-cert\") pod \"lws-controller-manager-959c974c-b9kdg\" (UID: \"47424af5-704f-4afe-9c94-1c3dcbd3c65d\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-b9kdg" Apr 22 17:43:20.510662 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.510612 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9fb7\" (UniqueName: \"kubernetes.io/projected/47424af5-704f-4afe-9c94-1c3dcbd3c65d-kube-api-access-g9fb7\") pod \"lws-controller-manager-959c974c-b9kdg\" (UID: 
\"47424af5-704f-4afe-9c94-1c3dcbd3c65d\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-b9kdg" Apr 22 17:43:20.511266 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.511240 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/47424af5-704f-4afe-9c94-1c3dcbd3c65d-manager-config\") pod \"lws-controller-manager-959c974c-b9kdg\" (UID: \"47424af5-704f-4afe-9c94-1c3dcbd3c65d\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-b9kdg" Apr 22 17:43:20.513118 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.513098 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47424af5-704f-4afe-9c94-1c3dcbd3c65d-cert\") pod \"lws-controller-manager-959c974c-b9kdg\" (UID: \"47424af5-704f-4afe-9c94-1c3dcbd3c65d\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-b9kdg" Apr 22 17:43:20.513409 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.513389 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/47424af5-704f-4afe-9c94-1c3dcbd3c65d-metrics-cert\") pod \"lws-controller-manager-959c974c-b9kdg\" (UID: \"47424af5-704f-4afe-9c94-1c3dcbd3c65d\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-b9kdg" Apr 22 17:43:20.519165 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.519146 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9fb7\" (UniqueName: \"kubernetes.io/projected/47424af5-704f-4afe-9c94-1c3dcbd3c65d-kube-api-access-g9fb7\") pod \"lws-controller-manager-959c974c-b9kdg\" (UID: \"47424af5-704f-4afe-9c94-1c3dcbd3c65d\") " pod="openshift-lws-operator/lws-controller-manager-959c974c-b9kdg" Apr 22 17:43:20.668210 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.668125 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-959c974c-b9kdg" Apr 22 17:43:20.796187 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.796153 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-959c974c-b9kdg"] Apr 22 17:43:20.799327 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:43:20.799294 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47424af5_704f_4afe_9c94_1c3dcbd3c65d.slice/crio-e95a33001f69ca8143bf42b16625297a48ef860265e3a675dd47620f16c50cad WatchSource:0}: Error finding container e95a33001f69ca8143bf42b16625297a48ef860265e3a675dd47620f16c50cad: Status 404 returned error can't find the container with id e95a33001f69ca8143bf42b16625297a48ef860265e3a675dd47620f16c50cad Apr 22 17:43:20.801154 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:20.801137 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 17:43:21.634825 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:21.634792 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-959c974c-b9kdg" event={"ID":"47424af5-704f-4afe-9c94-1c3dcbd3c65d","Type":"ContainerStarted","Data":"e95a33001f69ca8143bf42b16625297a48ef860265e3a675dd47620f16c50cad"} Apr 22 17:43:23.646477 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:23.646437 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-959c974c-b9kdg" event={"ID":"47424af5-704f-4afe-9c94-1c3dcbd3c65d","Type":"ContainerStarted","Data":"e2d8c33d46fa302daf6a0b90c83155ec009f06d7e529a961b74023c8b867f8b7"} Apr 22 17:43:23.646861 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:23.646604 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-959c974c-b9kdg" Apr 22 17:43:23.666922 
ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:23.666841 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-959c974c-b9kdg" podStartSLOduration=1.057203784 podStartE2EDuration="3.666829285s" podCreationTimestamp="2026-04-22 17:43:20 +0000 UTC" firstStartedPulling="2026-04-22 17:43:20.801267783 +0000 UTC m=+549.417360032" lastFinishedPulling="2026-04-22 17:43:23.410893282 +0000 UTC m=+552.026985533" observedRunningTime="2026-04-22 17:43:23.665532426 +0000 UTC m=+552.281624698" watchObservedRunningTime="2026-04-22 17:43:23.666829285 +0000 UTC m=+552.282921556" Apr 22 17:43:33.455987 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.455946 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl"] Apr 22 17:43:33.463113 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.463086 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.466084 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.466055 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 22 17:43:33.466243 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.466097 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-dtcs4\"" Apr 22 17:43:33.471738 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.471713 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl"] Apr 22 17:43:33.513226 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.513197 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/47da7e04-76fe-40d1-ae27-6d0cd06117e7-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.513320 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.513238 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/47da7e04-76fe-40d1-ae27-6d0cd06117e7-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.513320 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.513257 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/47da7e04-76fe-40d1-ae27-6d0cd06117e7-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.513400 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.513321 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/47da7e04-76fe-40d1-ae27-6d0cd06117e7-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.513400 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.513377 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/47da7e04-76fe-40d1-ae27-6d0cd06117e7-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.513503 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.513404 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/47da7e04-76fe-40d1-ae27-6d0cd06117e7-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.513503 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.513483 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/47da7e04-76fe-40d1-ae27-6d0cd06117e7-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.513570 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.513511 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7prdg\" (UniqueName: \"kubernetes.io/projected/47da7e04-76fe-40d1-ae27-6d0cd06117e7-kube-api-access-7prdg\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.513570 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.513534 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/47da7e04-76fe-40d1-ae27-6d0cd06117e7-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.614377 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.614349 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/47da7e04-76fe-40d1-ae27-6d0cd06117e7-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.614494 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.614402 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/47da7e04-76fe-40d1-ae27-6d0cd06117e7-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.614494 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.614461 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/47da7e04-76fe-40d1-ae27-6d0cd06117e7-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.614578 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.614503 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/47da7e04-76fe-40d1-ae27-6d0cd06117e7-istiod-ca-cert\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.614578 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.614535 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7prdg\" (UniqueName: \"kubernetes.io/projected/47da7e04-76fe-40d1-ae27-6d0cd06117e7-kube-api-access-7prdg\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.614695 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.614675 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/47da7e04-76fe-40d1-ae27-6d0cd06117e7-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.614757 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.614728 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/47da7e04-76fe-40d1-ae27-6d0cd06117e7-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.614757 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.614739 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/47da7e04-76fe-40d1-ae27-6d0cd06117e7-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.614854 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.614797 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/47da7e04-76fe-40d1-ae27-6d0cd06117e7-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.614854 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.614804 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/47da7e04-76fe-40d1-ae27-6d0cd06117e7-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.614854 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.614850 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/47da7e04-76fe-40d1-ae27-6d0cd06117e7-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.615041 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.615024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/47da7e04-76fe-40d1-ae27-6d0cd06117e7-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.615118 ip-10-0-143-54 
kubenswrapper[2578]: I0422 17:43:33.615098 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/47da7e04-76fe-40d1-ae27-6d0cd06117e7-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.615399 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.615382 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/47da7e04-76fe-40d1-ae27-6d0cd06117e7-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.616806 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.616785 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/47da7e04-76fe-40d1-ae27-6d0cd06117e7-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.616942 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.616874 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/47da7e04-76fe-40d1-ae27-6d0cd06117e7-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.621998 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.621977 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/47da7e04-76fe-40d1-ae27-6d0cd06117e7-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.622078 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.621978 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7prdg\" (UniqueName: \"kubernetes.io/projected/47da7e04-76fe-40d1-ae27-6d0cd06117e7-kube-api-access-7prdg\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-f9jrl\" (UID: \"47da7e04-76fe-40d1-ae27-6d0cd06117e7\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.777187 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.777165 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:33.900764 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:33.900741 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl"] Apr 22 17:43:33.904005 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:43:33.903976 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47da7e04_76fe_40d1_ae27_6d0cd06117e7.slice/crio-52b528d144d4c834594dfdc0d55278e040d7eac3d57bb70b47fa8a58696a9476 WatchSource:0}: Error finding container 52b528d144d4c834594dfdc0d55278e040d7eac3d57bb70b47fa8a58696a9476: Status 404 returned error can't find the container with id 52b528d144d4c834594dfdc0d55278e040d7eac3d57bb70b47fa8a58696a9476 Apr 22 17:43:34.651562 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:34.651536 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-959c974c-b9kdg" Apr 22 
17:43:34.682171 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:34.682136 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" event={"ID":"47da7e04-76fe-40d1-ae27-6d0cd06117e7","Type":"ContainerStarted","Data":"52b528d144d4c834594dfdc0d55278e040d7eac3d57bb70b47fa8a58696a9476"} Apr 22 17:43:36.501296 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:36.501261 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 22 17:43:36.501583 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:36.501332 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 22 17:43:36.501583 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:36.501358 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 22 17:43:36.689238 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:36.689203 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" event={"ID":"47da7e04-76fe-40d1-ae27-6d0cd06117e7","Type":"ContainerStarted","Data":"fa41b8bc0a99efcdf45554e932c092f88bdd236bcc41db2fd36bcca364c9087c"} Apr 22 17:43:36.709436 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:36.709031 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" podStartSLOduration=1.113867519 podStartE2EDuration="3.709011847s" podCreationTimestamp="2026-04-22 17:43:33 +0000 UTC" firstStartedPulling="2026-04-22 17:43:33.90588459 +0000 UTC m=+562.521976844" lastFinishedPulling="2026-04-22 
17:43:36.501028913 +0000 UTC m=+565.117121172" observedRunningTime="2026-04-22 17:43:36.708205127 +0000 UTC m=+565.324297409" watchObservedRunningTime="2026-04-22 17:43:36.709011847 +0000 UTC m=+565.325104118" Apr 22 17:43:36.778142 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:36.778071 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:37.781943 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:37.781917 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:38.694777 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:38.694743 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:38.695602 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:38.695582 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-f9jrl" Apr 22 17:43:56.886790 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:56.886757 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-7prvb"] Apr 22 17:43:56.889944 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:56.889928 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-7prvb" Apr 22 17:43:56.892663 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:56.892634 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 22 17:43:56.892663 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:56.892656 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 17:43:56.892818 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:56.892694 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 17:43:56.893653 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:56.893639 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-7jrzv\"" Apr 22 17:43:56.899457 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:56.899433 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-7prvb"] Apr 22 17:43:57.009190 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:57.009159 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr5kx\" (UniqueName: \"kubernetes.io/projected/c5620ed5-9eba-4be3-a0c8-2f2510b07893-kube-api-access-zr5kx\") pod \"dns-operator-controller-manager-844548ff4c-7prvb\" (UID: \"c5620ed5-9eba-4be3-a0c8-2f2510b07893\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-7prvb" Apr 22 17:43:57.110488 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:57.110460 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zr5kx\" (UniqueName: \"kubernetes.io/projected/c5620ed5-9eba-4be3-a0c8-2f2510b07893-kube-api-access-zr5kx\") pod \"dns-operator-controller-manager-844548ff4c-7prvb\" 
(UID: \"c5620ed5-9eba-4be3-a0c8-2f2510b07893\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-7prvb" Apr 22 17:43:57.118801 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:57.118766 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr5kx\" (UniqueName: \"kubernetes.io/projected/c5620ed5-9eba-4be3-a0c8-2f2510b07893-kube-api-access-zr5kx\") pod \"dns-operator-controller-manager-844548ff4c-7prvb\" (UID: \"c5620ed5-9eba-4be3-a0c8-2f2510b07893\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-7prvb" Apr 22 17:43:57.201073 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:57.201007 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-7prvb" Apr 22 17:43:57.324409 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:57.324343 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-7prvb"] Apr 22 17:43:57.326700 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:43:57.326672 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5620ed5_9eba_4be3_a0c8_2f2510b07893.slice/crio-041a8cc94b0bd94ec18243bb8dc132285eb01949446e0bb3ec8191a29ead3c8f WatchSource:0}: Error finding container 041a8cc94b0bd94ec18243bb8dc132285eb01949446e0bb3ec8191a29ead3c8f: Status 404 returned error can't find the container with id 041a8cc94b0bd94ec18243bb8dc132285eb01949446e0bb3ec8191a29ead3c8f Apr 22 17:43:57.749147 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:43:57.749114 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-7prvb" event={"ID":"c5620ed5-9eba-4be3-a0c8-2f2510b07893","Type":"ContainerStarted","Data":"041a8cc94b0bd94ec18243bb8dc132285eb01949446e0bb3ec8191a29ead3c8f"} Apr 22 17:44:00.760843 ip-10-0-143-54 kubenswrapper[2578]: I0422 
17:44:00.760807 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-7prvb" event={"ID":"c5620ed5-9eba-4be3-a0c8-2f2510b07893","Type":"ContainerStarted","Data":"9c314710ed645eabd109e908f48e09a2306ccdfaa5a8845453152aae1bb13e42"} Apr 22 17:44:00.761213 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:00.760870 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-7prvb" Apr 22 17:44:00.779714 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:00.779672 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-7prvb" podStartSLOduration=2.06375533 podStartE2EDuration="4.779660704s" podCreationTimestamp="2026-04-22 17:43:56 +0000 UTC" firstStartedPulling="2026-04-22 17:43:57.328766397 +0000 UTC m=+585.944858647" lastFinishedPulling="2026-04-22 17:44:00.044671768 +0000 UTC m=+588.660764021" observedRunningTime="2026-04-22 17:44:00.778731145 +0000 UTC m=+589.394823416" watchObservedRunningTime="2026-04-22 17:44:00.779660704 +0000 UTC m=+589.395752975" Apr 22 17:44:06.951773 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:06.951736 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-zfhk4"] Apr 22 17:44:06.954955 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:06.954940 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zfhk4" Apr 22 17:44:06.957718 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:06.957696 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 22 17:44:06.957718 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:06.957712 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 22 17:44:06.957864 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:06.957712 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-64xhv\"" Apr 22 17:44:06.962622 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:06.962602 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-zfhk4"] Apr 22 17:44:06.994839 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:06.994816 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5f5c8748-e846-47b6-9305-a3dd7cf0ccb1-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-zfhk4\" (UID: \"5f5c8748-e846-47b6-9305-a3dd7cf0ccb1\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zfhk4" Apr 22 17:44:06.994939 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:06.994855 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f5c8748-e846-47b6-9305-a3dd7cf0ccb1-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-zfhk4\" (UID: \"5f5c8748-e846-47b6-9305-a3dd7cf0ccb1\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zfhk4" Apr 22 17:44:06.994939 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:06.994929 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-vlbm9\" (UniqueName: \"kubernetes.io/projected/5f5c8748-e846-47b6-9305-a3dd7cf0ccb1-kube-api-access-vlbm9\") pod \"kuadrant-console-plugin-6c886788f8-zfhk4\" (UID: \"5f5c8748-e846-47b6-9305-a3dd7cf0ccb1\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zfhk4" Apr 22 17:44:07.095735 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:07.095710 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5f5c8748-e846-47b6-9305-a3dd7cf0ccb1-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-zfhk4\" (UID: \"5f5c8748-e846-47b6-9305-a3dd7cf0ccb1\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zfhk4" Apr 22 17:44:07.095851 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:07.095741 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f5c8748-e846-47b6-9305-a3dd7cf0ccb1-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-zfhk4\" (UID: \"5f5c8748-e846-47b6-9305-a3dd7cf0ccb1\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zfhk4" Apr 22 17:44:07.095851 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:07.095774 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlbm9\" (UniqueName: \"kubernetes.io/projected/5f5c8748-e846-47b6-9305-a3dd7cf0ccb1-kube-api-access-vlbm9\") pod \"kuadrant-console-plugin-6c886788f8-zfhk4\" (UID: \"5f5c8748-e846-47b6-9305-a3dd7cf0ccb1\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zfhk4" Apr 22 17:44:07.095938 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:44:07.095867 2578 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 22 17:44:07.095938 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:44:07.095927 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5f5c8748-e846-47b6-9305-a3dd7cf0ccb1-plugin-serving-cert podName:5f5c8748-e846-47b6-9305-a3dd7cf0ccb1 nodeName:}" failed. No retries permitted until 2026-04-22 17:44:07.595907836 +0000 UTC m=+596.212000095 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/5f5c8748-e846-47b6-9305-a3dd7cf0ccb1-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-zfhk4" (UID: "5f5c8748-e846-47b6-9305-a3dd7cf0ccb1") : secret "plugin-serving-cert" not found Apr 22 17:44:07.096416 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:07.096396 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5f5c8748-e846-47b6-9305-a3dd7cf0ccb1-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-zfhk4\" (UID: \"5f5c8748-e846-47b6-9305-a3dd7cf0ccb1\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zfhk4" Apr 22 17:44:07.104704 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:07.104678 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlbm9\" (UniqueName: \"kubernetes.io/projected/5f5c8748-e846-47b6-9305-a3dd7cf0ccb1-kube-api-access-vlbm9\") pod \"kuadrant-console-plugin-6c886788f8-zfhk4\" (UID: \"5f5c8748-e846-47b6-9305-a3dd7cf0ccb1\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zfhk4" Apr 22 17:44:07.599735 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:07.599702 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f5c8748-e846-47b6-9305-a3dd7cf0ccb1-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-zfhk4\" (UID: \"5f5c8748-e846-47b6-9305-a3dd7cf0ccb1\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zfhk4" Apr 22 17:44:07.602098 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:07.602074 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f5c8748-e846-47b6-9305-a3dd7cf0ccb1-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-zfhk4\" (UID: \"5f5c8748-e846-47b6-9305-a3dd7cf0ccb1\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zfhk4" Apr 22 17:44:07.872645 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:07.872568 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zfhk4" Apr 22 17:44:07.994765 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:07.994644 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-zfhk4"] Apr 22 17:44:07.997360 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:44:07.997332 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f5c8748_e846_47b6_9305_a3dd7cf0ccb1.slice/crio-f0a0556df59c11b01386403177dd1c5860c6779849d828070c251c7d88bf3bd5 WatchSource:0}: Error finding container f0a0556df59c11b01386403177dd1c5860c6779849d828070c251c7d88bf3bd5: Status 404 returned error can't find the container with id f0a0556df59c11b01386403177dd1c5860c6779849d828070c251c7d88bf3bd5 Apr 22 17:44:08.789791 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:08.789754 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zfhk4" event={"ID":"5f5c8748-e846-47b6-9305-a3dd7cf0ccb1","Type":"ContainerStarted","Data":"f0a0556df59c11b01386403177dd1c5860c6779849d828070c251c7d88bf3bd5"} Apr 22 17:44:11.767358 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:11.767320 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-7prvb" Apr 22 17:44:12.552998 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:12.552972 2578 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovn-acl-logging/0.log" Apr 22 17:44:12.553182 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:12.552972 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovn-acl-logging/0.log" Apr 22 17:44:12.806515 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:12.806448 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zfhk4" event={"ID":"5f5c8748-e846-47b6-9305-a3dd7cf0ccb1","Type":"ContainerStarted","Data":"b09e1b84a2017d32b0cd756b12b97486d485718f79ed1130e1126e38dc6962ae"} Apr 22 17:44:12.825161 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:12.825112 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zfhk4" podStartSLOduration=2.223647146 podStartE2EDuration="6.825097607s" podCreationTimestamp="2026-04-22 17:44:06 +0000 UTC" firstStartedPulling="2026-04-22 17:44:07.998701477 +0000 UTC m=+596.614793733" lastFinishedPulling="2026-04-22 17:44:12.600151944 +0000 UTC m=+601.216244194" observedRunningTime="2026-04-22 17:44:12.824313184 +0000 UTC m=+601.440405479" watchObservedRunningTime="2026-04-22 17:44:12.825097607 +0000 UTC m=+601.441189881" Apr 22 17:44:51.458591 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:51.458558 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jmk42"] Apr 22 17:44:51.461949 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:51.461933 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-jmk42" Apr 22 17:44:51.464463 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:51.464411 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 22 17:44:51.469404 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:51.469385 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jmk42"] Apr 22 17:44:51.546382 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:51.546353 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvb7n\" (UniqueName: \"kubernetes.io/projected/87b9fd07-06b4-4881-8e09-a2468e5a9b47-kube-api-access-pvb7n\") pod \"limitador-limitador-64c8f475fb-jmk42\" (UID: \"87b9fd07-06b4-4881-8e09-a2468e5a9b47\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jmk42" Apr 22 17:44:51.546554 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:51.546478 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/87b9fd07-06b4-4881-8e09-a2468e5a9b47-config-file\") pod \"limitador-limitador-64c8f475fb-jmk42\" (UID: \"87b9fd07-06b4-4881-8e09-a2468e5a9b47\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jmk42" Apr 22 17:44:51.559152 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:51.559124 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jmk42"] Apr 22 17:44:51.647410 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:51.647371 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/87b9fd07-06b4-4881-8e09-a2468e5a9b47-config-file\") pod \"limitador-limitador-64c8f475fb-jmk42\" (UID: \"87b9fd07-06b4-4881-8e09-a2468e5a9b47\") " 
pod="kuadrant-system/limitador-limitador-64c8f475fb-jmk42" Apr 22 17:44:51.647572 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:51.647442 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pvb7n\" (UniqueName: \"kubernetes.io/projected/87b9fd07-06b4-4881-8e09-a2468e5a9b47-kube-api-access-pvb7n\") pod \"limitador-limitador-64c8f475fb-jmk42\" (UID: \"87b9fd07-06b4-4881-8e09-a2468e5a9b47\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jmk42" Apr 22 17:44:51.648017 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:51.647996 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/87b9fd07-06b4-4881-8e09-a2468e5a9b47-config-file\") pod \"limitador-limitador-64c8f475fb-jmk42\" (UID: \"87b9fd07-06b4-4881-8e09-a2468e5a9b47\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jmk42" Apr 22 17:44:51.655884 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:51.655860 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvb7n\" (UniqueName: \"kubernetes.io/projected/87b9fd07-06b4-4881-8e09-a2468e5a9b47-kube-api-access-pvb7n\") pod \"limitador-limitador-64c8f475fb-jmk42\" (UID: \"87b9fd07-06b4-4881-8e09-a2468e5a9b47\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jmk42" Apr 22 17:44:51.772952 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:51.772928 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-jmk42"
Apr 22 17:44:51.892827 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:51.892808 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jmk42"]
Apr 22 17:44:51.895486 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:44:51.895452 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87b9fd07_06b4_4881_8e09_a2468e5a9b47.slice/crio-f13bec48f3a86591b1b8c77a9be6ba27ed0f66005d948c3cb63e9075d59c1afc WatchSource:0}: Error finding container f13bec48f3a86591b1b8c77a9be6ba27ed0f66005d948c3cb63e9075d59c1afc: Status 404 returned error can't find the container with id f13bec48f3a86591b1b8c77a9be6ba27ed0f66005d948c3cb63e9075d59c1afc
Apr 22 17:44:51.930407 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:51.930382 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-jmk42" event={"ID":"87b9fd07-06b4-4881-8e09-a2468e5a9b47","Type":"ContainerStarted","Data":"f13bec48f3a86591b1b8c77a9be6ba27ed0f66005d948c3cb63e9075d59c1afc"}
Apr 22 17:44:52.453126 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:52.453092 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-9trm2"]
Apr 22 17:44:52.458027 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:52.458009 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-9trm2"
Apr 22 17:44:52.460603 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:52.460576 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-xp2rz\""
Apr 22 17:44:52.462971 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:52.462950 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-9trm2"]
Apr 22 17:44:52.554804 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:52.554771 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mmjh\" (UniqueName: \"kubernetes.io/projected/5f72de88-ecad-451d-aabc-6c7039376b1c-kube-api-access-6mmjh\") pod \"authorino-79cbc94b89-9trm2\" (UID: \"5f72de88-ecad-451d-aabc-6c7039376b1c\") " pod="kuadrant-system/authorino-79cbc94b89-9trm2"
Apr 22 17:44:52.656139 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:52.656111 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mmjh\" (UniqueName: \"kubernetes.io/projected/5f72de88-ecad-451d-aabc-6c7039376b1c-kube-api-access-6mmjh\") pod \"authorino-79cbc94b89-9trm2\" (UID: \"5f72de88-ecad-451d-aabc-6c7039376b1c\") " pod="kuadrant-system/authorino-79cbc94b89-9trm2"
Apr 22 17:44:52.666327 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:52.666294 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mmjh\" (UniqueName: \"kubernetes.io/projected/5f72de88-ecad-451d-aabc-6c7039376b1c-kube-api-access-6mmjh\") pod \"authorino-79cbc94b89-9trm2\" (UID: \"5f72de88-ecad-451d-aabc-6c7039376b1c\") " pod="kuadrant-system/authorino-79cbc94b89-9trm2"
Apr 22 17:44:52.767948 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:52.767922 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-9trm2"
Apr 22 17:44:52.893401 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:52.893377 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-9trm2"]
Apr 22 17:44:52.896275 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:44:52.896247 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f72de88_ecad_451d_aabc_6c7039376b1c.slice/crio-28332d4d5a58db9e2d7d62fa789606062235ac6597c39864598237efefc86be6 WatchSource:0}: Error finding container 28332d4d5a58db9e2d7d62fa789606062235ac6597c39864598237efefc86be6: Status 404 returned error can't find the container with id 28332d4d5a58db9e2d7d62fa789606062235ac6597c39864598237efefc86be6
Apr 22 17:44:52.936033 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:52.936007 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-9trm2" event={"ID":"5f72de88-ecad-451d-aabc-6c7039376b1c","Type":"ContainerStarted","Data":"28332d4d5a58db9e2d7d62fa789606062235ac6597c39864598237efefc86be6"}
Apr 22 17:44:53.942130 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:53.942048 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-jmk42" event={"ID":"87b9fd07-06b4-4881-8e09-a2468e5a9b47","Type":"ContainerStarted","Data":"ef52dde0abb1ad3bfdd3ec1c07e291a1aa8804864186b7f83c1d60fb7882190d"}
Apr 22 17:44:53.942646 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:53.942175 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-jmk42"
Apr 22 17:44:53.960051 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:53.959813 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-jmk42" podStartSLOduration=1.6543443450000002 podStartE2EDuration="2.959796323s" podCreationTimestamp="2026-04-22 17:44:51 +0000 UTC" firstStartedPulling="2026-04-22 17:44:51.897240451 +0000 UTC m=+640.513332704" lastFinishedPulling="2026-04-22 17:44:53.202692432 +0000 UTC m=+641.818784682" observedRunningTime="2026-04-22 17:44:53.958461973 +0000 UTC m=+642.574554246" watchObservedRunningTime="2026-04-22 17:44:53.959796323 +0000 UTC m=+642.575888596"
Apr 22 17:44:55.956774 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:44:55.956733 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-9trm2" event={"ID":"5f72de88-ecad-451d-aabc-6c7039376b1c","Type":"ContainerStarted","Data":"04d0aa1e494df263a90dacd766788a82a84d16320e411d0b50e7d5ad6be7b946"}
Apr 22 17:45:04.949257 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:04.949226 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-jmk42"
Apr 22 17:45:04.966923 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:04.966881 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-9trm2" podStartSLOduration=10.549256325 podStartE2EDuration="12.966867767s" podCreationTimestamp="2026-04-22 17:44:52 +0000 UTC" firstStartedPulling="2026-04-22 17:44:52.897922004 +0000 UTC m=+641.514014254" lastFinishedPulling="2026-04-22 17:44:55.315533442 +0000 UTC m=+643.931625696" observedRunningTime="2026-04-22 17:44:55.970341313 +0000 UTC m=+644.586433585" watchObservedRunningTime="2026-04-22 17:45:04.966867767 +0000 UTC m=+653.582960044"
Apr 22 17:45:09.374195 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:09.374162 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jmk42"]
Apr 22 17:45:09.374606 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:09.374357 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-jmk42" podUID="87b9fd07-06b4-4881-8e09-a2468e5a9b47" containerName="limitador" containerID="cri-o://ef52dde0abb1ad3bfdd3ec1c07e291a1aa8804864186b7f83c1d60fb7882190d" gracePeriod=30
Apr 22 17:45:09.923494 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:09.923469 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-jmk42"
Apr 22 17:45:09.998365 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:09.998328 2578 generic.go:358] "Generic (PLEG): container finished" podID="87b9fd07-06b4-4881-8e09-a2468e5a9b47" containerID="ef52dde0abb1ad3bfdd3ec1c07e291a1aa8804864186b7f83c1d60fb7882190d" exitCode=0
Apr 22 17:45:09.998592 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:09.998396 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-jmk42"
Apr 22 17:45:09.998592 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:09.998414 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-jmk42" event={"ID":"87b9fd07-06b4-4881-8e09-a2468e5a9b47","Type":"ContainerDied","Data":"ef52dde0abb1ad3bfdd3ec1c07e291a1aa8804864186b7f83c1d60fb7882190d"}
Apr 22 17:45:09.998592 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:09.998476 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-jmk42" event={"ID":"87b9fd07-06b4-4881-8e09-a2468e5a9b47","Type":"ContainerDied","Data":"f13bec48f3a86591b1b8c77a9be6ba27ed0f66005d948c3cb63e9075d59c1afc"}
Apr 22 17:45:09.998592 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:09.998492 2578 scope.go:117] "RemoveContainer" containerID="ef52dde0abb1ad3bfdd3ec1c07e291a1aa8804864186b7f83c1d60fb7882190d"
Apr 22 17:45:10.006468 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:10.006449 2578 scope.go:117] "RemoveContainer" containerID="ef52dde0abb1ad3bfdd3ec1c07e291a1aa8804864186b7f83c1d60fb7882190d"
Apr 22 17:45:10.006743 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:45:10.006720 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef52dde0abb1ad3bfdd3ec1c07e291a1aa8804864186b7f83c1d60fb7882190d\": container with ID starting with ef52dde0abb1ad3bfdd3ec1c07e291a1aa8804864186b7f83c1d60fb7882190d not found: ID does not exist" containerID="ef52dde0abb1ad3bfdd3ec1c07e291a1aa8804864186b7f83c1d60fb7882190d"
Apr 22 17:45:10.006805 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:10.006755 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef52dde0abb1ad3bfdd3ec1c07e291a1aa8804864186b7f83c1d60fb7882190d"} err="failed to get container status \"ef52dde0abb1ad3bfdd3ec1c07e291a1aa8804864186b7f83c1d60fb7882190d\": rpc error: code = NotFound desc = could not find container \"ef52dde0abb1ad3bfdd3ec1c07e291a1aa8804864186b7f83c1d60fb7882190d\": container with ID starting with ef52dde0abb1ad3bfdd3ec1c07e291a1aa8804864186b7f83c1d60fb7882190d not found: ID does not exist"
Apr 22 17:45:10.095604 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:10.095562 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvb7n\" (UniqueName: \"kubernetes.io/projected/87b9fd07-06b4-4881-8e09-a2468e5a9b47-kube-api-access-pvb7n\") pod \"87b9fd07-06b4-4881-8e09-a2468e5a9b47\" (UID: \"87b9fd07-06b4-4881-8e09-a2468e5a9b47\") "
Apr 22 17:45:10.095776 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:10.095615 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/87b9fd07-06b4-4881-8e09-a2468e5a9b47-config-file\") pod \"87b9fd07-06b4-4881-8e09-a2468e5a9b47\" (UID: \"87b9fd07-06b4-4881-8e09-a2468e5a9b47\") "
Apr 22 17:45:10.096001 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:10.095974 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87b9fd07-06b4-4881-8e09-a2468e5a9b47-config-file" (OuterVolumeSpecName: "config-file") pod "87b9fd07-06b4-4881-8e09-a2468e5a9b47" (UID: "87b9fd07-06b4-4881-8e09-a2468e5a9b47"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:45:10.097294 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:10.097014 2578 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/87b9fd07-06b4-4881-8e09-a2468e5a9b47-config-file\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:45:10.103002 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:10.102942 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87b9fd07-06b4-4881-8e09-a2468e5a9b47-kube-api-access-pvb7n" (OuterVolumeSpecName: "kube-api-access-pvb7n") pod "87b9fd07-06b4-4881-8e09-a2468e5a9b47" (UID: "87b9fd07-06b4-4881-8e09-a2468e5a9b47"). InnerVolumeSpecName "kube-api-access-pvb7n". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:45:10.198226 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:10.198188 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pvb7n\" (UniqueName: \"kubernetes.io/projected/87b9fd07-06b4-4881-8e09-a2468e5a9b47-kube-api-access-pvb7n\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:45:10.320570 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:10.320538 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jmk42"]
Apr 22 17:45:10.322481 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:10.322458 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jmk42"]
Apr 22 17:45:11.956380 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:11.956352 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87b9fd07-06b4-4881-8e09-a2468e5a9b47" path="/var/lib/kubelet/pods/87b9fd07-06b4-4881-8e09-a2468e5a9b47/volumes"
Apr 22 17:45:17.956673 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:17.956631 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-8sj8h"]
Apr 22 17:45:17.957127 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:17.956941 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87b9fd07-06b4-4881-8e09-a2468e5a9b47" containerName="limitador"
Apr 22 17:45:17.957127 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:17.956952 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="87b9fd07-06b4-4881-8e09-a2468e5a9b47" containerName="limitador"
Apr 22 17:45:17.957127 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:17.957017 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="87b9fd07-06b4-4881-8e09-a2468e5a9b47" containerName="limitador"
Apr 22 17:45:17.961218 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:17.961200 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-8sj8h"
Apr 22 17:45:17.963878 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:17.963852 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 22 17:45:17.964189 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:17.964161 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-8sj8h"]
Apr 22 17:45:18.055006 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:18.054980 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljlm2\" (UniqueName: \"kubernetes.io/projected/66200c04-ae3c-49cb-b6ed-9a4136c0bd46-kube-api-access-ljlm2\") pod \"authorino-68bd676465-8sj8h\" (UID: \"66200c04-ae3c-49cb-b6ed-9a4136c0bd46\") " pod="kuadrant-system/authorino-68bd676465-8sj8h"
Apr 22 17:45:18.055006 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:18.055009 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/66200c04-ae3c-49cb-b6ed-9a4136c0bd46-tls-cert\") pod \"authorino-68bd676465-8sj8h\" (UID: \"66200c04-ae3c-49cb-b6ed-9a4136c0bd46\") " pod="kuadrant-system/authorino-68bd676465-8sj8h"
Apr 22 17:45:18.155757 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:18.155730 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljlm2\" (UniqueName: \"kubernetes.io/projected/66200c04-ae3c-49cb-b6ed-9a4136c0bd46-kube-api-access-ljlm2\") pod \"authorino-68bd676465-8sj8h\" (UID: \"66200c04-ae3c-49cb-b6ed-9a4136c0bd46\") " pod="kuadrant-system/authorino-68bd676465-8sj8h"
Apr 22 17:45:18.155757 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:18.155760 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/66200c04-ae3c-49cb-b6ed-9a4136c0bd46-tls-cert\") pod \"authorino-68bd676465-8sj8h\" (UID: \"66200c04-ae3c-49cb-b6ed-9a4136c0bd46\") " pod="kuadrant-system/authorino-68bd676465-8sj8h"
Apr 22 17:45:18.158174 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:18.158145 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/66200c04-ae3c-49cb-b6ed-9a4136c0bd46-tls-cert\") pod \"authorino-68bd676465-8sj8h\" (UID: \"66200c04-ae3c-49cb-b6ed-9a4136c0bd46\") " pod="kuadrant-system/authorino-68bd676465-8sj8h"
Apr 22 17:45:18.163481 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:18.163458 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljlm2\" (UniqueName: \"kubernetes.io/projected/66200c04-ae3c-49cb-b6ed-9a4136c0bd46-kube-api-access-ljlm2\") pod \"authorino-68bd676465-8sj8h\" (UID: \"66200c04-ae3c-49cb-b6ed-9a4136c0bd46\") " pod="kuadrant-system/authorino-68bd676465-8sj8h"
Apr 22 17:45:18.271064 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:18.271042 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-8sj8h"
Apr 22 17:45:18.387757 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:18.387730 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-8sj8h"]
Apr 22 17:45:18.389856 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:45:18.389830 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66200c04_ae3c_49cb_b6ed_9a4136c0bd46.slice/crio-ebc056bc9697054eb6beb059fb23b6c19f3ea7c0dc49bc182c6baaa43c2ffbe4 WatchSource:0}: Error finding container ebc056bc9697054eb6beb059fb23b6c19f3ea7c0dc49bc182c6baaa43c2ffbe4: Status 404 returned error can't find the container with id ebc056bc9697054eb6beb059fb23b6c19f3ea7c0dc49bc182c6baaa43c2ffbe4
Apr 22 17:45:19.029526 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:19.029446 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-8sj8h" event={"ID":"66200c04-ae3c-49cb-b6ed-9a4136c0bd46","Type":"ContainerStarted","Data":"7091e6001d0426c30a79eac0d8cb3b0f153f32e255ce62f4b15b444bdda37735"}
Apr 22 17:45:19.029526 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:19.029481 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-8sj8h" event={"ID":"66200c04-ae3c-49cb-b6ed-9a4136c0bd46","Type":"ContainerStarted","Data":"ebc056bc9697054eb6beb059fb23b6c19f3ea7c0dc49bc182c6baaa43c2ffbe4"}
Apr 22 17:45:19.046164 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:19.046115 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-8sj8h" podStartSLOduration=1.6987083429999998 podStartE2EDuration="2.046102068s" podCreationTimestamp="2026-04-22 17:45:17 +0000 UTC" firstStartedPulling="2026-04-22 17:45:18.391147873 +0000 UTC m=+667.007240126" lastFinishedPulling="2026-04-22 17:45:18.738541599 +0000 UTC m=+667.354633851" observedRunningTime="2026-04-22 17:45:19.044796209 +0000 UTC m=+667.660888482" watchObservedRunningTime="2026-04-22 17:45:19.046102068 +0000 UTC m=+667.662194339"
Apr 22 17:45:19.068455 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:19.068397 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-9trm2"]
Apr 22 17:45:19.068674 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:19.068620 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-9trm2" podUID="5f72de88-ecad-451d-aabc-6c7039376b1c" containerName="authorino" containerID="cri-o://04d0aa1e494df263a90dacd766788a82a84d16320e411d0b50e7d5ad6be7b946" gracePeriod=30
Apr 22 17:45:19.305445 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:19.305405 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-9trm2"
Apr 22 17:45:19.364885 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:19.364854 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mmjh\" (UniqueName: \"kubernetes.io/projected/5f72de88-ecad-451d-aabc-6c7039376b1c-kube-api-access-6mmjh\") pod \"5f72de88-ecad-451d-aabc-6c7039376b1c\" (UID: \"5f72de88-ecad-451d-aabc-6c7039376b1c\") "
Apr 22 17:45:19.366758 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:19.366731 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f72de88-ecad-451d-aabc-6c7039376b1c-kube-api-access-6mmjh" (OuterVolumeSpecName: "kube-api-access-6mmjh") pod "5f72de88-ecad-451d-aabc-6c7039376b1c" (UID: "5f72de88-ecad-451d-aabc-6c7039376b1c"). InnerVolumeSpecName "kube-api-access-6mmjh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:45:19.466053 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:19.466028 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6mmjh\" (UniqueName: \"kubernetes.io/projected/5f72de88-ecad-451d-aabc-6c7039376b1c-kube-api-access-6mmjh\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:45:20.033847 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:20.033813 2578 generic.go:358] "Generic (PLEG): container finished" podID="5f72de88-ecad-451d-aabc-6c7039376b1c" containerID="04d0aa1e494df263a90dacd766788a82a84d16320e411d0b50e7d5ad6be7b946" exitCode=0
Apr 22 17:45:20.034252 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:20.033881 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-9trm2"
Apr 22 17:45:20.034252 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:20.033900 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-9trm2" event={"ID":"5f72de88-ecad-451d-aabc-6c7039376b1c","Type":"ContainerDied","Data":"04d0aa1e494df263a90dacd766788a82a84d16320e411d0b50e7d5ad6be7b946"}
Apr 22 17:45:20.034252 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:20.033938 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-9trm2" event={"ID":"5f72de88-ecad-451d-aabc-6c7039376b1c","Type":"ContainerDied","Data":"28332d4d5a58db9e2d7d62fa789606062235ac6597c39864598237efefc86be6"}
Apr 22 17:45:20.034252 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:20.033952 2578 scope.go:117] "RemoveContainer" containerID="04d0aa1e494df263a90dacd766788a82a84d16320e411d0b50e7d5ad6be7b946"
Apr 22 17:45:20.041942 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:20.041925 2578 scope.go:117] "RemoveContainer" containerID="04d0aa1e494df263a90dacd766788a82a84d16320e411d0b50e7d5ad6be7b946"
Apr 22 17:45:20.042163 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:45:20.042146 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04d0aa1e494df263a90dacd766788a82a84d16320e411d0b50e7d5ad6be7b946\": container with ID starting with 04d0aa1e494df263a90dacd766788a82a84d16320e411d0b50e7d5ad6be7b946 not found: ID does not exist" containerID="04d0aa1e494df263a90dacd766788a82a84d16320e411d0b50e7d5ad6be7b946"
Apr 22 17:45:20.042213 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:20.042168 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04d0aa1e494df263a90dacd766788a82a84d16320e411d0b50e7d5ad6be7b946"} err="failed to get container status \"04d0aa1e494df263a90dacd766788a82a84d16320e411d0b50e7d5ad6be7b946\": rpc error: code = NotFound desc = could not find container \"04d0aa1e494df263a90dacd766788a82a84d16320e411d0b50e7d5ad6be7b946\": container with ID starting with 04d0aa1e494df263a90dacd766788a82a84d16320e411d0b50e7d5ad6be7b946 not found: ID does not exist"
Apr 22 17:45:20.049697 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:20.049678 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-9trm2"]
Apr 22 17:45:20.054865 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:20.054846 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-9trm2"]
Apr 22 17:45:21.956445 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:21.956391 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f72de88-ecad-451d-aabc-6c7039376b1c" path="/var/lib/kubelet/pods/5f72de88-ecad-451d-aabc-6c7039376b1c/volumes"
Apr 22 17:45:36.525238 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.525204 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-bh764"]
Apr 22 17:45:36.525704 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.525538 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f72de88-ecad-451d-aabc-6c7039376b1c" containerName="authorino"
Apr 22 17:45:36.525704 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.525550 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f72de88-ecad-451d-aabc-6c7039376b1c" containerName="authorino"
Apr 22 17:45:36.525704 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.525613 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f72de88-ecad-451d-aabc-6c7039376b1c" containerName="authorino"
Apr 22 17:45:36.531878 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.531858 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84ffddfb66-bh764"
Apr 22 17:45:36.539590 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.539565 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 22 17:45:36.539688 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.539642 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 22 17:45:36.540815 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.540795 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-czlrh\""
Apr 22 17:45:36.540930 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.540795 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 22 17:45:36.544835 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.544813 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-bh764"]
Apr 22 17:45:36.551129 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.551106 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb"]
Apr 22 17:45:36.554162 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.554144 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb"
Apr 22 17:45:36.556754 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.556736 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 22 17:45:36.557154 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.557132 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-rwfsm\""
Apr 22 17:45:36.565371 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.565351 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb"]
Apr 22 17:45:36.601481 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.601394 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx2z7\" (UniqueName: \"kubernetes.io/projected/e80d7bec-3fdb-409b-8d0a-e773911de5eb-kube-api-access-rx2z7\") pod \"kserve-controller-manager-84ffddfb66-bh764\" (UID: \"e80d7bec-3fdb-409b-8d0a-e773911de5eb\") " pod="kserve/kserve-controller-manager-84ffddfb66-bh764"
Apr 22 17:45:36.601603 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.601520 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkt4l\" (UniqueName: \"kubernetes.io/projected/c7a40edd-751b-4af7-826c-4f2e8845c135-kube-api-access-kkt4l\") pod \"llmisvc-controller-manager-6954c7fbdf-gcrkb\" (UID: \"c7a40edd-751b-4af7-826c-4f2e8845c135\") " pod="kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb"
Apr 22 17:45:36.601603 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.601570 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7a40edd-751b-4af7-826c-4f2e8845c135-cert\") pod \"llmisvc-controller-manager-6954c7fbdf-gcrkb\" (UID: \"c7a40edd-751b-4af7-826c-4f2e8845c135\") " pod="kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb"
Apr 22 17:45:36.601603 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.601594 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e80d7bec-3fdb-409b-8d0a-e773911de5eb-cert\") pod \"kserve-controller-manager-84ffddfb66-bh764\" (UID: \"e80d7bec-3fdb-409b-8d0a-e773911de5eb\") " pod="kserve/kserve-controller-manager-84ffddfb66-bh764"
Apr 22 17:45:36.703022 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.702992 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rx2z7\" (UniqueName: \"kubernetes.io/projected/e80d7bec-3fdb-409b-8d0a-e773911de5eb-kube-api-access-rx2z7\") pod \"kserve-controller-manager-84ffddfb66-bh764\" (UID: \"e80d7bec-3fdb-409b-8d0a-e773911de5eb\") " pod="kserve/kserve-controller-manager-84ffddfb66-bh764"
Apr 22 17:45:36.703159 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.703072 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkt4l\" (UniqueName: \"kubernetes.io/projected/c7a40edd-751b-4af7-826c-4f2e8845c135-kube-api-access-kkt4l\") pod \"llmisvc-controller-manager-6954c7fbdf-gcrkb\" (UID: \"c7a40edd-751b-4af7-826c-4f2e8845c135\") " pod="kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb"
Apr 22 17:45:36.703159 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.703106 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7a40edd-751b-4af7-826c-4f2e8845c135-cert\") pod \"llmisvc-controller-manager-6954c7fbdf-gcrkb\" (UID: \"c7a40edd-751b-4af7-826c-4f2e8845c135\") " pod="kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb"
Apr 22 17:45:36.703159 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.703136 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e80d7bec-3fdb-409b-8d0a-e773911de5eb-cert\") pod \"kserve-controller-manager-84ffddfb66-bh764\" (UID: \"e80d7bec-3fdb-409b-8d0a-e773911de5eb\") " pod="kserve/kserve-controller-manager-84ffddfb66-bh764"
Apr 22 17:45:36.703336 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:45:36.703239 2578 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found
Apr 22 17:45:36.703336 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:45:36.703249 2578 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 22 17:45:36.703336 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:45:36.703310 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7a40edd-751b-4af7-826c-4f2e8845c135-cert podName:c7a40edd-751b-4af7-826c-4f2e8845c135 nodeName:}" failed. No retries permitted until 2026-04-22 17:45:37.203289048 +0000 UTC m=+685.819381297 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c7a40edd-751b-4af7-826c-4f2e8845c135-cert") pod "llmisvc-controller-manager-6954c7fbdf-gcrkb" (UID: "c7a40edd-751b-4af7-826c-4f2e8845c135") : secret "llmisvc-webhook-server-cert" not found
Apr 22 17:45:36.703336 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:45:36.703332 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e80d7bec-3fdb-409b-8d0a-e773911de5eb-cert podName:e80d7bec-3fdb-409b-8d0a-e773911de5eb nodeName:}" failed. No retries permitted until 2026-04-22 17:45:37.203321129 +0000 UTC m=+685.819413381 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e80d7bec-3fdb-409b-8d0a-e773911de5eb-cert") pod "kserve-controller-manager-84ffddfb66-bh764" (UID: "e80d7bec-3fdb-409b-8d0a-e773911de5eb") : secret "kserve-webhook-server-cert" not found
Apr 22 17:45:36.713868 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.713842 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkt4l\" (UniqueName: \"kubernetes.io/projected/c7a40edd-751b-4af7-826c-4f2e8845c135-kube-api-access-kkt4l\") pod \"llmisvc-controller-manager-6954c7fbdf-gcrkb\" (UID: \"c7a40edd-751b-4af7-826c-4f2e8845c135\") " pod="kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb"
Apr 22 17:45:36.714286 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:36.714266 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx2z7\" (UniqueName: \"kubernetes.io/projected/e80d7bec-3fdb-409b-8d0a-e773911de5eb-kube-api-access-rx2z7\") pod \"kserve-controller-manager-84ffddfb66-bh764\" (UID: \"e80d7bec-3fdb-409b-8d0a-e773911de5eb\") " pod="kserve/kserve-controller-manager-84ffddfb66-bh764"
Apr 22 17:45:37.206675 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:37.206651 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7a40edd-751b-4af7-826c-4f2e8845c135-cert\") pod \"llmisvc-controller-manager-6954c7fbdf-gcrkb\" (UID: \"c7a40edd-751b-4af7-826c-4f2e8845c135\") " pod="kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb"
Apr 22 17:45:37.206827 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:37.206683 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e80d7bec-3fdb-409b-8d0a-e773911de5eb-cert\") pod \"kserve-controller-manager-84ffddfb66-bh764\" (UID: \"e80d7bec-3fdb-409b-8d0a-e773911de5eb\") " pod="kserve/kserve-controller-manager-84ffddfb66-bh764"
Apr 22 17:45:37.208763 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:37.208736 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e80d7bec-3fdb-409b-8d0a-e773911de5eb-cert\") pod \"kserve-controller-manager-84ffddfb66-bh764\" (UID: \"e80d7bec-3fdb-409b-8d0a-e773911de5eb\") " pod="kserve/kserve-controller-manager-84ffddfb66-bh764"
Apr 22 17:45:37.208860 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:37.208832 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7a40edd-751b-4af7-826c-4f2e8845c135-cert\") pod \"llmisvc-controller-manager-6954c7fbdf-gcrkb\" (UID: \"c7a40edd-751b-4af7-826c-4f2e8845c135\") " pod="kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb"
Apr 22 17:45:37.441540 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:37.441513 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84ffddfb66-bh764"
Apr 22 17:45:37.466352 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:37.466283 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb"
Apr 22 17:45:37.573943 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:37.573908 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-bh764"]
Apr 22 17:45:37.579185 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:45:37.579147 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode80d7bec_3fdb_409b_8d0a_e773911de5eb.slice/crio-14e9bcae6764f766505073a7079d1b0508bf476ed6fc1fedaf1a26944148e10d WatchSource:0}: Error finding container 14e9bcae6764f766505073a7079d1b0508bf476ed6fc1fedaf1a26944148e10d: Status 404 returned error can't find the container with id 14e9bcae6764f766505073a7079d1b0508bf476ed6fc1fedaf1a26944148e10d
Apr 22 17:45:37.601155 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:37.601132 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb"]
Apr 22 17:45:37.603018 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:45:37.602991 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc7a40edd_751b_4af7_826c_4f2e8845c135.slice/crio-0c605892995bfeee8f96a5a0c3a4e22da400c9b4e4d937ed6f72257489460efc WatchSource:0}: Error finding container 0c605892995bfeee8f96a5a0c3a4e22da400c9b4e4d937ed6f72257489460efc: Status 404 returned error can't find the container with id 0c605892995bfeee8f96a5a0c3a4e22da400c9b4e4d937ed6f72257489460efc
Apr 22 17:45:38.091349 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:38.091319 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb" event={"ID":"c7a40edd-751b-4af7-826c-4f2e8845c135","Type":"ContainerStarted","Data":"0c605892995bfeee8f96a5a0c3a4e22da400c9b4e4d937ed6f72257489460efc"}
Apr 22 17:45:38.092282 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:38.092262 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84ffddfb66-bh764" event={"ID":"e80d7bec-3fdb-409b-8d0a-e773911de5eb","Type":"ContainerStarted","Data":"14e9bcae6764f766505073a7079d1b0508bf476ed6fc1fedaf1a26944148e10d"}
Apr 22 17:45:41.110315 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:41.110286 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84ffddfb66-bh764" event={"ID":"e80d7bec-3fdb-409b-8d0a-e773911de5eb","Type":"ContainerStarted","Data":"0a4122f4ed3ce32bdd4385603d1749d406802c7b37d10dbbe889791f58b3bf72"}
Apr 22 17:45:41.110668 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:41.110393 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84ffddfb66-bh764"
Apr 22 17:45:41.127909 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:41.127696 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-84ffddfb66-bh764" podStartSLOduration=1.733733588 podStartE2EDuration="5.127679647s" podCreationTimestamp="2026-04-22 17:45:36 +0000 UTC" firstStartedPulling="2026-04-22 17:45:37.580632408 +0000 UTC m=+686.196724672" lastFinishedPulling="2026-04-22 17:45:40.974578478 +0000 UTC m=+689.590670731" observedRunningTime="2026-04-22 17:45:41.127607286 +0000 UTC m=+689.743699556" watchObservedRunningTime="2026-04-22 17:45:41.127679647 +0000 UTC m=+689.743771919"
Apr 22 17:45:42.114492 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:42.114456 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb" event={"ID":"c7a40edd-751b-4af7-826c-4f2e8845c135","Type":"ContainerStarted","Data":"a2a246a07bedd8a0bd6b427989fdf24d09b0751f1bbc3be015a9f5dca404dbd4"}
Apr 22 17:45:42.114858 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:42.114544 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready"
pod="kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb" Apr 22 17:45:42.133447 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:45:42.133374 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb" podStartSLOduration=2.709069927 podStartE2EDuration="6.133360337s" podCreationTimestamp="2026-04-22 17:45:36 +0000 UTC" firstStartedPulling="2026-04-22 17:45:37.604320625 +0000 UTC m=+686.220412874" lastFinishedPulling="2026-04-22 17:45:41.028611021 +0000 UTC m=+689.644703284" observedRunningTime="2026-04-22 17:45:42.132111811 +0000 UTC m=+690.748204098" watchObservedRunningTime="2026-04-22 17:45:42.133360337 +0000 UTC m=+690.749452611" Apr 22 17:46:12.119836 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:12.119805 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84ffddfb66-bh764" Apr 22 17:46:13.120971 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:13.120939 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb" Apr 22 17:46:14.288120 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:14.288080 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-bh764"] Apr 22 17:46:14.288574 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:14.288286 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-84ffddfb66-bh764" podUID="e80d7bec-3fdb-409b-8d0a-e773911de5eb" containerName="manager" containerID="cri-o://0a4122f4ed3ce32bdd4385603d1749d406802c7b37d10dbbe889791f58b3bf72" gracePeriod=10 Apr 22 17:46:14.307555 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:14.307524 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-gj9hs"] Apr 22 17:46:14.311715 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:14.311694 2578 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84ffddfb66-gj9hs" Apr 22 17:46:14.320644 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:14.320608 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-gj9hs"] Apr 22 17:46:14.416822 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:14.416791 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/990984ef-49ee-48d9-b76d-90212467c4aa-cert\") pod \"kserve-controller-manager-84ffddfb66-gj9hs\" (UID: \"990984ef-49ee-48d9-b76d-90212467c4aa\") " pod="kserve/kserve-controller-manager-84ffddfb66-gj9hs" Apr 22 17:46:14.416957 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:14.416879 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mzmv\" (UniqueName: \"kubernetes.io/projected/990984ef-49ee-48d9-b76d-90212467c4aa-kube-api-access-6mzmv\") pod \"kserve-controller-manager-84ffddfb66-gj9hs\" (UID: \"990984ef-49ee-48d9-b76d-90212467c4aa\") " pod="kserve/kserve-controller-manager-84ffddfb66-gj9hs" Apr 22 17:46:14.518121 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:14.518093 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/990984ef-49ee-48d9-b76d-90212467c4aa-cert\") pod \"kserve-controller-manager-84ffddfb66-gj9hs\" (UID: \"990984ef-49ee-48d9-b76d-90212467c4aa\") " pod="kserve/kserve-controller-manager-84ffddfb66-gj9hs" Apr 22 17:46:14.518234 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:14.518184 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mzmv\" (UniqueName: \"kubernetes.io/projected/990984ef-49ee-48d9-b76d-90212467c4aa-kube-api-access-6mzmv\") pod \"kserve-controller-manager-84ffddfb66-gj9hs\" (UID: \"990984ef-49ee-48d9-b76d-90212467c4aa\") " 
pod="kserve/kserve-controller-manager-84ffddfb66-gj9hs" Apr 22 17:46:14.520718 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:14.520688 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/990984ef-49ee-48d9-b76d-90212467c4aa-cert\") pod \"kserve-controller-manager-84ffddfb66-gj9hs\" (UID: \"990984ef-49ee-48d9-b76d-90212467c4aa\") " pod="kserve/kserve-controller-manager-84ffddfb66-gj9hs" Apr 22 17:46:14.527172 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:14.527145 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mzmv\" (UniqueName: \"kubernetes.io/projected/990984ef-49ee-48d9-b76d-90212467c4aa-kube-api-access-6mzmv\") pod \"kserve-controller-manager-84ffddfb66-gj9hs\" (UID: \"990984ef-49ee-48d9-b76d-90212467c4aa\") " pod="kserve/kserve-controller-manager-84ffddfb66-gj9hs" Apr 22 17:46:14.542929 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:14.542860 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84ffddfb66-bh764" Apr 22 17:46:14.664861 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:14.664831 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84ffddfb66-gj9hs" Apr 22 17:46:14.719786 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:14.719756 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e80d7bec-3fdb-409b-8d0a-e773911de5eb-cert\") pod \"e80d7bec-3fdb-409b-8d0a-e773911de5eb\" (UID: \"e80d7bec-3fdb-409b-8d0a-e773911de5eb\") " Apr 22 17:46:14.719948 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:14.719859 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx2z7\" (UniqueName: \"kubernetes.io/projected/e80d7bec-3fdb-409b-8d0a-e773911de5eb-kube-api-access-rx2z7\") pod \"e80d7bec-3fdb-409b-8d0a-e773911de5eb\" (UID: \"e80d7bec-3fdb-409b-8d0a-e773911de5eb\") " Apr 22 17:46:14.722134 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:14.722101 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e80d7bec-3fdb-409b-8d0a-e773911de5eb-cert" (OuterVolumeSpecName: "cert") pod "e80d7bec-3fdb-409b-8d0a-e773911de5eb" (UID: "e80d7bec-3fdb-409b-8d0a-e773911de5eb"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:46:14.722762 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:14.722727 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e80d7bec-3fdb-409b-8d0a-e773911de5eb-kube-api-access-rx2z7" (OuterVolumeSpecName: "kube-api-access-rx2z7") pod "e80d7bec-3fdb-409b-8d0a-e773911de5eb" (UID: "e80d7bec-3fdb-409b-8d0a-e773911de5eb"). InnerVolumeSpecName "kube-api-access-rx2z7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:46:14.783534 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:14.783509 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-gj9hs"] Apr 22 17:46:14.787099 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:46:14.787064 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod990984ef_49ee_48d9_b76d_90212467c4aa.slice/crio-fdb5ce9e0c830929225ac6e69ae39b76055c2e0c89b2c25e9075965153f9d9b6 WatchSource:0}: Error finding container fdb5ce9e0c830929225ac6e69ae39b76055c2e0c89b2c25e9075965153f9d9b6: Status 404 returned error can't find the container with id fdb5ce9e0c830929225ac6e69ae39b76055c2e0c89b2c25e9075965153f9d9b6 Apr 22 17:46:14.820761 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:14.820696 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rx2z7\" (UniqueName: \"kubernetes.io/projected/e80d7bec-3fdb-409b-8d0a-e773911de5eb-kube-api-access-rx2z7\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:46:14.820761 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:14.820732 2578 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e80d7bec-3fdb-409b-8d0a-e773911de5eb-cert\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:46:15.220133 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:15.220095 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84ffddfb66-gj9hs" event={"ID":"990984ef-49ee-48d9-b76d-90212467c4aa","Type":"ContainerStarted","Data":"9eaa2796a9e1cf37e1dbac46ed31afe0414f012a0b840caed8865f77c258d34d"} Apr 22 17:46:15.220133 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:15.220133 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84ffddfb66-gj9hs" 
event={"ID":"990984ef-49ee-48d9-b76d-90212467c4aa","Type":"ContainerStarted","Data":"fdb5ce9e0c830929225ac6e69ae39b76055c2e0c89b2c25e9075965153f9d9b6"} Apr 22 17:46:15.220413 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:15.220224 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84ffddfb66-gj9hs" Apr 22 17:46:15.221178 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:15.221156 2578 generic.go:358] "Generic (PLEG): container finished" podID="e80d7bec-3fdb-409b-8d0a-e773911de5eb" containerID="0a4122f4ed3ce32bdd4385603d1749d406802c7b37d10dbbe889791f58b3bf72" exitCode=0 Apr 22 17:46:15.221267 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:15.221183 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84ffddfb66-bh764" event={"ID":"e80d7bec-3fdb-409b-8d0a-e773911de5eb","Type":"ContainerDied","Data":"0a4122f4ed3ce32bdd4385603d1749d406802c7b37d10dbbe889791f58b3bf72"} Apr 22 17:46:15.221267 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:15.221217 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84ffddfb66-bh764" event={"ID":"e80d7bec-3fdb-409b-8d0a-e773911de5eb","Type":"ContainerDied","Data":"14e9bcae6764f766505073a7079d1b0508bf476ed6fc1fedaf1a26944148e10d"} Apr 22 17:46:15.221267 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:15.221219 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84ffddfb66-bh764" Apr 22 17:46:15.221267 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:15.221231 2578 scope.go:117] "RemoveContainer" containerID="0a4122f4ed3ce32bdd4385603d1749d406802c7b37d10dbbe889791f58b3bf72" Apr 22 17:46:15.230257 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:15.230237 2578 scope.go:117] "RemoveContainer" containerID="0a4122f4ed3ce32bdd4385603d1749d406802c7b37d10dbbe889791f58b3bf72" Apr 22 17:46:15.230564 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:46:15.230540 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a4122f4ed3ce32bdd4385603d1749d406802c7b37d10dbbe889791f58b3bf72\": container with ID starting with 0a4122f4ed3ce32bdd4385603d1749d406802c7b37d10dbbe889791f58b3bf72 not found: ID does not exist" containerID="0a4122f4ed3ce32bdd4385603d1749d406802c7b37d10dbbe889791f58b3bf72" Apr 22 17:46:15.230650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:15.230570 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a4122f4ed3ce32bdd4385603d1749d406802c7b37d10dbbe889791f58b3bf72"} err="failed to get container status \"0a4122f4ed3ce32bdd4385603d1749d406802c7b37d10dbbe889791f58b3bf72\": rpc error: code = NotFound desc = could not find container \"0a4122f4ed3ce32bdd4385603d1749d406802c7b37d10dbbe889791f58b3bf72\": container with ID starting with 0a4122f4ed3ce32bdd4385603d1749d406802c7b37d10dbbe889791f58b3bf72 not found: ID does not exist" Apr 22 17:46:15.236179 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:15.236141 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-84ffddfb66-gj9hs" podStartSLOduration=0.895620948 podStartE2EDuration="1.236129514s" podCreationTimestamp="2026-04-22 17:46:14 +0000 UTC" firstStartedPulling="2026-04-22 17:46:14.788507164 +0000 UTC m=+723.404599418" 
lastFinishedPulling="2026-04-22 17:46:15.129015729 +0000 UTC m=+723.745107984" observedRunningTime="2026-04-22 17:46:15.235049137 +0000 UTC m=+723.851141409" watchObservedRunningTime="2026-04-22 17:46:15.236129514 +0000 UTC m=+723.852221786" Apr 22 17:46:15.249367 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:15.249344 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-bh764"] Apr 22 17:46:15.250929 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:15.250910 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-bh764"] Apr 22 17:46:15.955629 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:15.955589 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e80d7bec-3fdb-409b-8d0a-e773911de5eb" path="/var/lib/kubelet/pods/e80d7bec-3fdb-409b-8d0a-e773911de5eb/volumes" Apr 22 17:46:46.230768 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:46.230738 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84ffddfb66-gj9hs" Apr 22 17:46:47.111276 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:47.111246 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-2wgbj"] Apr 22 17:46:47.111626 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:47.111610 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e80d7bec-3fdb-409b-8d0a-e773911de5eb" containerName="manager" Apr 22 17:46:47.111703 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:47.111629 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e80d7bec-3fdb-409b-8d0a-e773911de5eb" containerName="manager" Apr 22 17:46:47.111747 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:47.111708 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e80d7bec-3fdb-409b-8d0a-e773911de5eb" containerName="manager" Apr 22 17:46:47.115204 ip-10-0-143-54 kubenswrapper[2578]: I0422 
17:46:47.115183 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-2wgbj" Apr 22 17:46:47.118010 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:47.117989 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 22 17:46:47.118244 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:47.118227 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-dhbbt\"" Apr 22 17:46:47.126282 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:47.126259 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-2wgbj"] Apr 22 17:46:47.200486 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:47.200454 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhxcn\" (UniqueName: \"kubernetes.io/projected/a8db03a9-d980-46d9-a984-6c51d78956f2-kube-api-access-xhxcn\") pod \"model-serving-api-86f7b4b499-2wgbj\" (UID: \"a8db03a9-d980-46d9-a984-6c51d78956f2\") " pod="kserve/model-serving-api-86f7b4b499-2wgbj" Apr 22 17:46:47.200679 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:47.200556 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8db03a9-d980-46d9-a984-6c51d78956f2-tls-certs\") pod \"model-serving-api-86f7b4b499-2wgbj\" (UID: \"a8db03a9-d980-46d9-a984-6c51d78956f2\") " pod="kserve/model-serving-api-86f7b4b499-2wgbj" Apr 22 17:46:47.301613 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:47.301582 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhxcn\" (UniqueName: \"kubernetes.io/projected/a8db03a9-d980-46d9-a984-6c51d78956f2-kube-api-access-xhxcn\") pod \"model-serving-api-86f7b4b499-2wgbj\" (UID: \"a8db03a9-d980-46d9-a984-6c51d78956f2\") " 
pod="kserve/model-serving-api-86f7b4b499-2wgbj" Apr 22 17:46:47.302044 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:47.301641 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8db03a9-d980-46d9-a984-6c51d78956f2-tls-certs\") pod \"model-serving-api-86f7b4b499-2wgbj\" (UID: \"a8db03a9-d980-46d9-a984-6c51d78956f2\") " pod="kserve/model-serving-api-86f7b4b499-2wgbj" Apr 22 17:46:47.304008 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:47.303975 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8db03a9-d980-46d9-a984-6c51d78956f2-tls-certs\") pod \"model-serving-api-86f7b4b499-2wgbj\" (UID: \"a8db03a9-d980-46d9-a984-6c51d78956f2\") " pod="kserve/model-serving-api-86f7b4b499-2wgbj" Apr 22 17:46:47.312864 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:47.312837 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhxcn\" (UniqueName: \"kubernetes.io/projected/a8db03a9-d980-46d9-a984-6c51d78956f2-kube-api-access-xhxcn\") pod \"model-serving-api-86f7b4b499-2wgbj\" (UID: \"a8db03a9-d980-46d9-a984-6c51d78956f2\") " pod="kserve/model-serving-api-86f7b4b499-2wgbj" Apr 22 17:46:47.429689 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:47.429576 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-2wgbj" Apr 22 17:46:47.790338 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:47.790304 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-2wgbj"] Apr 22 17:46:47.792989 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:46:47.792962 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8db03a9_d980_46d9_a984_6c51d78956f2.slice/crio-fd5fe9b068e15b14493fb8c6211947d6e80a6be17baae30b5e92e3b84684ca39 WatchSource:0}: Error finding container fd5fe9b068e15b14493fb8c6211947d6e80a6be17baae30b5e92e3b84684ca39: Status 404 returned error can't find the container with id fd5fe9b068e15b14493fb8c6211947d6e80a6be17baae30b5e92e3b84684ca39 Apr 22 17:46:48.327873 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:48.327832 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-2wgbj" event={"ID":"a8db03a9-d980-46d9-a984-6c51d78956f2","Type":"ContainerStarted","Data":"fd5fe9b068e15b14493fb8c6211947d6e80a6be17baae30b5e92e3b84684ca39"} Apr 22 17:46:50.337621 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:50.337537 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-2wgbj" event={"ID":"a8db03a9-d980-46d9-a984-6c51d78956f2","Type":"ContainerStarted","Data":"949a939a787ae917e867b10df20760610be094f99116423c01e9a7a5d56dfdb5"} Apr 22 17:46:50.337996 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:50.337776 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-2wgbj" Apr 22 17:46:50.354891 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:46:50.354847 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-2wgbj" podStartSLOduration=1.0972054 podStartE2EDuration="3.354834554s" podCreationTimestamp="2026-04-22 
17:46:47 +0000 UTC" firstStartedPulling="2026-04-22 17:46:47.794701602 +0000 UTC m=+756.410793852" lastFinishedPulling="2026-04-22 17:46:50.052330752 +0000 UTC m=+758.668423006" observedRunningTime="2026-04-22 17:46:50.353106329 +0000 UTC m=+758.969198600" watchObservedRunningTime="2026-04-22 17:46:50.354834554 +0000 UTC m=+758.970926825" Apr 22 17:47:01.346802 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:47:01.346773 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-2wgbj" Apr 22 17:48:00.373550 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.373509 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"] Apr 22 17:48:00.377249 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.377227 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5" Apr 22 17:48:00.379879 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.379850 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 17:48:00.380010 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.379989 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 17:48:00.380989 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.380971 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ts22s\"" Apr 22 17:48:00.381113 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.381019 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\"" Apr 22 17:48:00.387360 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.387340 2578 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"] Apr 22 17:48:00.514220 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.514185 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5" Apr 22 17:48:00.514384 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.514244 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5" Apr 22 17:48:00.514384 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.514316 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5" Apr 22 17:48:00.514384 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.514348 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5\" (UID: 
\"0c7f73d3-d01e-4526-90db-c519745c3be9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5" Apr 22 17:48:00.514384 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.514367 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7f73d3-d01e-4526-90db-c519745c3be9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5" Apr 22 17:48:00.514611 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.514486 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s5l9\" (UniqueName: \"kubernetes.io/projected/0c7f73d3-d01e-4526-90db-c519745c3be9-kube-api-access-4s5l9\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5" Apr 22 17:48:00.615894 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.615864 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5" Apr 22 17:48:00.616066 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.615913 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5\" (UID: 
\"0c7f73d3-d01e-4526-90db-c519745c3be9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"
Apr 22 17:48:00.616066 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.616027 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"
Apr 22 17:48:00.616214 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.616065 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"
Apr 22 17:48:00.616214 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.616192 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7f73d3-d01e-4526-90db-c519745c3be9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"
Apr 22 17:48:00.616314 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.616228 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"
Apr 22 17:48:00.616314 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.616257 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4s5l9\" (UniqueName: \"kubernetes.io/projected/0c7f73d3-d01e-4526-90db-c519745c3be9-kube-api-access-4s5l9\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"
Apr 22 17:48:00.616314 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.616281 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"
Apr 22 17:48:00.616492 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.616329 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"
Apr 22 17:48:00.618226 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.618204 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"
Apr 22 17:48:00.618657 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.618637 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7f73d3-d01e-4526-90db-c519745c3be9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"
Apr 22 17:48:00.623853 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.623799 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s5l9\" (UniqueName: \"kubernetes.io/projected/0c7f73d3-d01e-4526-90db-c519745c3be9-kube-api-access-4s5l9\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"
Apr 22 17:48:00.688599 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.688560 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"
Apr 22 17:48:00.826728 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:00.826691 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"]
Apr 22 17:48:00.830596 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:48:00.830564 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c7f73d3_d01e_4526_90db_c519745c3be9.slice/crio-2101723144b779d5b853b7dc25413d85d6d76a10ad8432f1ee5093cc48901b7f WatchSource:0}: Error finding container 2101723144b779d5b853b7dc25413d85d6d76a10ad8432f1ee5093cc48901b7f: Status 404 returned error can't find the container with id 2101723144b779d5b853b7dc25413d85d6d76a10ad8432f1ee5093cc48901b7f
Apr 22 17:48:01.581470 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:01.581435 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5" event={"ID":"0c7f73d3-d01e-4526-90db-c519745c3be9","Type":"ContainerStarted","Data":"2101723144b779d5b853b7dc25413d85d6d76a10ad8432f1ee5093cc48901b7f"}
Apr 22 17:48:04.593557 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:04.593478 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5" event={"ID":"0c7f73d3-d01e-4526-90db-c519745c3be9","Type":"ContainerStarted","Data":"8b54fc475db90d4fb9a369b3f43ecbda8ac7ae9b41afab09cb1febb73eb38690"}
Apr 22 17:48:08.610332 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:08.610299 2578 generic.go:358] "Generic (PLEG): container finished" podID="0c7f73d3-d01e-4526-90db-c519745c3be9" containerID="8b54fc475db90d4fb9a369b3f43ecbda8ac7ae9b41afab09cb1febb73eb38690" exitCode=0
Apr 22 17:48:08.610706 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:08.610358 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5" event={"ID":"0c7f73d3-d01e-4526-90db-c519745c3be9","Type":"ContainerDied","Data":"8b54fc475db90d4fb9a369b3f43ecbda8ac7ae9b41afab09cb1febb73eb38690"}
Apr 22 17:48:10.619298 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:10.619265 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5" event={"ID":"0c7f73d3-d01e-4526-90db-c519745c3be9","Type":"ContainerStarted","Data":"81d8b46381aebe4e8f212f1e845a3a5a5feb54074bd289dd3e54d487edcc6557"}
Apr 22 17:48:10.645281 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:10.645229 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5" podStartSLOduration=1.81517016 podStartE2EDuration="10.645213248s" podCreationTimestamp="2026-04-22 17:48:00 +0000 UTC" firstStartedPulling="2026-04-22 17:48:00.832784034 +0000 UTC m=+829.448876284" lastFinishedPulling="2026-04-22 17:48:09.662827122 +0000 UTC m=+838.278919372" observedRunningTime="2026-04-22 17:48:10.643306582 +0000 UTC m=+839.259398855" watchObservedRunningTime="2026-04-22 17:48:10.645213248 +0000 UTC m=+839.261305520"
Apr 22 17:48:10.689551 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:10.689512 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"
Apr 22 17:48:10.689551 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:10.689552 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"
Apr 22 17:48:10.702134 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:10.702104 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"
Apr 22 17:48:11.633635 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:11.633606 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"
Apr 22 17:48:36.018615 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.018544 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"]
Apr 22 17:48:36.030593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.030551 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:48:36.035167 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.035143 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 22 17:48:36.036062 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.036041 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-dlhwp\""
Apr 22 17:48:36.040596 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.040576 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"]
Apr 22 17:48:36.125281 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.125255 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:48:36.125464 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.125315 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01d7abfd-d793-40f0-92b2-00057fa8d815-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:48:36.125464 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.125387 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:48:36.125584 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.125482 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:48:36.125584 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.125509 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl58s\" (UniqueName: \"kubernetes.io/projected/01d7abfd-d793-40f0-92b2-00057fa8d815-kube-api-access-vl58s\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:48:36.125584 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.125535 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:48:36.226342 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.226311 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:48:36.226342 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.226349 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vl58s\" (UniqueName: \"kubernetes.io/projected/01d7abfd-d793-40f0-92b2-00057fa8d815-kube-api-access-vl58s\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:48:36.226583 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.226375 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:48:36.226583 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.226444 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:48:36.226583 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.226512 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01d7abfd-d793-40f0-92b2-00057fa8d815-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:48:36.226754 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.226583 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:48:36.226816 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.226771 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:48:36.226869 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.226837 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:48:36.227019 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.226988 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:48:36.227019 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.227005 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:48:36.229356 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.229330 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01d7abfd-d793-40f0-92b2-00057fa8d815-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:48:36.234880 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.234858 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl58s\" (UniqueName: \"kubernetes.io/projected/01d7abfd-d793-40f0-92b2-00057fa8d815-kube-api-access-vl58s\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:48:36.342195 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.342117 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:48:36.671350 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.671139 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"]
Apr 22 17:48:36.674194 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:48:36.674164 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01d7abfd_d793_40f0_92b2_00057fa8d815.slice/crio-b7bf5128b0ccf0e9bb2fd0b443c026681da3dbfd0e8ec82f1c751a4998beb9fd WatchSource:0}: Error finding container b7bf5128b0ccf0e9bb2fd0b443c026681da3dbfd0e8ec82f1c751a4998beb9fd: Status 404 returned error can't find the container with id b7bf5128b0ccf0e9bb2fd0b443c026681da3dbfd0e8ec82f1c751a4998beb9fd
Apr 22 17:48:36.676215 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.676197 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 17:48:36.710487 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:36.710457 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp" event={"ID":"01d7abfd-d793-40f0-92b2-00057fa8d815","Type":"ContainerStarted","Data":"b7bf5128b0ccf0e9bb2fd0b443c026681da3dbfd0e8ec82f1c751a4998beb9fd"}
Apr 22 17:48:37.715120 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:37.715029 2578 generic.go:358] "Generic (PLEG): container finished" podID="01d7abfd-d793-40f0-92b2-00057fa8d815" containerID="dd23fb2af19be30ef32ccccc4c6b389ea97dfe9616d56c56090f3a059dad009e" exitCode=0
Apr 22 17:48:37.715120 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:37.715092 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp" event={"ID":"01d7abfd-d793-40f0-92b2-00057fa8d815","Type":"ContainerDied","Data":"dd23fb2af19be30ef32ccccc4c6b389ea97dfe9616d56c56090f3a059dad009e"}
Apr 22 17:48:38.720624 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:48:38.720592 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp" event={"ID":"01d7abfd-d793-40f0-92b2-00057fa8d815","Type":"ContainerStarted","Data":"71229a37af3c43b46c0a39f41a0359ae161a3f0ecc6bce17baf561946dd61d56"}
Apr 22 17:49:08.846395 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:08.846359 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp" event={"ID":"01d7abfd-d793-40f0-92b2-00057fa8d815","Type":"ContainerStarted","Data":"c408c303dd1b4c9328eea64680bb69e3488643d675d38828033eb68eb3ed5b23"}
Apr 22 17:49:08.846868 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:08.846658 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:49:08.848854 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:08.848827 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp" podUID="01d7abfd-d793-40f0-92b2-00057fa8d815" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 22 17:49:08.897475 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:08.897433 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp" podStartSLOduration=3.834285461 podStartE2EDuration="33.897408871s" podCreationTimestamp="2026-04-22 17:48:35 +0000 UTC" firstStartedPulling="2026-04-22 17:48:37.716257728 +0000 UTC m=+866.332349977" lastFinishedPulling="2026-04-22 17:49:07.779381136 +0000 UTC m=+896.395473387" observedRunningTime="2026-04-22 17:49:08.896234055 +0000 UTC m=+897.512326328" watchObservedRunningTime="2026-04-22 17:49:08.897408871 +0000 UTC m=+897.513501143"
Apr 22 17:49:09.851953 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:09.851914 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp" podUID="01d7abfd-d793-40f0-92b2-00057fa8d815" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 22 17:49:12.581623 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:12.581595 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovn-acl-logging/0.log"
Apr 22 17:49:12.582065 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:12.581934 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovn-acl-logging/0.log"
Apr 22 17:49:16.342791 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:16.342756 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:49:16.342791 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:16.342787 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:49:16.344358 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:16.344335 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:49:16.344483 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:16.344398 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp" podUID="01d7abfd-d793-40f0-92b2-00057fa8d815" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 22 17:49:16.877728 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:16.877691 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp" podUID="01d7abfd-d793-40f0-92b2-00057fa8d815" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 22 17:49:16.877888 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:16.877814 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:49:17.882335 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:17.882303 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp" podUID="01d7abfd-d793-40f0-92b2-00057fa8d815" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 22 17:49:23.811579 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:23.811541 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"]
Apr 22 17:49:23.812129 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:23.811949 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp" podUID="01d7abfd-d793-40f0-92b2-00057fa8d815" containerName="main" containerID="cri-o://71229a37af3c43b46c0a39f41a0359ae161a3f0ecc6bce17baf561946dd61d56" gracePeriod=30
Apr 22 17:49:23.812129 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:23.812060 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp" podUID="01d7abfd-d793-40f0-92b2-00057fa8d815" containerName="tokenizer" containerID="cri-o://c408c303dd1b4c9328eea64680bb69e3488643d675d38828033eb68eb3ed5b23" gracePeriod=30
Apr 22 17:49:23.813398 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:23.813370 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp" podUID="01d7abfd-d793-40f0-92b2-00057fa8d815" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 22 17:49:24.907572 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:24.907507 2578 generic.go:358] "Generic (PLEG): container finished" podID="01d7abfd-d793-40f0-92b2-00057fa8d815" containerID="c408c303dd1b4c9328eea64680bb69e3488643d675d38828033eb68eb3ed5b23" exitCode=0
Apr 22 17:49:24.907572 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:24.907531 2578 generic.go:358] "Generic (PLEG): container finished" podID="01d7abfd-d793-40f0-92b2-00057fa8d815" containerID="71229a37af3c43b46c0a39f41a0359ae161a3f0ecc6bce17baf561946dd61d56" exitCode=0
Apr 22 17:49:24.907572 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:24.907554 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp" event={"ID":"01d7abfd-d793-40f0-92b2-00057fa8d815","Type":"ContainerDied","Data":"c408c303dd1b4c9328eea64680bb69e3488643d675d38828033eb68eb3ed5b23"}
Apr 22 17:49:24.907955 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:24.907592 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp" event={"ID":"01d7abfd-d793-40f0-92b2-00057fa8d815","Type":"ContainerDied","Data":"71229a37af3c43b46c0a39f41a0359ae161a3f0ecc6bce17baf561946dd61d56"}
Apr 22 17:49:24.957264 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:24.957240 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:49:24.983241 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:24.983220 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-tokenizer-cache\") pod \"01d7abfd-d793-40f0-92b2-00057fa8d815\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") "
Apr 22 17:49:24.983331 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:24.983257 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01d7abfd-d793-40f0-92b2-00057fa8d815-tls-certs\") pod \"01d7abfd-d793-40f0-92b2-00057fa8d815\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") "
Apr 22 17:49:24.983331 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:24.983313 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl58s\" (UniqueName: \"kubernetes.io/projected/01d7abfd-d793-40f0-92b2-00057fa8d815-kube-api-access-vl58s\") pod \"01d7abfd-d793-40f0-92b2-00057fa8d815\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") "
Apr 22 17:49:24.983447 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:24.983339 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-tokenizer-uds\") pod \"01d7abfd-d793-40f0-92b2-00057fa8d815\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") "
Apr 22 17:49:24.983447 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:24.983380 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-kserve-provision-location\") pod \"01d7abfd-d793-40f0-92b2-00057fa8d815\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") "
Apr 22 17:49:24.983566 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:24.983445 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-tokenizer-tmp\") pod \"01d7abfd-d793-40f0-92b2-00057fa8d815\" (UID: \"01d7abfd-d793-40f0-92b2-00057fa8d815\") "
Apr 22 17:49:24.983566 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:24.983493 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "01d7abfd-d793-40f0-92b2-00057fa8d815" (UID: "01d7abfd-d793-40f0-92b2-00057fa8d815"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:49:24.983671 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:24.983616 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "01d7abfd-d793-40f0-92b2-00057fa8d815" (UID: "01d7abfd-d793-40f0-92b2-00057fa8d815"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:49:24.983830 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:24.983770 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-tokenizer-cache\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:49:24.983830 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:24.983796 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-tokenizer-uds\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:49:24.984054 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:24.984026 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "01d7abfd-d793-40f0-92b2-00057fa8d815" (UID: "01d7abfd-d793-40f0-92b2-00057fa8d815"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:49:24.984481 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:24.984444 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "01d7abfd-d793-40f0-92b2-00057fa8d815" (UID: "01d7abfd-d793-40f0-92b2-00057fa8d815"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:49:24.986358 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:24.986325 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d7abfd-d793-40f0-92b2-00057fa8d815-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "01d7abfd-d793-40f0-92b2-00057fa8d815" (UID: "01d7abfd-d793-40f0-92b2-00057fa8d815"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:49:24.986548 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:24.986372 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01d7abfd-d793-40f0-92b2-00057fa8d815-kube-api-access-vl58s" (OuterVolumeSpecName: "kube-api-access-vl58s") pod "01d7abfd-d793-40f0-92b2-00057fa8d815" (UID: "01d7abfd-d793-40f0-92b2-00057fa8d815"). InnerVolumeSpecName "kube-api-access-vl58s". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:49:25.084949 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:25.084873 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-kserve-provision-location\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:49:25.084949 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:25.084904 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/01d7abfd-d793-40f0-92b2-00057fa8d815-tokenizer-tmp\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:49:25.084949 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:25.084914 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01d7abfd-d793-40f0-92b2-00057fa8d815-tls-certs\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:49:25.084949 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:25.084924 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vl58s\" (UniqueName: \"kubernetes.io/projected/01d7abfd-d793-40f0-92b2-00057fa8d815-kube-api-access-vl58s\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:49:25.912130 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:25.912100 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"
Apr 22 17:49:25.912581 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:25.912100 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp" event={"ID":"01d7abfd-d793-40f0-92b2-00057fa8d815","Type":"ContainerDied","Data":"b7bf5128b0ccf0e9bb2fd0b443c026681da3dbfd0e8ec82f1c751a4998beb9fd"}
Apr 22 17:49:25.912581 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:25.912215 2578 scope.go:117] "RemoveContainer" containerID="c408c303dd1b4c9328eea64680bb69e3488643d675d38828033eb68eb3ed5b23"
Apr 22 17:49:25.921696 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:25.921678 2578 scope.go:117] "RemoveContainer" containerID="71229a37af3c43b46c0a39f41a0359ae161a3f0ecc6bce17baf561946dd61d56"
Apr 22 17:49:25.929178 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:25.929162 2578 scope.go:117] "RemoveContainer" containerID="dd23fb2af19be30ef32ccccc4c6b389ea97dfe9616d56c56090f3a059dad009e"
Apr 22 17:49:25.937945 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:25.937927 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"]
Apr 22 17:49:25.941706 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:25.941672 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646d9pnp"]
Apr 22 17:49:25.955371 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:49:25.955352 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01d7abfd-d793-40f0-92b2-00057fa8d815" path="/var/lib/kubelet/pods/01d7abfd-d793-40f0-92b2-00057fa8d815/volumes"
Apr 22 17:52:32.983936 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:32.983902 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"]
Apr 22 17:52:32.984512 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:32.984261 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5" podUID="0c7f73d3-d01e-4526-90db-c519745c3be9" containerName="main" containerID="cri-o://81d8b46381aebe4e8f212f1e845a3a5a5feb54074bd289dd3e54d487edcc6557" gracePeriod=30
Apr 22 17:52:33.232528 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.232503 2578 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5" Apr 22 17:52:33.353931 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.353853 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4s5l9\" (UniqueName: \"kubernetes.io/projected/0c7f73d3-d01e-4526-90db-c519745c3be9-kube-api-access-4s5l9\") pod \"0c7f73d3-d01e-4526-90db-c519745c3be9\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " Apr 22 17:52:33.353931 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.353893 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7f73d3-d01e-4526-90db-c519745c3be9-tls-certs\") pod \"0c7f73d3-d01e-4526-90db-c519745c3be9\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " Apr 22 17:52:33.354142 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.353952 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-kserve-provision-location\") pod \"0c7f73d3-d01e-4526-90db-c519745c3be9\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " Apr 22 17:52:33.354142 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.353973 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-model-cache\") pod \"0c7f73d3-d01e-4526-90db-c519745c3be9\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " Apr 22 17:52:33.354142 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.353988 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-dshm\") pod \"0c7f73d3-d01e-4526-90db-c519745c3be9\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " Apr 22 17:52:33.354142 
ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.354035 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-home\") pod \"0c7f73d3-d01e-4526-90db-c519745c3be9\" (UID: \"0c7f73d3-d01e-4526-90db-c519745c3be9\") " Apr 22 17:52:33.354341 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.354259 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-model-cache" (OuterVolumeSpecName: "model-cache") pod "0c7f73d3-d01e-4526-90db-c519745c3be9" (UID: "0c7f73d3-d01e-4526-90db-c519745c3be9"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:52:33.354394 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.354362 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-home" (OuterVolumeSpecName: "home") pod "0c7f73d3-d01e-4526-90db-c519745c3be9" (UID: "0c7f73d3-d01e-4526-90db-c519745c3be9"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:52:33.356069 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.356046 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c7f73d3-d01e-4526-90db-c519745c3be9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0c7f73d3-d01e-4526-90db-c519745c3be9" (UID: "0c7f73d3-d01e-4526-90db-c519745c3be9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:52:33.356190 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.356066 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-dshm" (OuterVolumeSpecName: "dshm") pod "0c7f73d3-d01e-4526-90db-c519745c3be9" (UID: "0c7f73d3-d01e-4526-90db-c519745c3be9"). 
InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:52:33.356190 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.356170 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c7f73d3-d01e-4526-90db-c519745c3be9-kube-api-access-4s5l9" (OuterVolumeSpecName: "kube-api-access-4s5l9") pod "0c7f73d3-d01e-4526-90db-c519745c3be9" (UID: "0c7f73d3-d01e-4526-90db-c519745c3be9"). InnerVolumeSpecName "kube-api-access-4s5l9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:52:33.407593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.407564 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0c7f73d3-d01e-4526-90db-c519745c3be9" (UID: "0c7f73d3-d01e-4526-90db-c519745c3be9"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:52:33.455466 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.455412 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-kserve-provision-location\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:52:33.455466 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.455462 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-model-cache\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:52:33.455466 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.455472 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-dshm\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:52:33.455664 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.455480 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0c7f73d3-d01e-4526-90db-c519745c3be9-home\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:52:33.455664 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.455489 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4s5l9\" (UniqueName: \"kubernetes.io/projected/0c7f73d3-d01e-4526-90db-c519745c3be9-kube-api-access-4s5l9\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:52:33.455664 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.455497 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7f73d3-d01e-4526-90db-c519745c3be9-tls-certs\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:52:33.545470 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.545412 2578 generic.go:358] 
"Generic (PLEG): container finished" podID="0c7f73d3-d01e-4526-90db-c519745c3be9" containerID="81d8b46381aebe4e8f212f1e845a3a5a5feb54074bd289dd3e54d487edcc6557" exitCode=0 Apr 22 17:52:33.545621 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.545492 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5" event={"ID":"0c7f73d3-d01e-4526-90db-c519745c3be9","Type":"ContainerDied","Data":"81d8b46381aebe4e8f212f1e845a3a5a5feb54074bd289dd3e54d487edcc6557"} Apr 22 17:52:33.545621 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.545523 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5" Apr 22 17:52:33.545621 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.545540 2578 scope.go:117] "RemoveContainer" containerID="81d8b46381aebe4e8f212f1e845a3a5a5feb54074bd289dd3e54d487edcc6557" Apr 22 17:52:33.545766 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.545528 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5" event={"ID":"0c7f73d3-d01e-4526-90db-c519745c3be9","Type":"ContainerDied","Data":"2101723144b779d5b853b7dc25413d85d6d76a10ad8432f1ee5093cc48901b7f"} Apr 22 17:52:33.554012 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.553991 2578 scope.go:117] "RemoveContainer" containerID="8b54fc475db90d4fb9a369b3f43ecbda8ac7ae9b41afab09cb1febb73eb38690" Apr 22 17:52:33.570900 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.570467 2578 scope.go:117] "RemoveContainer" containerID="81d8b46381aebe4e8f212f1e845a3a5a5feb54074bd289dd3e54d487edcc6557" Apr 22 17:52:33.571217 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:52:33.571065 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"81d8b46381aebe4e8f212f1e845a3a5a5feb54074bd289dd3e54d487edcc6557\": container with ID starting with 81d8b46381aebe4e8f212f1e845a3a5a5feb54074bd289dd3e54d487edcc6557 not found: ID does not exist" containerID="81d8b46381aebe4e8f212f1e845a3a5a5feb54074bd289dd3e54d487edcc6557" Apr 22 17:52:33.571217 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.571114 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81d8b46381aebe4e8f212f1e845a3a5a5feb54074bd289dd3e54d487edcc6557"} err="failed to get container status \"81d8b46381aebe4e8f212f1e845a3a5a5feb54074bd289dd3e54d487edcc6557\": rpc error: code = NotFound desc = could not find container \"81d8b46381aebe4e8f212f1e845a3a5a5feb54074bd289dd3e54d487edcc6557\": container with ID starting with 81d8b46381aebe4e8f212f1e845a3a5a5feb54074bd289dd3e54d487edcc6557 not found: ID does not exist" Apr 22 17:52:33.571217 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.571138 2578 scope.go:117] "RemoveContainer" containerID="8b54fc475db90d4fb9a369b3f43ecbda8ac7ae9b41afab09cb1febb73eb38690" Apr 22 17:52:33.571602 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:52:33.571493 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b54fc475db90d4fb9a369b3f43ecbda8ac7ae9b41afab09cb1febb73eb38690\": container with ID starting with 8b54fc475db90d4fb9a369b3f43ecbda8ac7ae9b41afab09cb1febb73eb38690 not found: ID does not exist" containerID="8b54fc475db90d4fb9a369b3f43ecbda8ac7ae9b41afab09cb1febb73eb38690" Apr 22 17:52:33.571602 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.571536 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b54fc475db90d4fb9a369b3f43ecbda8ac7ae9b41afab09cb1febb73eb38690"} err="failed to get container status \"8b54fc475db90d4fb9a369b3f43ecbda8ac7ae9b41afab09cb1febb73eb38690\": rpc error: code = NotFound desc = could not find container 
\"8b54fc475db90d4fb9a369b3f43ecbda8ac7ae9b41afab09cb1febb73eb38690\": container with ID starting with 8b54fc475db90d4fb9a369b3f43ecbda8ac7ae9b41afab09cb1febb73eb38690 not found: ID does not exist" Apr 22 17:52:33.572803 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.572788 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"] Apr 22 17:52:33.574331 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.574312 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-bc876c984-j2wc5"] Apr 22 17:52:33.955934 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:33.955901 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c7f73d3-d01e-4526-90db-c519745c3be9" path="/var/lib/kubelet/pods/0c7f73d3-d01e-4526-90db-c519745c3be9/volumes" Apr 22 17:52:56.407209 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.407179 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j"] Apr 22 17:52:56.407780 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.407710 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01d7abfd-d793-40f0-92b2-00057fa8d815" containerName="tokenizer" Apr 22 17:52:56.407780 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.407729 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d7abfd-d793-40f0-92b2-00057fa8d815" containerName="tokenizer" Apr 22 17:52:56.407780 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.407744 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c7f73d3-d01e-4526-90db-c519745c3be9" containerName="main" Apr 22 17:52:56.407780 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.407753 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c7f73d3-d01e-4526-90db-c519745c3be9" containerName="main" 
Apr 22 17:52:56.407780 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.407767 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01d7abfd-d793-40f0-92b2-00057fa8d815" containerName="main" Apr 22 17:52:56.407780 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.407775 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d7abfd-d793-40f0-92b2-00057fa8d815" containerName="main" Apr 22 17:52:56.408082 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.407787 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01d7abfd-d793-40f0-92b2-00057fa8d815" containerName="storage-initializer" Apr 22 17:52:56.408082 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.407796 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d7abfd-d793-40f0-92b2-00057fa8d815" containerName="storage-initializer" Apr 22 17:52:56.408082 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.407809 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c7f73d3-d01e-4526-90db-c519745c3be9" containerName="storage-initializer" Apr 22 17:52:56.408082 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.407817 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c7f73d3-d01e-4526-90db-c519745c3be9" containerName="storage-initializer" Apr 22 17:52:56.408082 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.407913 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c7f73d3-d01e-4526-90db-c519745c3be9" containerName="main" Apr 22 17:52:56.408082 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.407931 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="01d7abfd-d793-40f0-92b2-00057fa8d815" containerName="tokenizer" Apr 22 17:52:56.408082 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.407942 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="01d7abfd-d793-40f0-92b2-00057fa8d815" containerName="main" Apr 22 17:52:56.413143 
ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.413121 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:56.416064 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.416040 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-7szj2\"" Apr 22 17:52:56.417043 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.417009 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ts22s\"" Apr 22 17:52:56.417043 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.417038 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 17:52:56.417236 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.417077 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 22 17:52:56.417236 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.417125 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 17:52:56.422175 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.422157 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j"] Apr 22 17:52:56.450992 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.450968 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:56.451108 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.450998 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfvh2\" (UniqueName: \"kubernetes.io/projected/34665fbc-4e15-4e8a-80f5-055377121541-kube-api-access-xfvh2\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:56.451108 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.451023 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:56.451180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.451103 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:56.451180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.451142 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34665fbc-4e15-4e8a-80f5-055377121541-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j\" (UID: 
\"34665fbc-4e15-4e8a-80f5-055377121541\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:56.451269 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.451227 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:56.552691 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.552662 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:56.552845 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.552714 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:56.552845 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.552775 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfvh2\" (UniqueName: \"kubernetes.io/projected/34665fbc-4e15-4e8a-80f5-055377121541-kube-api-access-xfvh2\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j\" (UID: 
\"34665fbc-4e15-4e8a-80f5-055377121541\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:56.552845 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.552814 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:56.553023 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.552853 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:56.553023 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.552978 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34665fbc-4e15-4e8a-80f5-055377121541-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:56.553180 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.553151 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:56.553282 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.553182 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:56.553282 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.553225 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:56.553354 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.553305 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:56.555635 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.555612 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34665fbc-4e15-4e8a-80f5-055377121541-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:56.564797 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.564778 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfvh2\" (UniqueName: \"kubernetes.io/projected/34665fbc-4e15-4e8a-80f5-055377121541-kube-api-access-xfvh2\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:56.723862 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.723833 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:56.855353 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:56.855325 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j"] Apr 22 17:52:56.857513 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:52:56.857471 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34665fbc_4e15_4e8a_80f5_055377121541.slice/crio-6cdd0f04cf1b0eb5dfdcaa2e78e1cd1e9438b18b0222699728473a79401f5092 WatchSource:0}: Error finding container 6cdd0f04cf1b0eb5dfdcaa2e78e1cd1e9438b18b0222699728473a79401f5092: Status 404 returned error can't find the container with id 6cdd0f04cf1b0eb5dfdcaa2e78e1cd1e9438b18b0222699728473a79401f5092 Apr 22 17:52:57.630265 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:57.630222 2578 generic.go:358] "Generic (PLEG): container finished" podID="34665fbc-4e15-4e8a-80f5-055377121541" containerID="1a87fe022fb7d8f38b1faf32b75330e5ad06d806ee4525e72d019e210af7a09a" exitCode=0 Apr 22 17:52:57.630675 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:57.630277 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" event={"ID":"34665fbc-4e15-4e8a-80f5-055377121541","Type":"ContainerDied","Data":"1a87fe022fb7d8f38b1faf32b75330e5ad06d806ee4525e72d019e210af7a09a"} Apr 22 17:52:57.630675 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:57.630303 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" event={"ID":"34665fbc-4e15-4e8a-80f5-055377121541","Type":"ContainerStarted","Data":"6cdd0f04cf1b0eb5dfdcaa2e78e1cd1e9438b18b0222699728473a79401f5092"} Apr 22 17:52:58.635593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:58.635561 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" event={"ID":"34665fbc-4e15-4e8a-80f5-055377121541","Type":"ContainerStarted","Data":"eec56d2b2c9436e5cced7b59f48575fa39f19fdaaa9b1f67774c67212ddf6857"} Apr 22 17:52:58.635593 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:58.635597 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" event={"ID":"34665fbc-4e15-4e8a-80f5-055377121541","Type":"ContainerStarted","Data":"8991eb5ed3fa412dbc1b3dc09bd583493238760b3519c88fb79183b6c46c56f0"} Apr 22 17:52:58.635982 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:58.635716 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:52:58.657678 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:52:58.657633 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" podStartSLOduration=2.657618122 podStartE2EDuration="2.657618122s" podCreationTimestamp="2026-04-22 17:52:56 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:52:58.655828642 +0000 UTC m=+1127.271920924" watchObservedRunningTime="2026-04-22 17:52:58.657618122 +0000 UTC m=+1127.273710393" Apr 22 17:53:06.724034 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:53:06.723994 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:53:06.724034 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:53:06.724032 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:53:06.726903 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:53:06.726875 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:53:07.666099 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:53:07.666062 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:53:28.670091 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:53:28.670016 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:54:12.606335 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:54:12.606302 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovn-acl-logging/0.log" Apr 22 17:54:12.606952 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:54:12.606929 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovn-acl-logging/0.log" 
Apr 22 17:55:05.511104 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:05.511071 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j"] Apr 22 17:55:05.511573 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:05.511377 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" podUID="34665fbc-4e15-4e8a-80f5-055377121541" containerName="main" containerID="cri-o://8991eb5ed3fa412dbc1b3dc09bd583493238760b3519c88fb79183b6c46c56f0" gracePeriod=30 Apr 22 17:55:05.511573 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:05.511410 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" podUID="34665fbc-4e15-4e8a-80f5-055377121541" containerName="tokenizer" containerID="cri-o://eec56d2b2c9436e5cced7b59f48575fa39f19fdaaa9b1f67774c67212ddf6857" gracePeriod=30 Apr 22 17:55:06.065938 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.065900 2578 generic.go:358] "Generic (PLEG): container finished" podID="34665fbc-4e15-4e8a-80f5-055377121541" containerID="8991eb5ed3fa412dbc1b3dc09bd583493238760b3519c88fb79183b6c46c56f0" exitCode=0 Apr 22 17:55:06.066109 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.065969 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" event={"ID":"34665fbc-4e15-4e8a-80f5-055377121541","Type":"ContainerDied","Data":"8991eb5ed3fa412dbc1b3dc09bd583493238760b3519c88fb79183b6c46c56f0"} Apr 22 17:55:06.664446 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.664400 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:55:06.754990 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.754930 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34665fbc-4e15-4e8a-80f5-055377121541-tls-certs\") pod \"34665fbc-4e15-4e8a-80f5-055377121541\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " Apr 22 17:55:06.754990 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.754968 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-tokenizer-cache\") pod \"34665fbc-4e15-4e8a-80f5-055377121541\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " Apr 22 17:55:06.754990 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.754987 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfvh2\" (UniqueName: \"kubernetes.io/projected/34665fbc-4e15-4e8a-80f5-055377121541-kube-api-access-xfvh2\") pod \"34665fbc-4e15-4e8a-80f5-055377121541\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " Apr 22 17:55:06.755168 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.755109 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-tokenizer-tmp\") pod \"34665fbc-4e15-4e8a-80f5-055377121541\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " Apr 22 17:55:06.755168 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.755151 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-tokenizer-uds\") pod \"34665fbc-4e15-4e8a-80f5-055377121541\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " Apr 22 17:55:06.755253 
ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.755224 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "34665fbc-4e15-4e8a-80f5-055377121541" (UID: "34665fbc-4e15-4e8a-80f5-055377121541"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:55:06.755253 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.755227 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-kserve-provision-location\") pod \"34665fbc-4e15-4e8a-80f5-055377121541\" (UID: \"34665fbc-4e15-4e8a-80f5-055377121541\") " Apr 22 17:55:06.755473 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.755451 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "34665fbc-4e15-4e8a-80f5-055377121541" (UID: "34665fbc-4e15-4e8a-80f5-055377121541"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:55:06.755569 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.755477 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "34665fbc-4e15-4e8a-80f5-055377121541" (UID: "34665fbc-4e15-4e8a-80f5-055377121541"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:55:06.755650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.755630 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-tokenizer-cache\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:55:06.755738 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.755654 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-tokenizer-tmp\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:55:06.755738 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.755663 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-tokenizer-uds\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:55:06.755843 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.755825 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "34665fbc-4e15-4e8a-80f5-055377121541" (UID: "34665fbc-4e15-4e8a-80f5-055377121541"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:55:06.757492 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.757469 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34665fbc-4e15-4e8a-80f5-055377121541-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "34665fbc-4e15-4e8a-80f5-055377121541" (UID: "34665fbc-4e15-4e8a-80f5-055377121541"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:55:06.757563 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.757486 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34665fbc-4e15-4e8a-80f5-055377121541-kube-api-access-xfvh2" (OuterVolumeSpecName: "kube-api-access-xfvh2") pod "34665fbc-4e15-4e8a-80f5-055377121541" (UID: "34665fbc-4e15-4e8a-80f5-055377121541"). InnerVolumeSpecName "kube-api-access-xfvh2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:55:06.856310 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.856288 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34665fbc-4e15-4e8a-80f5-055377121541-kserve-provision-location\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:55:06.856310 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.856309 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34665fbc-4e15-4e8a-80f5-055377121541-tls-certs\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:55:06.856451 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:06.856320 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xfvh2\" (UniqueName: \"kubernetes.io/projected/34665fbc-4e15-4e8a-80f5-055377121541-kube-api-access-xfvh2\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 17:55:07.071220 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:07.071146 2578 generic.go:358] "Generic (PLEG): container finished" podID="34665fbc-4e15-4e8a-80f5-055377121541" containerID="eec56d2b2c9436e5cced7b59f48575fa39f19fdaaa9b1f67774c67212ddf6857" exitCode=0 Apr 22 17:55:07.071220 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:07.071212 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" Apr 22 17:55:07.071391 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:07.071234 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" event={"ID":"34665fbc-4e15-4e8a-80f5-055377121541","Type":"ContainerDied","Data":"eec56d2b2c9436e5cced7b59f48575fa39f19fdaaa9b1f67774c67212ddf6857"} Apr 22 17:55:07.071391 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:07.071272 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j" event={"ID":"34665fbc-4e15-4e8a-80f5-055377121541","Type":"ContainerDied","Data":"6cdd0f04cf1b0eb5dfdcaa2e78e1cd1e9438b18b0222699728473a79401f5092"} Apr 22 17:55:07.071391 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:07.071288 2578 scope.go:117] "RemoveContainer" containerID="eec56d2b2c9436e5cced7b59f48575fa39f19fdaaa9b1f67774c67212ddf6857" Apr 22 17:55:07.080497 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:07.080480 2578 scope.go:117] "RemoveContainer" containerID="8991eb5ed3fa412dbc1b3dc09bd583493238760b3519c88fb79183b6c46c56f0" Apr 22 17:55:07.087602 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:07.087587 2578 scope.go:117] "RemoveContainer" containerID="1a87fe022fb7d8f38b1faf32b75330e5ad06d806ee4525e72d019e210af7a09a" Apr 22 17:55:07.095145 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:07.095125 2578 scope.go:117] "RemoveContainer" containerID="eec56d2b2c9436e5cced7b59f48575fa39f19fdaaa9b1f67774c67212ddf6857" Apr 22 17:55:07.095404 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:55:07.095384 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eec56d2b2c9436e5cced7b59f48575fa39f19fdaaa9b1f67774c67212ddf6857\": container with ID starting with 
eec56d2b2c9436e5cced7b59f48575fa39f19fdaaa9b1f67774c67212ddf6857 not found: ID does not exist" containerID="eec56d2b2c9436e5cced7b59f48575fa39f19fdaaa9b1f67774c67212ddf6857" Apr 22 17:55:07.095506 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:07.095415 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec56d2b2c9436e5cced7b59f48575fa39f19fdaaa9b1f67774c67212ddf6857"} err="failed to get container status \"eec56d2b2c9436e5cced7b59f48575fa39f19fdaaa9b1f67774c67212ddf6857\": rpc error: code = NotFound desc = could not find container \"eec56d2b2c9436e5cced7b59f48575fa39f19fdaaa9b1f67774c67212ddf6857\": container with ID starting with eec56d2b2c9436e5cced7b59f48575fa39f19fdaaa9b1f67774c67212ddf6857 not found: ID does not exist" Apr 22 17:55:07.095506 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:07.095455 2578 scope.go:117] "RemoveContainer" containerID="8991eb5ed3fa412dbc1b3dc09bd583493238760b3519c88fb79183b6c46c56f0" Apr 22 17:55:07.095691 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:07.095670 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j"] Apr 22 17:55:07.095740 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:55:07.095694 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8991eb5ed3fa412dbc1b3dc09bd583493238760b3519c88fb79183b6c46c56f0\": container with ID starting with 8991eb5ed3fa412dbc1b3dc09bd583493238760b3519c88fb79183b6c46c56f0 not found: ID does not exist" containerID="8991eb5ed3fa412dbc1b3dc09bd583493238760b3519c88fb79183b6c46c56f0" Apr 22 17:55:07.095777 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:07.095734 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8991eb5ed3fa412dbc1b3dc09bd583493238760b3519c88fb79183b6c46c56f0"} err="failed to get container status 
\"8991eb5ed3fa412dbc1b3dc09bd583493238760b3519c88fb79183b6c46c56f0\": rpc error: code = NotFound desc = could not find container \"8991eb5ed3fa412dbc1b3dc09bd583493238760b3519c88fb79183b6c46c56f0\": container with ID starting with 8991eb5ed3fa412dbc1b3dc09bd583493238760b3519c88fb79183b6c46c56f0 not found: ID does not exist" Apr 22 17:55:07.095777 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:07.095756 2578 scope.go:117] "RemoveContainer" containerID="1a87fe022fb7d8f38b1faf32b75330e5ad06d806ee4525e72d019e210af7a09a" Apr 22 17:55:07.096049 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:55:07.096030 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a87fe022fb7d8f38b1faf32b75330e5ad06d806ee4525e72d019e210af7a09a\": container with ID starting with 1a87fe022fb7d8f38b1faf32b75330e5ad06d806ee4525e72d019e210af7a09a not found: ID does not exist" containerID="1a87fe022fb7d8f38b1faf32b75330e5ad06d806ee4525e72d019e210af7a09a" Apr 22 17:55:07.096124 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:07.096052 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a87fe022fb7d8f38b1faf32b75330e5ad06d806ee4525e72d019e210af7a09a"} err="failed to get container status \"1a87fe022fb7d8f38b1faf32b75330e5ad06d806ee4525e72d019e210af7a09a\": rpc error: code = NotFound desc = could not find container \"1a87fe022fb7d8f38b1faf32b75330e5ad06d806ee4525e72d019e210af7a09a\": container with ID starting with 1a87fe022fb7d8f38b1faf32b75330e5ad06d806ee4525e72d019e210af7a09a not found: ID does not exist" Apr 22 17:55:07.098670 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:07.098651 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schenjp8j"] Apr 22 17:55:07.955826 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:07.955794 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="34665fbc-4e15-4e8a-80f5-055377121541" path="/var/lib/kubelet/pods/34665fbc-4e15-4e8a-80f5-055377121541/volumes" Apr 22 17:55:18.760717 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.760659 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8"] Apr 22 17:55:18.761272 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.761202 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34665fbc-4e15-4e8a-80f5-055377121541" containerName="tokenizer" Apr 22 17:55:18.761272 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.761217 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="34665fbc-4e15-4e8a-80f5-055377121541" containerName="tokenizer" Apr 22 17:55:18.761272 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.761248 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34665fbc-4e15-4e8a-80f5-055377121541" containerName="storage-initializer" Apr 22 17:55:18.761272 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.761258 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="34665fbc-4e15-4e8a-80f5-055377121541" containerName="storage-initializer" Apr 22 17:55:18.761272 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.761271 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34665fbc-4e15-4e8a-80f5-055377121541" containerName="main" Apr 22 17:55:18.761589 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.761280 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="34665fbc-4e15-4e8a-80f5-055377121541" containerName="main" Apr 22 17:55:18.761589 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.761356 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="34665fbc-4e15-4e8a-80f5-055377121541" containerName="main" Apr 22 17:55:18.761589 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.761369 2578 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="34665fbc-4e15-4e8a-80f5-055377121541" containerName="tokenizer" Apr 22 17:55:18.766373 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.766352 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" Apr 22 17:55:18.771850 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.771822 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ts22s\"" Apr 22 17:55:18.772028 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.771941 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 17:55:18.772093 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.772051 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 17:55:18.772147 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.772134 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 22 17:55:18.772888 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.772865 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-v6zkh\"" Apr 22 17:55:18.781160 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.781135 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8"] Apr 22 17:55:18.856981 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.856941 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsvsp\" (UniqueName: \"kubernetes.io/projected/385783b7-258e-4487-9559-6f07a682e1f8-kube-api-access-bsvsp\") pod 
\"custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" Apr 22 17:55:18.857288 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.857262 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" Apr 22 17:55:18.857450 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.857413 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/385783b7-258e-4487-9559-6f07a682e1f8-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" Apr 22 17:55:18.857623 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.857606 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" Apr 22 17:55:18.857778 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.857762 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" Apr 22 17:55:18.857898 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.857883 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" Apr 22 17:55:18.958735 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.958695 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/385783b7-258e-4487-9559-6f07a682e1f8-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" Apr 22 17:55:18.959058 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.959036 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" Apr 22 17:55:18.959238 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.959223 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" Apr 22 17:55:18.959766 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.959564 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" Apr 22 17:55:18.959766 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.959620 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" Apr 22 17:55:18.960244 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.960204 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" Apr 22 17:55:18.961151 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.959696 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" Apr 22 17:55:18.961151 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.960394 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsvsp\" (UniqueName: \"kubernetes.io/projected/385783b7-258e-4487-9559-6f07a682e1f8-kube-api-access-bsvsp\") pod \"custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" Apr 22 17:55:18.961151 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.960476 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" Apr 22 17:55:18.961151 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.960816 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" Apr 22 17:55:18.963277 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.963230 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/385783b7-258e-4487-9559-6f07a682e1f8-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" Apr 22 17:55:18.968828 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:18.968798 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsvsp\" (UniqueName: \"kubernetes.io/projected/385783b7-258e-4487-9559-6f07a682e1f8-kube-api-access-bsvsp\") pod \"custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" Apr 22 17:55:19.077204 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:19.077128 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" Apr 22 17:55:19.239493 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:19.237833 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8"] Apr 22 17:55:19.250021 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:55:19.249991 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod385783b7_258e_4487_9559_6f07a682e1f8.slice/crio-9775b84a1faf08ff3eeb50906f321829f05b516b96b2aa3b8647919cce3b73ed WatchSource:0}: Error finding container 9775b84a1faf08ff3eeb50906f321829f05b516b96b2aa3b8647919cce3b73ed: Status 404 returned error can't find the container with id 9775b84a1faf08ff3eeb50906f321829f05b516b96b2aa3b8647919cce3b73ed Apr 22 17:55:19.252001 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:19.251983 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 
17:55:20.129626 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:20.129586 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" event={"ID":"385783b7-258e-4487-9559-6f07a682e1f8","Type":"ContainerStarted","Data":"b028c406641fdf7bfdf615aba8b12dfd3b05640d70e688810014a218edecd717"}
Apr 22 17:55:20.129626 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:20.129627 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" event={"ID":"385783b7-258e-4487-9559-6f07a682e1f8","Type":"ContainerStarted","Data":"9775b84a1faf08ff3eeb50906f321829f05b516b96b2aa3b8647919cce3b73ed"}
Apr 22 17:55:21.134003 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:21.133968 2578 generic.go:358] "Generic (PLEG): container finished" podID="385783b7-258e-4487-9559-6f07a682e1f8" containerID="b028c406641fdf7bfdf615aba8b12dfd3b05640d70e688810014a218edecd717" exitCode=0
Apr 22 17:55:21.134358 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:21.134048 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" event={"ID":"385783b7-258e-4487-9559-6f07a682e1f8","Type":"ContainerDied","Data":"b028c406641fdf7bfdf615aba8b12dfd3b05640d70e688810014a218edecd717"}
Apr 22 17:55:22.140029 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:22.139994 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" event={"ID":"385783b7-258e-4487-9559-6f07a682e1f8","Type":"ContainerStarted","Data":"400eeea76e0375c2baa2fb3bf1b2e70c2d52e6f0383adc0475271c8292357f4b"}
Apr 22 17:55:22.140029 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:22.140035 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" event={"ID":"385783b7-258e-4487-9559-6f07a682e1f8","Type":"ContainerStarted","Data":"c4b7fd630ba3c32c6d9a3c95705c926e87b539d9a70c4ace75a0b31b2c4afd35"}
Apr 22 17:55:22.140584 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:22.140164 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8"
Apr 22 17:55:29.077457 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:29.077399 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8"
Apr 22 17:55:29.077969 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:29.077468 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8"
Apr 22 17:55:29.080175 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:29.080144 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8"
Apr 22 17:55:29.102434 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:29.102379 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" podStartSLOduration=11.102365754 podStartE2EDuration="11.102365754s" podCreationTimestamp="2026-04-22 17:55:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:55:22.164468996 +0000 UTC m=+1270.780561267" watchObservedRunningTime="2026-04-22 17:55:29.102365754 +0000 UTC m=+1277.718458025"
Apr 22 17:55:29.162468 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:55:29.162413 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8"
Apr 22 17:56:00.166678 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:56:00.166648 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8"
Apr 22 17:56:18.742007 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:56:18.741970 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb"]
Apr 22 17:56:18.742557 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:56:18.742230 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb" podUID="c7a40edd-751b-4af7-826c-4f2e8845c135" containerName="manager" containerID="cri-o://a2a246a07bedd8a0bd6b427989fdf24d09b0751f1bbc3be015a9f5dca404dbd4" gracePeriod=30
Apr 22 17:56:18.995305 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:56:18.995247 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb"
Apr 22 17:56:19.161665 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:56:19.161631 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkt4l\" (UniqueName: \"kubernetes.io/projected/c7a40edd-751b-4af7-826c-4f2e8845c135-kube-api-access-kkt4l\") pod \"c7a40edd-751b-4af7-826c-4f2e8845c135\" (UID: \"c7a40edd-751b-4af7-826c-4f2e8845c135\") "
Apr 22 17:56:19.161846 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:56:19.161771 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7a40edd-751b-4af7-826c-4f2e8845c135-cert\") pod \"c7a40edd-751b-4af7-826c-4f2e8845c135\" (UID: \"c7a40edd-751b-4af7-826c-4f2e8845c135\") "
Apr 22 17:56:19.163649 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:56:19.163624 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7a40edd-751b-4af7-826c-4f2e8845c135-kube-api-access-kkt4l" (OuterVolumeSpecName: "kube-api-access-kkt4l") pod "c7a40edd-751b-4af7-826c-4f2e8845c135" (UID: "c7a40edd-751b-4af7-826c-4f2e8845c135"). InnerVolumeSpecName "kube-api-access-kkt4l". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:56:19.163773 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:56:19.163735 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7a40edd-751b-4af7-826c-4f2e8845c135-cert" (OuterVolumeSpecName: "cert") pod "c7a40edd-751b-4af7-826c-4f2e8845c135" (UID: "c7a40edd-751b-4af7-826c-4f2e8845c135"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:56:19.262833 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:56:19.262776 2578 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7a40edd-751b-4af7-826c-4f2e8845c135-cert\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:56:19.262833 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:56:19.262799 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kkt4l\" (UniqueName: \"kubernetes.io/projected/c7a40edd-751b-4af7-826c-4f2e8845c135-kube-api-access-kkt4l\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:56:19.328371 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:56:19.328343 2578 generic.go:358] "Generic (PLEG): container finished" podID="c7a40edd-751b-4af7-826c-4f2e8845c135" containerID="a2a246a07bedd8a0bd6b427989fdf24d09b0751f1bbc3be015a9f5dca404dbd4" exitCode=0
Apr 22 17:56:19.328497 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:56:19.328410 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb"
Apr 22 17:56:19.328497 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:56:19.328439 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb" event={"ID":"c7a40edd-751b-4af7-826c-4f2e8845c135","Type":"ContainerDied","Data":"a2a246a07bedd8a0bd6b427989fdf24d09b0751f1bbc3be015a9f5dca404dbd4"}
Apr 22 17:56:19.328497 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:56:19.328474 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb" event={"ID":"c7a40edd-751b-4af7-826c-4f2e8845c135","Type":"ContainerDied","Data":"0c605892995bfeee8f96a5a0c3a4e22da400c9b4e4d937ed6f72257489460efc"}
Apr 22 17:56:19.328497 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:56:19.328490 2578 scope.go:117] "RemoveContainer" containerID="a2a246a07bedd8a0bd6b427989fdf24d09b0751f1bbc3be015a9f5dca404dbd4"
Apr 22 17:56:19.337661 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:56:19.337646 2578 scope.go:117] "RemoveContainer" containerID="a2a246a07bedd8a0bd6b427989fdf24d09b0751f1bbc3be015a9f5dca404dbd4"
Apr 22 17:56:19.337915 ip-10-0-143-54 kubenswrapper[2578]: E0422 17:56:19.337895 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2a246a07bedd8a0bd6b427989fdf24d09b0751f1bbc3be015a9f5dca404dbd4\": container with ID starting with a2a246a07bedd8a0bd6b427989fdf24d09b0751f1bbc3be015a9f5dca404dbd4 not found: ID does not exist" containerID="a2a246a07bedd8a0bd6b427989fdf24d09b0751f1bbc3be015a9f5dca404dbd4"
Apr 22 17:56:19.337973 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:56:19.337926 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2a246a07bedd8a0bd6b427989fdf24d09b0751f1bbc3be015a9f5dca404dbd4"} err="failed to get container status \"a2a246a07bedd8a0bd6b427989fdf24d09b0751f1bbc3be015a9f5dca404dbd4\": rpc error: code = NotFound desc = could not find container \"a2a246a07bedd8a0bd6b427989fdf24d09b0751f1bbc3be015a9f5dca404dbd4\": container with ID starting with a2a246a07bedd8a0bd6b427989fdf24d09b0751f1bbc3be015a9f5dca404dbd4 not found: ID does not exist"
Apr 22 17:56:19.351430 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:56:19.351396 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb"]
Apr 22 17:56:19.357493 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:56:19.357472 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-6954c7fbdf-gcrkb"]
Apr 22 17:56:19.956225 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:56:19.956195 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7a40edd-751b-4af7-826c-4f2e8845c135" path="/var/lib/kubelet/pods/c7a40edd-751b-4af7-826c-4f2e8845c135/volumes"
Apr 22 17:57:19.384850 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:19.384809 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8"]
Apr 22 17:57:19.385395 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:19.385144 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" podUID="385783b7-258e-4487-9559-6f07a682e1f8" containerName="main" containerID="cri-o://c4b7fd630ba3c32c6d9a3c95705c926e87b539d9a70c4ace75a0b31b2c4afd35" gracePeriod=30
Apr 22 17:57:19.385395 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:19.385194 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" podUID="385783b7-258e-4487-9559-6f07a682e1f8" containerName="tokenizer" containerID="cri-o://400eeea76e0375c2baa2fb3bf1b2e70c2d52e6f0383adc0475271c8292357f4b" gracePeriod=30
Apr 22 17:57:19.528817 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:19.528782 2578 generic.go:358] "Generic (PLEG): container finished" podID="385783b7-258e-4487-9559-6f07a682e1f8" containerID="c4b7fd630ba3c32c6d9a3c95705c926e87b539d9a70c4ace75a0b31b2c4afd35" exitCode=0
Apr 22 17:57:19.528974 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:19.528852 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" event={"ID":"385783b7-258e-4487-9559-6f07a682e1f8","Type":"ContainerDied","Data":"c4b7fd630ba3c32c6d9a3c95705c926e87b539d9a70c4ace75a0b31b2c4afd35"}
Apr 22 17:57:20.165514 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:57:20.165481 2578 logging.go:55] [core] [Channel #164 SubChannel #165]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.36:9003", ServerName: "10.133.0.36:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.36:9003: connect: connection refused"
Apr 22 17:57:20.533796 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.533762 2578 generic.go:358] "Generic (PLEG): container finished" podID="385783b7-258e-4487-9559-6f07a682e1f8" containerID="400eeea76e0375c2baa2fb3bf1b2e70c2d52e6f0383adc0475271c8292357f4b" exitCode=0
Apr 22 17:57:20.534246 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.533845 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" event={"ID":"385783b7-258e-4487-9559-6f07a682e1f8","Type":"ContainerDied","Data":"400eeea76e0375c2baa2fb3bf1b2e70c2d52e6f0383adc0475271c8292357f4b"}
Apr 22 17:57:20.641444 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.641410 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8"
Apr 22 17:57:20.689568 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.689537 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-kserve-provision-location\") pod \"385783b7-258e-4487-9559-6f07a682e1f8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") "
Apr 22 17:57:20.689742 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.689589 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-tokenizer-uds\") pod \"385783b7-258e-4487-9559-6f07a682e1f8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") "
Apr 22 17:57:20.689742 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.689640 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-tokenizer-cache\") pod \"385783b7-258e-4487-9559-6f07a682e1f8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") "
Apr 22 17:57:20.689742 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.689667 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-tokenizer-tmp\") pod \"385783b7-258e-4487-9559-6f07a682e1f8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") "
Apr 22 17:57:20.689742 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.689717 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsvsp\" (UniqueName: \"kubernetes.io/projected/385783b7-258e-4487-9559-6f07a682e1f8-kube-api-access-bsvsp\") pod \"385783b7-258e-4487-9559-6f07a682e1f8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") "
Apr 22 17:57:20.689942 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.689742 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/385783b7-258e-4487-9559-6f07a682e1f8-tls-certs\") pod \"385783b7-258e-4487-9559-6f07a682e1f8\" (UID: \"385783b7-258e-4487-9559-6f07a682e1f8\") "
Apr 22 17:57:20.689942 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.689907 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "385783b7-258e-4487-9559-6f07a682e1f8" (UID: "385783b7-258e-4487-9559-6f07a682e1f8"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:57:20.690050 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.689958 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "385783b7-258e-4487-9559-6f07a682e1f8" (UID: "385783b7-258e-4487-9559-6f07a682e1f8"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:57:20.690050 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.690041 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-tokenizer-uds\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:57:20.690154 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.690042 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "385783b7-258e-4487-9559-6f07a682e1f8" (UID: "385783b7-258e-4487-9559-6f07a682e1f8"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:57:20.690154 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.690060 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-tokenizer-cache\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:57:20.690409 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.690383 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "385783b7-258e-4487-9559-6f07a682e1f8" (UID: "385783b7-258e-4487-9559-6f07a682e1f8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:57:20.691835 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.691814 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/385783b7-258e-4487-9559-6f07a682e1f8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "385783b7-258e-4487-9559-6f07a682e1f8" (UID: "385783b7-258e-4487-9559-6f07a682e1f8"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:57:20.692064 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.692038 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/385783b7-258e-4487-9559-6f07a682e1f8-kube-api-access-bsvsp" (OuterVolumeSpecName: "kube-api-access-bsvsp") pod "385783b7-258e-4487-9559-6f07a682e1f8" (UID: "385783b7-258e-4487-9559-6f07a682e1f8"). InnerVolumeSpecName "kube-api-access-bsvsp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:57:20.791527 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.791402 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bsvsp\" (UniqueName: \"kubernetes.io/projected/385783b7-258e-4487-9559-6f07a682e1f8-kube-api-access-bsvsp\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:57:20.791527 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.791482 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/385783b7-258e-4487-9559-6f07a682e1f8-tls-certs\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:57:20.791527 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.791499 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-kserve-provision-location\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:57:20.791527 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:20.791514 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/385783b7-258e-4487-9559-6f07a682e1f8-tokenizer-tmp\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 17:57:21.166028 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:21.165941 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" podUID="385783b7-258e-4487-9559-6f07a682e1f8" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.36:9003\" within 1s: context deadline exceeded"
Apr 22 17:57:21.541401 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:21.541361 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8" event={"ID":"385783b7-258e-4487-9559-6f07a682e1f8","Type":"ContainerDied","Data":"9775b84a1faf08ff3eeb50906f321829f05b516b96b2aa3b8647919cce3b73ed"}
Apr 22 17:57:21.541838 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:21.541432 2578 scope.go:117] "RemoveContainer" containerID="400eeea76e0375c2baa2fb3bf1b2e70c2d52e6f0383adc0475271c8292357f4b"
Apr 22 17:57:21.541838 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:21.541435 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8"
Apr 22 17:57:21.549846 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:21.549826 2578 scope.go:117] "RemoveContainer" containerID="c4b7fd630ba3c32c6d9a3c95705c926e87b539d9a70c4ace75a0b31b2c4afd35"
Apr 22 17:57:21.557106 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:21.557089 2578 scope.go:117] "RemoveContainer" containerID="b028c406641fdf7bfdf615aba8b12dfd3b05640d70e688810014a218edecd717"
Apr 22 17:57:21.563879 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:21.563857 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8"]
Apr 22 17:57:21.566998 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:21.566978 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-87c99f67q64n8"]
Apr 22 17:57:21.956194 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:21.956120 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="385783b7-258e-4487-9559-6f07a682e1f8" path="/var/lib/kubelet/pods/385783b7-258e-4487-9559-6f07a682e1f8/volumes"
Apr 22 17:57:31.093850 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.093811 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"]
Apr 22 17:57:31.094334 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.094190 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="385783b7-258e-4487-9559-6f07a682e1f8" containerName="main"
Apr 22 17:57:31.094334 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.094202 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="385783b7-258e-4487-9559-6f07a682e1f8" containerName="main"
Apr 22 17:57:31.094334 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.094211 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7a40edd-751b-4af7-826c-4f2e8845c135" containerName="manager"
Apr 22 17:57:31.094334 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.094217 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7a40edd-751b-4af7-826c-4f2e8845c135" containerName="manager"
Apr 22 17:57:31.094334 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.094224 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="385783b7-258e-4487-9559-6f07a682e1f8" containerName="storage-initializer"
Apr 22 17:57:31.094334 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.094230 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="385783b7-258e-4487-9559-6f07a682e1f8" containerName="storage-initializer"
Apr 22 17:57:31.094334 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.094239 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="385783b7-258e-4487-9559-6f07a682e1f8" containerName="tokenizer"
Apr 22 17:57:31.094334 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.094245 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="385783b7-258e-4487-9559-6f07a682e1f8" containerName="tokenizer"
Apr 22 17:57:31.094334 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.094300 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7a40edd-751b-4af7-826c-4f2e8845c135" containerName="manager"
Apr 22 17:57:31.094334 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.094310 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="385783b7-258e-4487-9559-6f07a682e1f8" containerName="main"
Apr 22 17:57:31.094334 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.094320 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="385783b7-258e-4487-9559-6f07a682e1f8" containerName="tokenizer"
Apr 22 17:57:31.099192 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.099171 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:31.103400 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.103374 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 22 17:57:31.103553 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.103373 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 17:57:31.103650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.103627 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\""
Apr 22 17:57:31.104087 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.104070 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-bm27w\""
Apr 22 17:57:31.104130 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.104095 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ts22s\""
Apr 22 17:57:31.112911 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.112882 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"]
Apr 22 17:57:31.161650 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.161614 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q56m\" (UniqueName: \"kubernetes.io/projected/141d725f-6ca6-46e4-8588-97bfef153936-kube-api-access-2q56m\") pod \"router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:31.161846 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.161669 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:31.161846 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.161716 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:31.161846 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.161754 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/141d725f-6ca6-46e4-8588-97bfef153936-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:31.161846 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.161781 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:31.161846 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.161812 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:31.262170 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.262142 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2q56m\" (UniqueName: \"kubernetes.io/projected/141d725f-6ca6-46e4-8588-97bfef153936-kube-api-access-2q56m\") pod \"router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:31.262327 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.262180 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:31.262327 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.262207 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:31.262483 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.262342 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/141d725f-6ca6-46e4-8588-97bfef153936-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:31.262483 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.262381 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:31.262581 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.262526 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:31.262668 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.262649 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:31.262731 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.262711 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:31.262873 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.262854 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:31.262931 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.262886 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:31.264850 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.264823 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/141d725f-6ca6-46e4-8588-97bfef153936-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:31.273944 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.273919 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q56m\" (UniqueName: \"kubernetes.io/projected/141d725f-6ca6-46e4-8588-97bfef153936-kube-api-access-2q56m\") pod \"router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:31.409109 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.409052 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:31.549991 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.549958 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"]
Apr 22 17:57:31.555470 ip-10-0-143-54 kubenswrapper[2578]: W0422 17:57:31.555412 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod141d725f_6ca6_46e4_8588_97bfef153936.slice/crio-6925e4652ccb6a5d5a0e4fc3b286192e0507c3beff0a6934d24fd8328f1452e5 WatchSource:0}: Error finding container 6925e4652ccb6a5d5a0e4fc3b286192e0507c3beff0a6934d24fd8328f1452e5: Status 404 returned error can't find the container with id 6925e4652ccb6a5d5a0e4fc3b286192e0507c3beff0a6934d24fd8328f1452e5
Apr 22 17:57:31.575304 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:31.575280 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b" event={"ID":"141d725f-6ca6-46e4-8588-97bfef153936","Type":"ContainerStarted","Data":"6925e4652ccb6a5d5a0e4fc3b286192e0507c3beff0a6934d24fd8328f1452e5"}
Apr 22 17:57:32.579898 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:32.579815 2578 generic.go:358] "Generic (PLEG): container finished" podID="141d725f-6ca6-46e4-8588-97bfef153936" containerID="efc73f24e0ae8635cb531051e0fd795cf8334a757ceecb3686e3cf1f7698a977" exitCode=0
Apr 22 17:57:32.580237 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:32.579902 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b" event={"ID":"141d725f-6ca6-46e4-8588-97bfef153936","Type":"ContainerDied","Data":"efc73f24e0ae8635cb531051e0fd795cf8334a757ceecb3686e3cf1f7698a977"}
Apr 22 17:57:33.585403 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:33.585370 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b" event={"ID":"141d725f-6ca6-46e4-8588-97bfef153936","Type":"ContainerStarted","Data":"ef022290bf784523fd393d0fe2e979a5835512c2f3f3f97ba2b3d22e61c5e135"}
Apr 22 17:57:33.585403 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:33.585406 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b" event={"ID":"141d725f-6ca6-46e4-8588-97bfef153936","Type":"ContainerStarted","Data":"c243d840176aac56085f809ee79179fb96c72e1bad6ba69acb95af8f33a8215a"}
Apr 22 17:57:33.585871 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:33.585502 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:33.608086 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:33.608040 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b" podStartSLOduration=2.608026818 podStartE2EDuration="2.608026818s" podCreationTimestamp="2026-04-22 17:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:57:33.60687944 +0000 UTC m=+1402.222971726" watchObservedRunningTime="2026-04-22 17:57:33.608026818 +0000 UTC m=+1402.224119124"
Apr 22 17:57:41.409666 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:41.409632 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:41.410182 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:41.409684 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:41.412323 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:41.412301 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:57:41.617891 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:57:41.617862 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:58:02.622045 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:58:02.621969 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 17:59:12.629305 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:59:12.629275 2578 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovn-acl-logging/0.log"
Apr 22 17:59:12.630351 ip-10-0-143-54 kubenswrapper[2578]: I0422 17:59:12.630325 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovn-acl-logging/0.log"
Apr 22 18:00:30.387707 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:30.387668 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"]
Apr 22 18:00:30.390324 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:30.388066 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b" podUID="141d725f-6ca6-46e4-8588-97bfef153936" containerName="main" containerID="cri-o://c243d840176aac56085f809ee79179fb96c72e1bad6ba69acb95af8f33a8215a" gracePeriod=30
Apr 22 18:00:30.390324 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:30.388123 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b" podUID="141d725f-6ca6-46e4-8588-97bfef153936" containerName="tokenizer" containerID="cri-o://ef022290bf784523fd393d0fe2e979a5835512c2f3f3f97ba2b3d22e61c5e135" gracePeriod=30
Apr 22 18:00:31.189992 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.189956 2578 generic.go:358] "Generic (PLEG): container finished" podID="141d725f-6ca6-46e4-8588-97bfef153936" containerID="c243d840176aac56085f809ee79179fb96c72e1bad6ba69acb95af8f33a8215a" exitCode=0
Apr 22 18:00:31.190172 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.190017 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b" event={"ID":"141d725f-6ca6-46e4-8588-97bfef153936","Type":"ContainerDied","Data":"c243d840176aac56085f809ee79179fb96c72e1bad6ba69acb95af8f33a8215a"}
Apr 22 18:00:31.539590 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.539567 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 18:00:31.680876 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.680842 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-kserve-provision-location\") pod \"141d725f-6ca6-46e4-8588-97bfef153936\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") "
Apr 22 18:00:31.680876 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.680886 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-tokenizer-uds\") pod \"141d725f-6ca6-46e4-8588-97bfef153936\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") "
Apr 22 18:00:31.681115 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.680909 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q56m\" (UniqueName: \"kubernetes.io/projected/141d725f-6ca6-46e4-8588-97bfef153936-kube-api-access-2q56m\") pod \"141d725f-6ca6-46e4-8588-97bfef153936\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") "
Apr 22 18:00:31.681115 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.680942 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/141d725f-6ca6-46e4-8588-97bfef153936-tls-certs\") pod \"141d725f-6ca6-46e4-8588-97bfef153936\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") "
Apr 22 18:00:31.681115 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.680991 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-tokenizer-tmp\") pod \"141d725f-6ca6-46e4-8588-97bfef153936\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") "
Apr 22 18:00:31.681115 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.681014 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-tokenizer-cache\") pod \"141d725f-6ca6-46e4-8588-97bfef153936\" (UID: \"141d725f-6ca6-46e4-8588-97bfef153936\") "
Apr 22 18:00:31.681317 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.681186 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "141d725f-6ca6-46e4-8588-97bfef153936" (UID: "141d725f-6ca6-46e4-8588-97bfef153936"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:00:31.681372 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.681344 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "141d725f-6ca6-46e4-8588-97bfef153936" (UID: "141d725f-6ca6-46e4-8588-97bfef153936"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:00:31.681372 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.681354 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "141d725f-6ca6-46e4-8588-97bfef153936" (UID: "141d725f-6ca6-46e4-8588-97bfef153936"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:00:31.681744 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.681717 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "141d725f-6ca6-46e4-8588-97bfef153936" (UID: "141d725f-6ca6-46e4-8588-97bfef153936"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:00:31.683103 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.683086 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/141d725f-6ca6-46e4-8588-97bfef153936-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "141d725f-6ca6-46e4-8588-97bfef153936" (UID: "141d725f-6ca6-46e4-8588-97bfef153936"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:00:31.683167 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.683154 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/141d725f-6ca6-46e4-8588-97bfef153936-kube-api-access-2q56m" (OuterVolumeSpecName: "kube-api-access-2q56m") pod "141d725f-6ca6-46e4-8588-97bfef153936" (UID: "141d725f-6ca6-46e4-8588-97bfef153936"). InnerVolumeSpecName "kube-api-access-2q56m". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:00:31.782097 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.782063 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-tokenizer-uds\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:00:31.782097 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.782097 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2q56m\" (UniqueName: \"kubernetes.io/projected/141d725f-6ca6-46e4-8588-97bfef153936-kube-api-access-2q56m\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:00:31.782097 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.782108 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/141d725f-6ca6-46e4-8588-97bfef153936-tls-certs\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:00:31.782279 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.782118 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-tokenizer-tmp\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:00:31.782279 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.782127 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-tokenizer-cache\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:00:31.782279 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:31.782135 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/141d725f-6ca6-46e4-8588-97bfef153936-kserve-provision-location\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:00:32.194878 ip-10-0-143-54 kubenswrapper[2578]:
I0422 18:00:32.194790 2578 generic.go:358] "Generic (PLEG): container finished" podID="141d725f-6ca6-46e4-8588-97bfef153936" containerID="ef022290bf784523fd393d0fe2e979a5835512c2f3f3f97ba2b3d22e61c5e135" exitCode=0
Apr 22 18:00:32.194878 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:32.194848 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b" event={"ID":"141d725f-6ca6-46e4-8588-97bfef153936","Type":"ContainerDied","Data":"ef022290bf784523fd393d0fe2e979a5835512c2f3f3f97ba2b3d22e61c5e135"}
Apr 22 18:00:32.195056 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:32.194881 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b" event={"ID":"141d725f-6ca6-46e4-8588-97bfef153936","Type":"ContainerDied","Data":"6925e4652ccb6a5d5a0e4fc3b286192e0507c3beff0a6934d24fd8328f1452e5"}
Apr 22 18:00:32.195056 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:32.194901 2578 scope.go:117] "RemoveContainer" containerID="ef022290bf784523fd393d0fe2e979a5835512c2f3f3f97ba2b3d22e61c5e135"
Apr 22 18:00:32.195056 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:32.194902 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"
Apr 22 18:00:32.202577 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:32.202562 2578 scope.go:117] "RemoveContainer" containerID="c243d840176aac56085f809ee79179fb96c72e1bad6ba69acb95af8f33a8215a"
Apr 22 18:00:32.209460 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:32.209444 2578 scope.go:117] "RemoveContainer" containerID="efc73f24e0ae8635cb531051e0fd795cf8334a757ceecb3686e3cf1f7698a977"
Apr 22 18:00:32.213464 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:32.213440 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"]
Apr 22 18:00:32.217094 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:32.217077 2578 scope.go:117] "RemoveContainer" containerID="ef022290bf784523fd393d0fe2e979a5835512c2f3f3f97ba2b3d22e61c5e135"
Apr 22 18:00:32.217326 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:32.217308 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7f45485bd7-ds87b"]
Apr 22 18:00:32.217389 ip-10-0-143-54 kubenswrapper[2578]: E0422 18:00:32.217362 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef022290bf784523fd393d0fe2e979a5835512c2f3f3f97ba2b3d22e61c5e135\": container with ID starting with ef022290bf784523fd393d0fe2e979a5835512c2f3f3f97ba2b3d22e61c5e135 not found: ID does not exist" containerID="ef022290bf784523fd393d0fe2e979a5835512c2f3f3f97ba2b3d22e61c5e135"
Apr 22 18:00:32.217538 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:32.217402 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef022290bf784523fd393d0fe2e979a5835512c2f3f3f97ba2b3d22e61c5e135"} err="failed to get container status \"ef022290bf784523fd393d0fe2e979a5835512c2f3f3f97ba2b3d22e61c5e135\": rpc error: code = NotFound desc = could not find container \"ef022290bf784523fd393d0fe2e979a5835512c2f3f3f97ba2b3d22e61c5e135\": container with ID starting with ef022290bf784523fd393d0fe2e979a5835512c2f3f3f97ba2b3d22e61c5e135 not found: ID does not exist"
Apr 22 18:00:32.217538 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:32.217447 2578 scope.go:117] "RemoveContainer" containerID="c243d840176aac56085f809ee79179fb96c72e1bad6ba69acb95af8f33a8215a"
Apr 22 18:00:32.217734 ip-10-0-143-54 kubenswrapper[2578]: E0422 18:00:32.217715 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c243d840176aac56085f809ee79179fb96c72e1bad6ba69acb95af8f33a8215a\": container with ID starting with c243d840176aac56085f809ee79179fb96c72e1bad6ba69acb95af8f33a8215a not found: ID does not exist" containerID="c243d840176aac56085f809ee79179fb96c72e1bad6ba69acb95af8f33a8215a"
Apr 22 18:00:32.217788 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:32.217748 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c243d840176aac56085f809ee79179fb96c72e1bad6ba69acb95af8f33a8215a"} err="failed to get container status \"c243d840176aac56085f809ee79179fb96c72e1bad6ba69acb95af8f33a8215a\": rpc error: code = NotFound desc = could not find container \"c243d840176aac56085f809ee79179fb96c72e1bad6ba69acb95af8f33a8215a\": container with ID starting with c243d840176aac56085f809ee79179fb96c72e1bad6ba69acb95af8f33a8215a not found: ID does not exist"
Apr 22 18:00:32.217788 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:32.217765 2578 scope.go:117] "RemoveContainer" containerID="efc73f24e0ae8635cb531051e0fd795cf8334a757ceecb3686e3cf1f7698a977"
Apr 22 18:00:32.218014 ip-10-0-143-54 kubenswrapper[2578]: E0422 18:00:32.217995 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efc73f24e0ae8635cb531051e0fd795cf8334a757ceecb3686e3cf1f7698a977\": container with ID starting with efc73f24e0ae8635cb531051e0fd795cf8334a757ceecb3686e3cf1f7698a977 not found: ID does not exist" containerID="efc73f24e0ae8635cb531051e0fd795cf8334a757ceecb3686e3cf1f7698a977"
Apr 22 18:00:32.218079 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:32.218026 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efc73f24e0ae8635cb531051e0fd795cf8334a757ceecb3686e3cf1f7698a977"} err="failed to get container status \"efc73f24e0ae8635cb531051e0fd795cf8334a757ceecb3686e3cf1f7698a977\": rpc error: code = NotFound desc = could not find container \"efc73f24e0ae8635cb531051e0fd795cf8334a757ceecb3686e3cf1f7698a977\": container with ID starting with efc73f24e0ae8635cb531051e0fd795cf8334a757ceecb3686e3cf1f7698a977 not found: ID does not exist"
Apr 22 18:00:33.956225 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:33.956191 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="141d725f-6ca6-46e4-8588-97bfef153936" path="/var/lib/kubelet/pods/141d725f-6ca6-46e4-8588-97bfef153936/volumes"
Apr 22 18:00:46.063139 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.063103 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"]
Apr 22 18:00:46.063795 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.063505 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="141d725f-6ca6-46e4-8588-97bfef153936" containerName="storage-initializer"
Apr 22 18:00:46.063795 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.063519 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="141d725f-6ca6-46e4-8588-97bfef153936" containerName="storage-initializer"
Apr 22 18:00:46.063795 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.063538 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container"
podUID="141d725f-6ca6-46e4-8588-97bfef153936" containerName="main"
Apr 22 18:00:46.063795 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.063543 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="141d725f-6ca6-46e4-8588-97bfef153936" containerName="main"
Apr 22 18:00:46.063795 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.063552 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="141d725f-6ca6-46e4-8588-97bfef153936" containerName="tokenizer"
Apr 22 18:00:46.063795 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.063558 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="141d725f-6ca6-46e4-8588-97bfef153936" containerName="tokenizer"
Apr 22 18:00:46.063795 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.063613 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="141d725f-6ca6-46e4-8588-97bfef153936" containerName="main"
Apr 22 18:00:46.063795 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.063621 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="141d725f-6ca6-46e4-8588-97bfef153936" containerName="tokenizer"
Apr 22 18:00:46.067688 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.067670 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:46.071433 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.071400 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-d9ndm\""
Apr 22 18:00:46.071554 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.071401 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 22 18:00:46.071554 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.071402 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ts22s\""
Apr 22 18:00:46.071554 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.071403 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\""
Apr 22 18:00:46.071554 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.071506 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 18:00:46.078868 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.078847 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"]
Apr 22 18:00:46.209890 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.209853 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:46.209890 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.209892 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crcc7\" (UniqueName: \"kubernetes.io/projected/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-kube-api-access-crcc7\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:46.210100 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.209955 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:46.210100 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.210070 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:46.210168 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.210101 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:46.210168 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.210125 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:46.311320 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.311287 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:46.311320 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.311325 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:46.311584 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.311348 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") "
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:46.311584 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.311476 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:46.311584 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.311516 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crcc7\" (UniqueName: \"kubernetes.io/projected/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-kube-api-access-crcc7\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:46.311584 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.311579 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:46.311794 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.311743 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:46.311794 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.311763 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:46.311903 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.311817 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:46.311961 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.311944 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:46.313816 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.313759 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:46.320545 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.320522 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crcc7\" (UniqueName: \"kubernetes.io/projected/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-kube-api-access-crcc7\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:46.377631 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.377601 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:46.506479 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.506454 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"]
Apr 22 18:00:46.509002 ip-10-0-143-54 kubenswrapper[2578]: W0422 18:00:46.508962 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb29ad7f2_ea7e_49d9_9671_d8ffef3e7056.slice/crio-987ffe02224a1bd9d60f69da87fcf1467b5b6d4d18b2e1a9d0a36c86ed42213a WatchSource:0}: Error finding container 987ffe02224a1bd9d60f69da87fcf1467b5b6d4d18b2e1a9d0a36c86ed42213a: Status 404 returned error can't find the container with id 987ffe02224a1bd9d60f69da87fcf1467b5b6d4d18b2e1a9d0a36c86ed42213a
Apr 22 18:00:46.510978 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:46.510959 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:00:47.249572 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:47.249526 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w" event={"ID":"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056","Type":"ContainerStarted","Data":"a7547bde1be541b940010e340e9543f1432b41514ae3ef293d1084b6dad075c5"}
Apr 22 18:00:47.249572 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:47.249578 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w" event={"ID":"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056","Type":"ContainerStarted","Data":"987ffe02224a1bd9d60f69da87fcf1467b5b6d4d18b2e1a9d0a36c86ed42213a"}
Apr 22 18:00:48.254204 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:48.254171 2578 generic.go:358] "Generic (PLEG): container finished" podID="b29ad7f2-ea7e-49d9-9671-d8ffef3e7056" containerID="a7547bde1be541b940010e340e9543f1432b41514ae3ef293d1084b6dad075c5" exitCode=0
Apr 22 18:00:48.254623 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:48.254270 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w" event={"ID":"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056","Type":"ContainerDied","Data":"a7547bde1be541b940010e340e9543f1432b41514ae3ef293d1084b6dad075c5"}
Apr 22 18:00:49.259681 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:49.259648 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w" event={"ID":"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056","Type":"ContainerStarted","Data":"a45b4ebc004169bab13f677463b81576bba8adba324f9a33a185687c02ecefc0"}
Apr 22 18:00:49.259681 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:49.259685 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w" event={"ID":"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056","Type":"ContainerStarted","Data":"8baa8879168aeb2d2ce60b59d2d2942b6ab1f65d2b1af6f7e12fa53350f696ba"}
Apr 22 18:00:49.260084 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:49.259819 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:49.284431 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:49.284383 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w" podStartSLOduration=3.284369573 podStartE2EDuration="3.284369573s" podCreationTimestamp="2026-04-22 18:00:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:00:49.281696907 +0000 UTC m=+1597.897789182" watchObservedRunningTime="2026-04-22 18:00:49.284369573 +0000 UTC m=+1597.900461846"
Apr 22 18:00:56.377759 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:56.377674 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:56.378137 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:56.377829 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:56.380569 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:56.380552 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"
Apr 22 18:00:57.288837 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:00:57.288804 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready"
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w" Apr 22 18:01:19.296643 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:01:19.296608 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w" Apr 22 18:03:59.824033 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:03:59.823996 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"] Apr 22 18:03:59.824505 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:03:59.824318 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w" podUID="b29ad7f2-ea7e-49d9-9671-d8ffef3e7056" containerName="main" containerID="cri-o://8baa8879168aeb2d2ce60b59d2d2942b6ab1f65d2b1af6f7e12fa53350f696ba" gracePeriod=30 Apr 22 18:03:59.824505 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:03:59.824365 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w" podUID="b29ad7f2-ea7e-49d9-9671-d8ffef3e7056" containerName="tokenizer" containerID="cri-o://a45b4ebc004169bab13f677463b81576bba8adba324f9a33a185687c02ecefc0" gracePeriod=30 Apr 22 18:04:00.898990 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:00.898954 2578 generic.go:358] "Generic (PLEG): container finished" podID="b29ad7f2-ea7e-49d9-9671-d8ffef3e7056" containerID="8baa8879168aeb2d2ce60b59d2d2942b6ab1f65d2b1af6f7e12fa53350f696ba" exitCode=0 Apr 22 18:04:00.899350 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:00.899029 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w" 
event={"ID":"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056","Type":"ContainerDied","Data":"8baa8879168aeb2d2ce60b59d2d2942b6ab1f65d2b1af6f7e12fa53350f696ba"} Apr 22 18:04:01.073227 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.073202 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w" Apr 22 18:04:01.172323 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.172262 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tls-certs\") pod \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " Apr 22 18:04:01.172323 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.172292 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tokenizer-cache\") pod \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " Apr 22 18:04:01.172323 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.172308 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tokenizer-uds\") pod \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " Apr 22 18:04:01.172600 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.172376 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-kserve-provision-location\") pod \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " Apr 22 18:04:01.172600 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.172397 2578 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crcc7\" (UniqueName: \"kubernetes.io/projected/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-kube-api-access-crcc7\") pod \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " Apr 22 18:04:01.172600 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.172438 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tokenizer-tmp\") pod \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\" (UID: \"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056\") " Apr 22 18:04:01.172600 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.172589 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "b29ad7f2-ea7e-49d9-9671-d8ffef3e7056" (UID: "b29ad7f2-ea7e-49d9-9671-d8ffef3e7056"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:04:01.172808 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.172709 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tokenizer-cache\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 18:04:01.172808 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.172717 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "b29ad7f2-ea7e-49d9-9671-d8ffef3e7056" (UID: "b29ad7f2-ea7e-49d9-9671-d8ffef3e7056"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:04:01.172876 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.172851 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "b29ad7f2-ea7e-49d9-9671-d8ffef3e7056" (UID: "b29ad7f2-ea7e-49d9-9671-d8ffef3e7056"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:04:01.173156 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.173137 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b29ad7f2-ea7e-49d9-9671-d8ffef3e7056" (UID: "b29ad7f2-ea7e-49d9-9671-d8ffef3e7056"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:04:01.174364 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.174345 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b29ad7f2-ea7e-49d9-9671-d8ffef3e7056" (UID: "b29ad7f2-ea7e-49d9-9671-d8ffef3e7056"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:04:01.174417 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.174381 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-kube-api-access-crcc7" (OuterVolumeSpecName: "kube-api-access-crcc7") pod "b29ad7f2-ea7e-49d9-9671-d8ffef3e7056" (UID: "b29ad7f2-ea7e-49d9-9671-d8ffef3e7056"). InnerVolumeSpecName "kube-api-access-crcc7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:04:01.273494 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.273466 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-kserve-provision-location\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 18:04:01.273494 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.273491 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-crcc7\" (UniqueName: \"kubernetes.io/projected/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-kube-api-access-crcc7\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 18:04:01.273640 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.273506 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tokenizer-tmp\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 18:04:01.273640 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.273520 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tls-certs\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 18:04:01.273640 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.273532 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056-tokenizer-uds\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\"" Apr 22 18:04:01.903597 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.903563 2578 generic.go:358] "Generic (PLEG): container finished" podID="b29ad7f2-ea7e-49d9-9671-d8ffef3e7056" containerID="a45b4ebc004169bab13f677463b81576bba8adba324f9a33a185687c02ecefc0" exitCode=0 Apr 22 18:04:01.904027 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.903631 2578 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w" Apr 22 18:04:01.904027 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.903645 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w" event={"ID":"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056","Type":"ContainerDied","Data":"a45b4ebc004169bab13f677463b81576bba8adba324f9a33a185687c02ecefc0"} Apr 22 18:04:01.904027 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.903682 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w" event={"ID":"b29ad7f2-ea7e-49d9-9671-d8ffef3e7056","Type":"ContainerDied","Data":"987ffe02224a1bd9d60f69da87fcf1467b5b6d4d18b2e1a9d0a36c86ed42213a"} Apr 22 18:04:01.904027 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.903697 2578 scope.go:117] "RemoveContainer" containerID="a45b4ebc004169bab13f677463b81576bba8adba324f9a33a185687c02ecefc0" Apr 22 18:04:01.913888 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.913538 2578 scope.go:117] "RemoveContainer" containerID="8baa8879168aeb2d2ce60b59d2d2942b6ab1f65d2b1af6f7e12fa53350f696ba" Apr 22 18:04:01.921173 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.921158 2578 scope.go:117] "RemoveContainer" containerID="a7547bde1be541b940010e340e9543f1432b41514ae3ef293d1084b6dad075c5" Apr 22 18:04:01.927792 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.927753 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"] Apr 22 18:04:01.931555 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.931533 2578 scope.go:117] "RemoveContainer" containerID="a45b4ebc004169bab13f677463b81576bba8adba324f9a33a185687c02ecefc0" Apr 22 18:04:01.931850 ip-10-0-143-54 kubenswrapper[2578]: E0422 18:04:01.931827 
2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a45b4ebc004169bab13f677463b81576bba8adba324f9a33a185687c02ecefc0\": container with ID starting with a45b4ebc004169bab13f677463b81576bba8adba324f9a33a185687c02ecefc0 not found: ID does not exist" containerID="a45b4ebc004169bab13f677463b81576bba8adba324f9a33a185687c02ecefc0" Apr 22 18:04:01.931966 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.931863 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45b4ebc004169bab13f677463b81576bba8adba324f9a33a185687c02ecefc0"} err="failed to get container status \"a45b4ebc004169bab13f677463b81576bba8adba324f9a33a185687c02ecefc0\": rpc error: code = NotFound desc = could not find container \"a45b4ebc004169bab13f677463b81576bba8adba324f9a33a185687c02ecefc0\": container with ID starting with a45b4ebc004169bab13f677463b81576bba8adba324f9a33a185687c02ecefc0 not found: ID does not exist" Apr 22 18:04:01.931966 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.931885 2578 scope.go:117] "RemoveContainer" containerID="8baa8879168aeb2d2ce60b59d2d2942b6ab1f65d2b1af6f7e12fa53350f696ba" Apr 22 18:04:01.932128 ip-10-0-143-54 kubenswrapper[2578]: E0422 18:04:01.932112 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8baa8879168aeb2d2ce60b59d2d2942b6ab1f65d2b1af6f7e12fa53350f696ba\": container with ID starting with 8baa8879168aeb2d2ce60b59d2d2942b6ab1f65d2b1af6f7e12fa53350f696ba not found: ID does not exist" containerID="8baa8879168aeb2d2ce60b59d2d2942b6ab1f65d2b1af6f7e12fa53350f696ba" Apr 22 18:04:01.932185 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.932131 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8baa8879168aeb2d2ce60b59d2d2942b6ab1f65d2b1af6f7e12fa53350f696ba"} err="failed to get container status 
\"8baa8879168aeb2d2ce60b59d2d2942b6ab1f65d2b1af6f7e12fa53350f696ba\": rpc error: code = NotFound desc = could not find container \"8baa8879168aeb2d2ce60b59d2d2942b6ab1f65d2b1af6f7e12fa53350f696ba\": container with ID starting with 8baa8879168aeb2d2ce60b59d2d2942b6ab1f65d2b1af6f7e12fa53350f696ba not found: ID does not exist" Apr 22 18:04:01.932185 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.932144 2578 scope.go:117] "RemoveContainer" containerID="a7547bde1be541b940010e340e9543f1432b41514ae3ef293d1084b6dad075c5" Apr 22 18:04:01.932349 ip-10-0-143-54 kubenswrapper[2578]: E0422 18:04:01.932332 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7547bde1be541b940010e340e9543f1432b41514ae3ef293d1084b6dad075c5\": container with ID starting with a7547bde1be541b940010e340e9543f1432b41514ae3ef293d1084b6dad075c5 not found: ID does not exist" containerID="a7547bde1be541b940010e340e9543f1432b41514ae3ef293d1084b6dad075c5" Apr 22 18:04:01.932406 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.932353 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7547bde1be541b940010e340e9543f1432b41514ae3ef293d1084b6dad075c5"} err="failed to get container status \"a7547bde1be541b940010e340e9543f1432b41514ae3ef293d1084b6dad075c5\": rpc error: code = NotFound desc = could not find container \"a7547bde1be541b940010e340e9543f1432b41514ae3ef293d1084b6dad075c5\": container with ID starting with a7547bde1be541b940010e340e9543f1432b41514ae3ef293d1084b6dad075c5 not found: ID does not exist" Apr 22 18:04:01.932783 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.932763 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche4lz6w"] Apr 22 18:04:01.954906 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:01.954885 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b29ad7f2-ea7e-49d9-9671-d8ffef3e7056" path="/var/lib/kubelet/pods/b29ad7f2-ea7e-49d9-9671-d8ffef3e7056/volumes" Apr 22 18:04:08.185086 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.185056 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"] Apr 22 18:04:08.185670 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.185510 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b29ad7f2-ea7e-49d9-9671-d8ffef3e7056" containerName="tokenizer" Apr 22 18:04:08.185670 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.185528 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29ad7f2-ea7e-49d9-9671-d8ffef3e7056" containerName="tokenizer" Apr 22 18:04:08.185670 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.185546 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b29ad7f2-ea7e-49d9-9671-d8ffef3e7056" containerName="main" Apr 22 18:04:08.185670 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.185552 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29ad7f2-ea7e-49d9-9671-d8ffef3e7056" containerName="main" Apr 22 18:04:08.185670 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.185560 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b29ad7f2-ea7e-49d9-9671-d8ffef3e7056" containerName="storage-initializer" Apr 22 18:04:08.185670 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.185565 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29ad7f2-ea7e-49d9-9671-d8ffef3e7056" containerName="storage-initializer" Apr 22 18:04:08.185670 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.185629 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b29ad7f2-ea7e-49d9-9671-d8ffef3e7056" containerName="main" Apr 22 18:04:08.185670 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.185637 2578 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="b29ad7f2-ea7e-49d9-9671-d8ffef3e7056" containerName="tokenizer" Apr 22 18:04:08.188769 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.188743 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" Apr 22 18:04:08.192371 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.192347 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 22 18:04:08.192569 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.192548 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:04:08.192569 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.192560 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ts22s\"" Apr 22 18:04:08.192724 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.192554 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:04:08.192724 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.192691 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-d9fwg\"" Apr 22 18:04:08.199275 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.199252 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"] Apr 22 18:04:08.329261 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.329233 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-tokenizer-uds\") pod 
\"custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" Apr 22 18:04:08.329261 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.329263 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" Apr 22 18:04:08.329464 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.329320 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/95b53f51-1965-4b1a-b73a-974c40f5ac96-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" Apr 22 18:04:08.329464 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.329351 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" Apr 22 18:04:08.329464 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.329389 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t56z8\" (UniqueName: 
\"kubernetes.io/projected/95b53f51-1965-4b1a-b73a-974c40f5ac96-kube-api-access-t56z8\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" Apr 22 18:04:08.329575 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.329479 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" Apr 22 18:04:08.430903 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.430860 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" Apr 22 18:04:08.431040 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.430985 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" Apr 22 18:04:08.431040 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.431029 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" Apr 22 18:04:08.431172 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.431153 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/95b53f51-1965-4b1a-b73a-974c40f5ac96-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" Apr 22 18:04:08.431231 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.431197 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" Apr 22 18:04:08.431231 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.431213 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" Apr 22 18:04:08.431343 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.431243 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t56z8\" (UniqueName: 
\"kubernetes.io/projected/95b53f51-1965-4b1a-b73a-974c40f5ac96-kube-api-access-t56z8\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"
Apr 22 18:04:08.431343 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.431277 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"
Apr 22 18:04:08.431479 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.431456 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"
Apr 22 18:04:08.431527 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.431491 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"
Apr 22 18:04:08.433567 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.433549 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/95b53f51-1965-4b1a-b73a-974c40f5ac96-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"
Apr 22 18:04:08.439792 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.439741 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t56z8\" (UniqueName: \"kubernetes.io/projected/95b53f51-1965-4b1a-b73a-974c40f5ac96-kube-api-access-t56z8\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"
Apr 22 18:04:08.499766 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.499732 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"
Apr 22 18:04:08.622828 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.622802 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"]
Apr 22 18:04:08.624901 ip-10-0-143-54 kubenswrapper[2578]: W0422 18:04:08.624870 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95b53f51_1965_4b1a_b73a_974c40f5ac96.slice/crio-c85ebc9596fbea4a22151164f4befd4ba3d6a64bd8f2bbc5ac8952f948ce5e94 WatchSource:0}: Error finding container c85ebc9596fbea4a22151164f4befd4ba3d6a64bd8f2bbc5ac8952f948ce5e94: Status 404 returned error can't find the container with id c85ebc9596fbea4a22151164f4befd4ba3d6a64bd8f2bbc5ac8952f948ce5e94
Apr 22 18:04:08.929118 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.929078 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" event={"ID":"95b53f51-1965-4b1a-b73a-974c40f5ac96","Type":"ContainerStarted","Data":"c4b1436b23c3cb519e87e25ab8a5cbbf03d26ceaf2042cdd6392ecb51b721e97"}
Apr 22 18:04:08.929118 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:08.929120 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" event={"ID":"95b53f51-1965-4b1a-b73a-974c40f5ac96","Type":"ContainerStarted","Data":"c85ebc9596fbea4a22151164f4befd4ba3d6a64bd8f2bbc5ac8952f948ce5e94"}
Apr 22 18:04:09.934147 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:09.934107 2578 generic.go:358] "Generic (PLEG): container finished" podID="95b53f51-1965-4b1a-b73a-974c40f5ac96" containerID="c4b1436b23c3cb519e87e25ab8a5cbbf03d26ceaf2042cdd6392ecb51b721e97" exitCode=0
Apr 22 18:04:09.934581 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:09.934158 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" event={"ID":"95b53f51-1965-4b1a-b73a-974c40f5ac96","Type":"ContainerDied","Data":"c4b1436b23c3cb519e87e25ab8a5cbbf03d26ceaf2042cdd6392ecb51b721e97"}
Apr 22 18:04:10.942590 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:10.942550 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" event={"ID":"95b53f51-1965-4b1a-b73a-974c40f5ac96","Type":"ContainerStarted","Data":"e44b59bbdd73146b0f077e7ba1a8e2ec6b376f8dac52af931cd516d3650547fc"}
Apr 22 18:04:10.942590 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:10.942594 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" event={"ID":"95b53f51-1965-4b1a-b73a-974c40f5ac96","Type":"ContainerStarted","Data":"c19a6cad28ae32a2088d115439ce8f4d13daefc49e1d9f7b6c0f7f0227734702"}
Apr 22 18:04:10.943016 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:10.942702 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"
Apr 22 18:04:10.967520 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:10.967471 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" podStartSLOduration=2.967458749 podStartE2EDuration="2.967458749s" podCreationTimestamp="2026-04-22 18:04:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:04:10.965295778 +0000 UTC m=+1799.581388052" watchObservedRunningTime="2026-04-22 18:04:10.967458749 +0000 UTC m=+1799.583551021"
Apr 22 18:04:12.654403 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:12.654369 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovn-acl-logging/0.log"
Apr 22 18:04:12.656482 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:12.656463 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovn-acl-logging/0.log"
Apr 22 18:04:18.500916 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.500877 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"
Apr 22 18:04:18.500916 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.500921 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"
Apr 22 18:04:18.503570 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.503546 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"
Apr 22 18:04:18.567581 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.565790 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 22 18:04:18.573196 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.573172 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:04:18.575924 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.575900 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\""
Apr 22 18:04:18.576043 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.575956 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-t5nh2\""
Apr 22 18:04:18.582489 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.582470 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 22 18:04:18.723218 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.723187 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:04:18.723218 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.723217 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/10de64b5-00e7-4b3a-92c8-e8ae2c483188-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:04:18.723385 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.723249 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:04:18.723385 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.723328 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:04:18.723385 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.723367 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:04:18.723516 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.723388 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tljg2\" (UniqueName: \"kubernetes.io/projected/10de64b5-00e7-4b3a-92c8-e8ae2c483188-kube-api-access-tljg2\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:04:18.824396 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.824327 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:04:18.824396 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.824377 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:04:18.824546 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.824444 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:04:18.824546 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.824481 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tljg2\" (UniqueName: \"kubernetes.io/projected/10de64b5-00e7-4b3a-92c8-e8ae2c483188-kube-api-access-tljg2\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:04:18.824546 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.824541 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:04:18.824699 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.824568 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/10de64b5-00e7-4b3a-92c8-e8ae2c483188-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:04:18.824764 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.824745 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:04:18.824830 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.824812 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:04:18.824885 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.824850 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:04:18.826745 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.826728 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:04:18.826931 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.826916 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/10de64b5-00e7-4b3a-92c8-e8ae2c483188-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:04:18.833136 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.833112 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tljg2\" (UniqueName: \"kubernetes.io/projected/10de64b5-00e7-4b3a-92c8-e8ae2c483188-kube-api-access-tljg2\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:04:18.884225 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.884203 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:04:18.973783 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:18.973756 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"
Apr 22 18:04:19.005409 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:19.005386 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 22 18:04:19.008061 ip-10-0-143-54 kubenswrapper[2578]: W0422 18:04:19.008031 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10de64b5_00e7_4b3a_92c8_e8ae2c483188.slice/crio-8469c911fd3570b1acffcffb543f2910d500ba1991420792047585a638a2587a WatchSource:0}: Error finding container 8469c911fd3570b1acffcffb543f2910d500ba1991420792047585a638a2587a: Status 404 returned error can't find the container with id 8469c911fd3570b1acffcffb543f2910d500ba1991420792047585a638a2587a
Apr 22 18:04:19.976285 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:19.976245 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"10de64b5-00e7-4b3a-92c8-e8ae2c483188","Type":"ContainerStarted","Data":"756838d74e3e017f06882d81af8883805fc9773eb095f1631813e2dab558c5d4"}
Apr 22 18:04:19.976285 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:19.976287 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"10de64b5-00e7-4b3a-92c8-e8ae2c483188","Type":"ContainerStarted","Data":"8469c911fd3570b1acffcffb543f2910d500ba1991420792047585a638a2587a"}
Apr 22 18:04:23.991719 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:23.991686 2578 generic.go:358] "Generic (PLEG): container finished" podID="10de64b5-00e7-4b3a-92c8-e8ae2c483188" containerID="756838d74e3e017f06882d81af8883805fc9773eb095f1631813e2dab558c5d4" exitCode=0
Apr 22 18:04:23.992109 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:23.991750 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"10de64b5-00e7-4b3a-92c8-e8ae2c483188","Type":"ContainerDied","Data":"756838d74e3e017f06882d81af8883805fc9773eb095f1631813e2dab558c5d4"}
Apr 22 18:04:39.978522 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:04:39.978492 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"
Apr 22 18:05:13.174517 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:05:13.174479 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"10de64b5-00e7-4b3a-92c8-e8ae2c483188","Type":"ContainerStarted","Data":"6acec6aba89c59c0abf03f4288d3e52676ec758867accb94a89fda6ebfa497db"}
Apr 22 18:05:13.207221 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:05:13.207168 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=6.609168737 podStartE2EDuration="55.207152992s" podCreationTimestamp="2026-04-22 18:04:18 +0000 UTC" firstStartedPulling="2026-04-22 18:04:23.992844503 +0000 UTC m=+1812.608936754" lastFinishedPulling="2026-04-22 18:05:12.590828756 +0000 UTC m=+1861.206921009" observedRunningTime="2026-04-22 18:05:13.205595943 +0000 UTC m=+1861.821688218" watchObservedRunningTime="2026-04-22 18:05:13.207152992 +0000 UTC m=+1861.823245263"
Apr 22 18:07:21.404900 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:21.404812 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"]
Apr 22 18:07:21.405412 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:21.405081 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" podUID="95b53f51-1965-4b1a-b73a-974c40f5ac96" containerName="main" containerID="cri-o://c19a6cad28ae32a2088d115439ce8f4d13daefc49e1d9f7b6c0f7f0227734702" gracePeriod=30
Apr 22 18:07:21.405412 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:21.405137 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" podUID="95b53f51-1965-4b1a-b73a-974c40f5ac96" containerName="tokenizer" containerID="cri-o://e44b59bbdd73146b0f077e7ba1a8e2ec6b376f8dac52af931cd516d3650547fc" gracePeriod=30
Apr 22 18:07:21.616983 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:21.616949 2578 generic.go:358] "Generic (PLEG): container finished" podID="95b53f51-1965-4b1a-b73a-974c40f5ac96" containerID="c19a6cad28ae32a2088d115439ce8f4d13daefc49e1d9f7b6c0f7f0227734702" exitCode=0
Apr 22 18:07:21.617181 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:21.616996 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" event={"ID":"95b53f51-1965-4b1a-b73a-974c40f5ac96","Type":"ContainerDied","Data":"c19a6cad28ae32a2088d115439ce8f4d13daefc49e1d9f7b6c0f7f0227734702"}
Apr 22 18:07:22.555717 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.555696 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"
Apr 22 18:07:22.621321 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.621248 2578 generic.go:358] "Generic (PLEG): container finished" podID="95b53f51-1965-4b1a-b73a-974c40f5ac96" containerID="e44b59bbdd73146b0f077e7ba1a8e2ec6b376f8dac52af931cd516d3650547fc" exitCode=0
Apr 22 18:07:22.621452 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.621329 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" event={"ID":"95b53f51-1965-4b1a-b73a-974c40f5ac96","Type":"ContainerDied","Data":"e44b59bbdd73146b0f077e7ba1a8e2ec6b376f8dac52af931cd516d3650547fc"}
Apr 22 18:07:22.621452 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.621337 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"
Apr 22 18:07:22.621452 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.621370 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld" event={"ID":"95b53f51-1965-4b1a-b73a-974c40f5ac96","Type":"ContainerDied","Data":"c85ebc9596fbea4a22151164f4befd4ba3d6a64bd8f2bbc5ac8952f948ce5e94"}
Apr 22 18:07:22.621452 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.621385 2578 scope.go:117] "RemoveContainer" containerID="e44b59bbdd73146b0f077e7ba1a8e2ec6b376f8dac52af931cd516d3650547fc"
Apr 22 18:07:22.629274 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.629259 2578 scope.go:117] "RemoveContainer" containerID="c19a6cad28ae32a2088d115439ce8f4d13daefc49e1d9f7b6c0f7f0227734702"
Apr 22 18:07:22.636132 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.636116 2578 scope.go:117] "RemoveContainer" containerID="c4b1436b23c3cb519e87e25ab8a5cbbf03d26ceaf2042cdd6392ecb51b721e97"
Apr 22 18:07:22.642857 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.642841 2578 scope.go:117] "RemoveContainer" containerID="e44b59bbdd73146b0f077e7ba1a8e2ec6b376f8dac52af931cd516d3650547fc"
Apr 22 18:07:22.643127 ip-10-0-143-54 kubenswrapper[2578]: E0422 18:07:22.643105 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e44b59bbdd73146b0f077e7ba1a8e2ec6b376f8dac52af931cd516d3650547fc\": container with ID starting with e44b59bbdd73146b0f077e7ba1a8e2ec6b376f8dac52af931cd516d3650547fc not found: ID does not exist" containerID="e44b59bbdd73146b0f077e7ba1a8e2ec6b376f8dac52af931cd516d3650547fc"
Apr 22 18:07:22.643197 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.643137 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e44b59bbdd73146b0f077e7ba1a8e2ec6b376f8dac52af931cd516d3650547fc"} err="failed to get container status \"e44b59bbdd73146b0f077e7ba1a8e2ec6b376f8dac52af931cd516d3650547fc\": rpc error: code = NotFound desc = could not find container \"e44b59bbdd73146b0f077e7ba1a8e2ec6b376f8dac52af931cd516d3650547fc\": container with ID starting with e44b59bbdd73146b0f077e7ba1a8e2ec6b376f8dac52af931cd516d3650547fc not found: ID does not exist"
Apr 22 18:07:22.643197 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.643160 2578 scope.go:117] "RemoveContainer" containerID="c19a6cad28ae32a2088d115439ce8f4d13daefc49e1d9f7b6c0f7f0227734702"
Apr 22 18:07:22.643392 ip-10-0-143-54 kubenswrapper[2578]: E0422 18:07:22.643375 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c19a6cad28ae32a2088d115439ce8f4d13daefc49e1d9f7b6c0f7f0227734702\": container with ID starting with c19a6cad28ae32a2088d115439ce8f4d13daefc49e1d9f7b6c0f7f0227734702 not found: ID does not exist" containerID="c19a6cad28ae32a2088d115439ce8f4d13daefc49e1d9f7b6c0f7f0227734702"
Apr 22 18:07:22.643509 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.643399 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c19a6cad28ae32a2088d115439ce8f4d13daefc49e1d9f7b6c0f7f0227734702"} err="failed to get container status \"c19a6cad28ae32a2088d115439ce8f4d13daefc49e1d9f7b6c0f7f0227734702\": rpc error: code = NotFound desc = could not find container \"c19a6cad28ae32a2088d115439ce8f4d13daefc49e1d9f7b6c0f7f0227734702\": container with ID starting with c19a6cad28ae32a2088d115439ce8f4d13daefc49e1d9f7b6c0f7f0227734702 not found: ID does not exist"
Apr 22 18:07:22.643509 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.643414 2578 scope.go:117] "RemoveContainer" containerID="c4b1436b23c3cb519e87e25ab8a5cbbf03d26ceaf2042cdd6392ecb51b721e97"
Apr 22 18:07:22.643614 ip-10-0-143-54 kubenswrapper[2578]: E0422 18:07:22.643602 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4b1436b23c3cb519e87e25ab8a5cbbf03d26ceaf2042cdd6392ecb51b721e97\": container with ID starting with c4b1436b23c3cb519e87e25ab8a5cbbf03d26ceaf2042cdd6392ecb51b721e97 not found: ID does not exist" containerID="c4b1436b23c3cb519e87e25ab8a5cbbf03d26ceaf2042cdd6392ecb51b721e97"
Apr 22 18:07:22.643651 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.643617 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4b1436b23c3cb519e87e25ab8a5cbbf03d26ceaf2042cdd6392ecb51b721e97"} err="failed to get container status \"c4b1436b23c3cb519e87e25ab8a5cbbf03d26ceaf2042cdd6392ecb51b721e97\": rpc error: code = NotFound desc = could not find container \"c4b1436b23c3cb519e87e25ab8a5cbbf03d26ceaf2042cdd6392ecb51b721e97\": container with ID starting with c4b1436b23c3cb519e87e25ab8a5cbbf03d26ceaf2042cdd6392ecb51b721e97 not found: ID does not exist"
Apr 22 18:07:22.671913 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.671894 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t56z8\" (UniqueName: \"kubernetes.io/projected/95b53f51-1965-4b1a-b73a-974c40f5ac96-kube-api-access-t56z8\") pod \"95b53f51-1965-4b1a-b73a-974c40f5ac96\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") "
Apr 22 18:07:22.672005 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.671952 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/95b53f51-1965-4b1a-b73a-974c40f5ac96-tls-certs\") pod \"95b53f51-1965-4b1a-b73a-974c40f5ac96\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") "
Apr 22 18:07:22.672005 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.671975 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-tokenizer-tmp\") pod \"95b53f51-1965-4b1a-b73a-974c40f5ac96\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") "
Apr 22 18:07:22.672121 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.672020 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-tokenizer-uds\") pod \"95b53f51-1965-4b1a-b73a-974c40f5ac96\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") "
Apr 22 18:07:22.672121 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.672045 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-kserve-provision-location\") pod \"95b53f51-1965-4b1a-b73a-974c40f5ac96\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") "
Apr 22 18:07:22.672224 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.672120 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-tokenizer-cache\") pod \"95b53f51-1965-4b1a-b73a-974c40f5ac96\" (UID: \"95b53f51-1965-4b1a-b73a-974c40f5ac96\") "
Apr 22 18:07:22.672296 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.672267 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "95b53f51-1965-4b1a-b73a-974c40f5ac96" (UID: "95b53f51-1965-4b1a-b73a-974c40f5ac96"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:07:22.672417 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.672390 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "95b53f51-1965-4b1a-b73a-974c40f5ac96" (UID: "95b53f51-1965-4b1a-b73a-974c40f5ac96"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:07:22.672417 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.672404 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "95b53f51-1965-4b1a-b73a-974c40f5ac96" (UID: "95b53f51-1965-4b1a-b73a-974c40f5ac96"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:07:22.672600 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.672478 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-tokenizer-uds\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:07:22.672858 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.672831 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "95b53f51-1965-4b1a-b73a-974c40f5ac96" (UID: "95b53f51-1965-4b1a-b73a-974c40f5ac96"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:07:22.673970 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.673950 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95b53f51-1965-4b1a-b73a-974c40f5ac96-kube-api-access-t56z8" (OuterVolumeSpecName: "kube-api-access-t56z8") pod "95b53f51-1965-4b1a-b73a-974c40f5ac96" (UID: "95b53f51-1965-4b1a-b73a-974c40f5ac96"). InnerVolumeSpecName "kube-api-access-t56z8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:07:22.674057 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.673966 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95b53f51-1965-4b1a-b73a-974c40f5ac96-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "95b53f51-1965-4b1a-b73a-974c40f5ac96" (UID: "95b53f51-1965-4b1a-b73a-974c40f5ac96"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:07:22.772976 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.772949 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t56z8\" (UniqueName: \"kubernetes.io/projected/95b53f51-1965-4b1a-b73a-974c40f5ac96-kube-api-access-t56z8\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:07:22.773078 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.772977 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/95b53f51-1965-4b1a-b73a-974c40f5ac96-tls-certs\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:07:22.773078 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.772994 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-tokenizer-tmp\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:07:22.773078 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.773007 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-kserve-provision-location\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:07:22.773078 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.773019 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/95b53f51-1965-4b1a-b73a-974c40f5ac96-tokenizer-cache\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:07:22.944606 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.944578 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"]
Apr 22 18:07:22.949181 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:22.949157 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-bfffbs6fld"]
Apr 22 18:07:23.956361 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:23.956328 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95b53f51-1965-4b1a-b73a-974c40f5ac96" path="/var/lib/kubelet/pods/95b53f51-1965-4b1a-b73a-974c40f5ac96/volumes"
Apr 22 18:07:29.056504 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.056474 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs"]
Apr 22 18:07:29.056951 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.056864 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95b53f51-1965-4b1a-b73a-974c40f5ac96" containerName="main"
Apr 22 18:07:29.056951 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.056881 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b53f51-1965-4b1a-b73a-974c40f5ac96" containerName="main"
Apr 22 18:07:29.056951 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.056905 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95b53f51-1965-4b1a-b73a-974c40f5ac96" containerName="storage-initializer"
Apr 22 18:07:29.056951 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.056914 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b53f51-1965-4b1a-b73a-974c40f5ac96" containerName="storage-initializer"
Apr 22 18:07:29.056951 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.056932 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95b53f51-1965-4b1a-b73a-974c40f5ac96" containerName="tokenizer"
Apr 22 18:07:29.056951 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.056941 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b53f51-1965-4b1a-b73a-974c40f5ac96" containerName="tokenizer"
Apr 22 18:07:29.057136 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.057042 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="95b53f51-1965-4b1a-b73a-974c40f5ac96" containerName="tokenizer"
Apr 22 18:07:29.057136 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.057054 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="95b53f51-1965-4b1a-b73a-974c40f5ac96" containerName="main"
Apr 22 18:07:29.062191 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.062164 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs"
Apr 22 18:07:29.064554 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.064536 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\""
Apr 22 18:07:29.070306 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.070274 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs"]
Apr 22 18:07:29.122699 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.122670 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-model-cache\") pod \"scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs"
Apr 22 18:07:29.122848 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.122710 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs"
Apr 22
18:07:29.122848 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.122738 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-dshm\") pod \"scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" Apr 22 18:07:29.122848 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.122820 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-home\") pod \"scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" Apr 22 18:07:29.123124 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.122851 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-tls-certs\") pod \"scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" Apr 22 18:07:29.123124 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.122901 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fpjg\" (UniqueName: \"kubernetes.io/projected/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-kube-api-access-4fpjg\") pod \"scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" Apr 22 18:07:29.223578 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.223545 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" Apr 22 18:07:29.223733 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.223593 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-dshm\") pod \"scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" Apr 22 18:07:29.223733 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.223626 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-home\") pod \"scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" Apr 22 18:07:29.223733 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.223656 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-tls-certs\") pod \"scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" Apr 22 18:07:29.223733 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.223712 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fpjg\" (UniqueName: 
\"kubernetes.io/projected/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-kube-api-access-4fpjg\") pod \"scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" Apr 22 18:07:29.223958 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.223758 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-model-cache\") pod \"scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" Apr 22 18:07:29.224018 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.223993 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" Apr 22 18:07:29.224077 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.224025 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-home\") pod \"scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" Apr 22 18:07:29.224171 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.224150 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-model-cache\") pod \"scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs\" (UID: 
\"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" Apr 22 18:07:29.225842 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.225817 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-dshm\") pod \"scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" Apr 22 18:07:29.226144 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.226127 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-tls-certs\") pod \"scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" Apr 22 18:07:29.236571 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.236552 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fpjg\" (UniqueName: \"kubernetes.io/projected/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-kube-api-access-4fpjg\") pod \"scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" Apr 22 18:07:29.368478 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.368387 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj"] Apr 22 18:07:29.372307 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.372287 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:29.373299 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.373279 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" Apr 22 18:07:29.375088 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.375051 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-th6jv\"" Apr 22 18:07:29.383843 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.383810 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj"] Apr 22 18:07:29.426366 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.426341 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:29.426513 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.426449 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsnss\" (UniqueName: \"kubernetes.io/projected/de07dc87-da58-4d81-bc63-0bba35f00396-kube-api-access-nsnss\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:29.426513 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.426498 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:29.426631 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.426606 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:29.426685 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.426667 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:29.426798 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.426730 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/de07dc87-da58-4d81-bc63-0bba35f00396-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:29.498933 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.498905 2578 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs"] Apr 22 18:07:29.501477 ip-10-0-143-54 kubenswrapper[2578]: W0422 18:07:29.501452 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf63ab0b9_bcf5_49c8_9074_359b4ce7b69e.slice/crio-21fbc43330f41358b855358ff0fb7e882dc45f2cf59fd047658f6b2fe7f21427 WatchSource:0}: Error finding container 21fbc43330f41358b855358ff0fb7e882dc45f2cf59fd047658f6b2fe7f21427: Status 404 returned error can't find the container with id 21fbc43330f41358b855358ff0fb7e882dc45f2cf59fd047658f6b2fe7f21427 Apr 22 18:07:29.503162 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.503145 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:07:29.527373 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.527340 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:29.527533 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.527383 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:29.527533 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.527431 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/de07dc87-da58-4d81-bc63-0bba35f00396-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:29.527533 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.527462 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:29.527533 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.527495 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsnss\" (UniqueName: \"kubernetes.io/projected/de07dc87-da58-4d81-bc63-0bba35f00396-kube-api-access-nsnss\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:29.527533 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.527529 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:29.527867 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.527723 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:29.527867 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.527778 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:29.527867 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.527863 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:29.527987 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.527864 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:29.529728 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.529711 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/de07dc87-da58-4d81-bc63-0bba35f00396-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:29.535522 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.535499 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsnss\" (UniqueName: \"kubernetes.io/projected/de07dc87-da58-4d81-bc63-0bba35f00396-kube-api-access-nsnss\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:29.646094 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.646023 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" event={"ID":"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e","Type":"ContainerStarted","Data":"5f1d02bd518a588c0d4b00d6f8fb4228dbecd7467a215c1a5664cae29bd99418"} Apr 22 18:07:29.646094 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.646064 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" event={"ID":"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e","Type":"ContainerStarted","Data":"21fbc43330f41358b855358ff0fb7e882dc45f2cf59fd047658f6b2fe7f21427"} Apr 22 18:07:29.700890 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.700860 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:29.849245 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:29.849213 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj"] Apr 22 18:07:29.850733 ip-10-0-143-54 kubenswrapper[2578]: W0422 18:07:29.850700 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde07dc87_da58_4d81_bc63_0bba35f00396.slice/crio-83911620b724ccfc0eda9f2800aac27440e7eecd8dc913a71d12609d1d1dfa01 WatchSource:0}: Error finding container 83911620b724ccfc0eda9f2800aac27440e7eecd8dc913a71d12609d1d1dfa01: Status 404 returned error can't find the container with id 83911620b724ccfc0eda9f2800aac27440e7eecd8dc913a71d12609d1d1dfa01 Apr 22 18:07:30.653935 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:30.653894 2578 generic.go:358] "Generic (PLEG): container finished" podID="de07dc87-da58-4d81-bc63-0bba35f00396" containerID="3edf5b21c150638553585ca6474544549d9cbb30bcd5d28c680ce3f9800da79f" exitCode=0 Apr 22 18:07:30.654364 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:30.653976 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" event={"ID":"de07dc87-da58-4d81-bc63-0bba35f00396","Type":"ContainerDied","Data":"3edf5b21c150638553585ca6474544549d9cbb30bcd5d28c680ce3f9800da79f"} Apr 22 18:07:30.654364 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:30.654023 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" event={"ID":"de07dc87-da58-4d81-bc63-0bba35f00396","Type":"ContainerStarted","Data":"83911620b724ccfc0eda9f2800aac27440e7eecd8dc913a71d12609d1d1dfa01"} Apr 22 18:07:31.659968 ip-10-0-143-54 kubenswrapper[2578]: I0422 
18:07:31.659926 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" event={"ID":"de07dc87-da58-4d81-bc63-0bba35f00396","Type":"ContainerStarted","Data":"97a31492ec836ba76a0478431c4a7d0471e4578e0f01a4e3cc7f9a837a1b610f"} Apr 22 18:07:31.659968 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:31.659971 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" event={"ID":"de07dc87-da58-4d81-bc63-0bba35f00396","Type":"ContainerStarted","Data":"f417b58c71f8f60bd987e32ca518b5fc7d9a78dc37360570ec10b69e1a0f2571"} Apr 22 18:07:31.660474 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:31.660187 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" Apr 22 18:07:31.682207 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:31.682163 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" podStartSLOduration=2.682151393 podStartE2EDuration="2.682151393s" podCreationTimestamp="2026-04-22 18:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:07:31.679715416 +0000 UTC m=+2000.295807691" watchObservedRunningTime="2026-04-22 18:07:31.682151393 +0000 UTC m=+2000.298243664" Apr 22 18:07:32.272249 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:32.272215 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 18:07:32.272651 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:32.272620 2578 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="10de64b5-00e7-4b3a-92c8-e8ae2c483188" containerName="main" containerID="cri-o://6acec6aba89c59c0abf03f4288d3e52676ec758867accb94a89fda6ebfa497db" gracePeriod=30 Apr 22 18:07:33.143518 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.143496 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:07:33.165969 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.165136 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-kserve-provision-location\") pod \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " Apr 22 18:07:33.165969 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.165270 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-home\") pod \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " Apr 22 18:07:33.165969 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.165294 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-dshm\") pod \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " Apr 22 18:07:33.165969 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.165315 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-model-cache\") pod \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") " Apr 22 18:07:33.165969 ip-10-0-143-54 
kubenswrapper[2578]: I0422 18:07:33.165349 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tljg2\" (UniqueName: \"kubernetes.io/projected/10de64b5-00e7-4b3a-92c8-e8ae2c483188-kube-api-access-tljg2\") pod \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") "
Apr 22 18:07:33.165969 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.165444 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/10de64b5-00e7-4b3a-92c8-e8ae2c483188-tls-certs\") pod \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\" (UID: \"10de64b5-00e7-4b3a-92c8-e8ae2c483188\") "
Apr 22 18:07:33.166414 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.166089 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-home" (OuterVolumeSpecName: "home") pod "10de64b5-00e7-4b3a-92c8-e8ae2c483188" (UID: "10de64b5-00e7-4b3a-92c8-e8ae2c483188"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:07:33.166414 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.166189 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-model-cache" (OuterVolumeSpecName: "model-cache") pod "10de64b5-00e7-4b3a-92c8-e8ae2c483188" (UID: "10de64b5-00e7-4b3a-92c8-e8ae2c483188"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:07:33.169143 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.168815 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-dshm" (OuterVolumeSpecName: "dshm") pod "10de64b5-00e7-4b3a-92c8-e8ae2c483188" (UID: "10de64b5-00e7-4b3a-92c8-e8ae2c483188"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:07:33.169274 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.169188 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10de64b5-00e7-4b3a-92c8-e8ae2c483188-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "10de64b5-00e7-4b3a-92c8-e8ae2c483188" (UID: "10de64b5-00e7-4b3a-92c8-e8ae2c483188"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:07:33.169754 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.169717 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10de64b5-00e7-4b3a-92c8-e8ae2c483188-kube-api-access-tljg2" (OuterVolumeSpecName: "kube-api-access-tljg2") pod "10de64b5-00e7-4b3a-92c8-e8ae2c483188" (UID: "10de64b5-00e7-4b3a-92c8-e8ae2c483188"). InnerVolumeSpecName "kube-api-access-tljg2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:07:33.238825 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.238761 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "10de64b5-00e7-4b3a-92c8-e8ae2c483188" (UID: "10de64b5-00e7-4b3a-92c8-e8ae2c483188"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:07:33.266339 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.266307 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/10de64b5-00e7-4b3a-92c8-e8ae2c483188-tls-certs\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:07:33.266339 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.266341 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-kserve-provision-location\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:07:33.266558 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.266357 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-home\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:07:33.266558 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.266368 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-dshm\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:07:33.266558 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.266378 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/10de64b5-00e7-4b3a-92c8-e8ae2c483188-model-cache\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:07:33.266558 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.266388 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tljg2\" (UniqueName: \"kubernetes.io/projected/10de64b5-00e7-4b3a-92c8-e8ae2c483188-kube-api-access-tljg2\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:07:33.669735 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.669696 2578 generic.go:358] "Generic (PLEG): container finished" podID="10de64b5-00e7-4b3a-92c8-e8ae2c483188" containerID="6acec6aba89c59c0abf03f4288d3e52676ec758867accb94a89fda6ebfa497db" exitCode=0
Apr 22 18:07:33.670017 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.669786 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 18:07:33.670017 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.669795 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"10de64b5-00e7-4b3a-92c8-e8ae2c483188","Type":"ContainerDied","Data":"6acec6aba89c59c0abf03f4288d3e52676ec758867accb94a89fda6ebfa497db"}
Apr 22 18:07:33.670017 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.669838 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"10de64b5-00e7-4b3a-92c8-e8ae2c483188","Type":"ContainerDied","Data":"8469c911fd3570b1acffcffb543f2910d500ba1991420792047585a638a2587a"}
Apr 22 18:07:33.670017 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.669866 2578 scope.go:117] "RemoveContainer" containerID="6acec6aba89c59c0abf03f4288d3e52676ec758867accb94a89fda6ebfa497db"
Apr 22 18:07:33.691717 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.691693 2578 scope.go:117] "RemoveContainer" containerID="756838d74e3e017f06882d81af8883805fc9773eb095f1631813e2dab558c5d4"
Apr 22 18:07:33.696050 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.696025 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 22 18:07:33.701547 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.701522 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 22 18:07:33.703895 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.703871 2578 scope.go:117] "RemoveContainer" containerID="6acec6aba89c59c0abf03f4288d3e52676ec758867accb94a89fda6ebfa497db"
Apr 22 18:07:33.704189 ip-10-0-143-54 kubenswrapper[2578]: E0422 18:07:33.704172 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6acec6aba89c59c0abf03f4288d3e52676ec758867accb94a89fda6ebfa497db\": container with ID starting with 6acec6aba89c59c0abf03f4288d3e52676ec758867accb94a89fda6ebfa497db not found: ID does not exist" containerID="6acec6aba89c59c0abf03f4288d3e52676ec758867accb94a89fda6ebfa497db"
Apr 22 18:07:33.704255 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.704200 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6acec6aba89c59c0abf03f4288d3e52676ec758867accb94a89fda6ebfa497db"} err="failed to get container status \"6acec6aba89c59c0abf03f4288d3e52676ec758867accb94a89fda6ebfa497db\": rpc error: code = NotFound desc = could not find container \"6acec6aba89c59c0abf03f4288d3e52676ec758867accb94a89fda6ebfa497db\": container with ID starting with 6acec6aba89c59c0abf03f4288d3e52676ec758867accb94a89fda6ebfa497db not found: ID does not exist"
Apr 22 18:07:33.704255 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.704219 2578 scope.go:117] "RemoveContainer" containerID="756838d74e3e017f06882d81af8883805fc9773eb095f1631813e2dab558c5d4"
Apr 22 18:07:33.704528 ip-10-0-143-54 kubenswrapper[2578]: E0422 18:07:33.704506 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"756838d74e3e017f06882d81af8883805fc9773eb095f1631813e2dab558c5d4\": container with ID starting with 756838d74e3e017f06882d81af8883805fc9773eb095f1631813e2dab558c5d4 not found: ID does not exist" containerID="756838d74e3e017f06882d81af8883805fc9773eb095f1631813e2dab558c5d4"
Apr 22 18:07:33.704612 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.704549 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"756838d74e3e017f06882d81af8883805fc9773eb095f1631813e2dab558c5d4"} err="failed to get container status \"756838d74e3e017f06882d81af8883805fc9773eb095f1631813e2dab558c5d4\": rpc error: code = NotFound desc = could not find container \"756838d74e3e017f06882d81af8883805fc9773eb095f1631813e2dab558c5d4\": container with ID starting with 756838d74e3e017f06882d81af8883805fc9773eb095f1631813e2dab558c5d4 not found: ID does not exist"
Apr 22 18:07:33.956014 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:33.955983 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10de64b5-00e7-4b3a-92c8-e8ae2c483188" path="/var/lib/kubelet/pods/10de64b5-00e7-4b3a-92c8-e8ae2c483188/volumes"
Apr 22 18:07:34.675299 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:34.675266 2578 generic.go:358] "Generic (PLEG): container finished" podID="f63ab0b9-bcf5-49c8-9074-359b4ce7b69e" containerID="5f1d02bd518a588c0d4b00d6f8fb4228dbecd7467a215c1a5664cae29bd99418" exitCode=0
Apr 22 18:07:34.675682 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:34.675337 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" event={"ID":"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e","Type":"ContainerDied","Data":"5f1d02bd518a588c0d4b00d6f8fb4228dbecd7467a215c1a5664cae29bd99418"}
Apr 22 18:07:35.680795 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:35.680756 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" event={"ID":"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e","Type":"ContainerStarted","Data":"c37f752affeba3df37c23286c08d0a5f4bd871cd85ba6ca1638bccc7ac70d109"}
Apr 22 18:07:35.698677 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:35.698614 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" podStartSLOduration=6.698594203 podStartE2EDuration="6.698594203s" podCreationTimestamp="2026-04-22 18:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:07:35.697696503 +0000 UTC m=+2004.313788820" watchObservedRunningTime="2026-04-22 18:07:35.698594203 +0000 UTC m=+2004.314686475"
Apr 22 18:07:39.374228 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:39.374189 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs"
Apr 22 18:07:39.374719 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:39.374324 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs"
Apr 22 18:07:39.386564 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:39.386543 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs"
Apr 22 18:07:39.701086 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:39.701010 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj"
Apr 22 18:07:39.701086 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:39.701046 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj"
Apr 22 18:07:39.703781 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:39.703749 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj"
Apr 22 18:07:39.708548 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:39.708526 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs"
Apr 22 18:07:40.707200 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:07:40.707159 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj"
Apr 22 18:08:01.705559 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:01.705527 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj"
Apr 22 18:08:03.211471 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.211434 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj"]
Apr 22 18:08:03.211947 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.211817 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" podUID="de07dc87-da58-4d81-bc63-0bba35f00396" containerName="main" containerID="cri-o://f417b58c71f8f60bd987e32ca518b5fc7d9a78dc37360570ec10b69e1a0f2571" gracePeriod=30
Apr 22 18:08:03.212015 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.211916 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" podUID="de07dc87-da58-4d81-bc63-0bba35f00396" containerName="tokenizer" containerID="cri-o://97a31492ec836ba76a0478431c4a7d0471e4578e0f01a4e3cc7f9a837a1b610f" gracePeriod=30
Apr 22 18:08:03.213844 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.213818 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs"]
Apr 22 18:08:03.214169 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.214127 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" podUID="f63ab0b9-bcf5-49c8-9074-359b4ce7b69e" containerName="main" containerID="cri-o://c37f752affeba3df37c23286c08d0a5f4bd871cd85ba6ca1638bccc7ac70d109" gracePeriod=30
Apr 22 18:08:03.461339 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.461312 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs"
Apr 22 18:08:03.615632 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.615596 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fpjg\" (UniqueName: \"kubernetes.io/projected/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-kube-api-access-4fpjg\") pod \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") "
Apr 22 18:08:03.615632 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.615649 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-kserve-provision-location\") pod \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") "
Apr 22 18:08:03.615918 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.615670 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-model-cache\") pod \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") "
Apr 22 18:08:03.615918 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.615838 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-home\") pod \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") "
Apr 22 18:08:03.615918 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.615888 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-model-cache" (OuterVolumeSpecName: "model-cache") pod "f63ab0b9-bcf5-49c8-9074-359b4ce7b69e" (UID: "f63ab0b9-bcf5-49c8-9074-359b4ce7b69e"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:08:03.615918 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.615900 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-tls-certs\") pod \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") "
Apr 22 18:08:03.616193 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.615941 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-dshm\") pod \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\" (UID: \"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e\") "
Apr 22 18:08:03.616193 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.616146 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-home" (OuterVolumeSpecName: "home") pod "f63ab0b9-bcf5-49c8-9074-359b4ce7b69e" (UID: "f63ab0b9-bcf5-49c8-9074-359b4ce7b69e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:08:03.616316 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.616286 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-model-cache\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:08:03.616316 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.616312 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-home\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:08:03.618040 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.618007 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-dshm" (OuterVolumeSpecName: "dshm") pod "f63ab0b9-bcf5-49c8-9074-359b4ce7b69e" (UID: "f63ab0b9-bcf5-49c8-9074-359b4ce7b69e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:08:03.618318 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.618288 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f63ab0b9-bcf5-49c8-9074-359b4ce7b69e" (UID: "f63ab0b9-bcf5-49c8-9074-359b4ce7b69e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:08:03.618318 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.618297 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-kube-api-access-4fpjg" (OuterVolumeSpecName: "kube-api-access-4fpjg") pod "f63ab0b9-bcf5-49c8-9074-359b4ce7b69e" (UID: "f63ab0b9-bcf5-49c8-9074-359b4ce7b69e"). InnerVolumeSpecName "kube-api-access-4fpjg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:08:03.678663 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.678619 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f63ab0b9-bcf5-49c8-9074-359b4ce7b69e" (UID: "f63ab0b9-bcf5-49c8-9074-359b4ce7b69e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:08:03.716925 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.716892 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-dshm\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:08:03.716925 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.716921 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4fpjg\" (UniqueName: \"kubernetes.io/projected/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-kube-api-access-4fpjg\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:08:03.717101 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.716932 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-kserve-provision-location\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:08:03.717101 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.716945 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e-tls-certs\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:08:03.778941 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.778903 2578 generic.go:358] "Generic (PLEG): container finished" podID="de07dc87-da58-4d81-bc63-0bba35f00396" containerID="f417b58c71f8f60bd987e32ca518b5fc7d9a78dc37360570ec10b69e1a0f2571" exitCode=0
Apr 22 18:08:03.779115 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.778979 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" event={"ID":"de07dc87-da58-4d81-bc63-0bba35f00396","Type":"ContainerDied","Data":"f417b58c71f8f60bd987e32ca518b5fc7d9a78dc37360570ec10b69e1a0f2571"}
Apr 22 18:08:03.780371 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.780349 2578 generic.go:358] "Generic (PLEG): container finished" podID="f63ab0b9-bcf5-49c8-9074-359b4ce7b69e" containerID="c37f752affeba3df37c23286c08d0a5f4bd871cd85ba6ca1638bccc7ac70d109" exitCode=0
Apr 22 18:08:03.780527 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.780409 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs"
Apr 22 18:08:03.780527 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.780432 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" event={"ID":"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e","Type":"ContainerDied","Data":"c37f752affeba3df37c23286c08d0a5f4bd871cd85ba6ca1638bccc7ac70d109"}
Apr 22 18:08:03.780527 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.780462 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs" event={"ID":"f63ab0b9-bcf5-49c8-9074-359b4ce7b69e","Type":"ContainerDied","Data":"21fbc43330f41358b855358ff0fb7e882dc45f2cf59fd047658f6b2fe7f21427"}
Apr 22 18:08:03.780527 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.780482 2578 scope.go:117] "RemoveContainer" containerID="c37f752affeba3df37c23286c08d0a5f4bd871cd85ba6ca1638bccc7ac70d109"
Apr 22 18:08:03.789987 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.789971 2578 scope.go:117] "RemoveContainer" containerID="5f1d02bd518a588c0d4b00d6f8fb4228dbecd7467a215c1a5664cae29bd99418"
Apr 22 18:08:03.802401 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.802376 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs"]
Apr 22 18:08:03.804798 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.804770 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d76fcdd96-rf9bs"]
Apr 22 18:08:03.805625 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.805610 2578 scope.go:117] "RemoveContainer" containerID="c37f752affeba3df37c23286c08d0a5f4bd871cd85ba6ca1638bccc7ac70d109"
Apr 22 18:08:03.805898 ip-10-0-143-54 kubenswrapper[2578]: E0422 18:08:03.805877 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c37f752affeba3df37c23286c08d0a5f4bd871cd85ba6ca1638bccc7ac70d109\": container with ID starting with c37f752affeba3df37c23286c08d0a5f4bd871cd85ba6ca1638bccc7ac70d109 not found: ID does not exist" containerID="c37f752affeba3df37c23286c08d0a5f4bd871cd85ba6ca1638bccc7ac70d109"
Apr 22 18:08:03.805953 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.805912 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c37f752affeba3df37c23286c08d0a5f4bd871cd85ba6ca1638bccc7ac70d109"} err="failed to get container status \"c37f752affeba3df37c23286c08d0a5f4bd871cd85ba6ca1638bccc7ac70d109\": rpc error: code = NotFound desc = could not find container \"c37f752affeba3df37c23286c08d0a5f4bd871cd85ba6ca1638bccc7ac70d109\": container with ID starting with c37f752affeba3df37c23286c08d0a5f4bd871cd85ba6ca1638bccc7ac70d109 not found: ID does not exist"
Apr 22 18:08:03.805953 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.805930 2578 scope.go:117] "RemoveContainer" containerID="5f1d02bd518a588c0d4b00d6f8fb4228dbecd7467a215c1a5664cae29bd99418"
Apr 22 18:08:03.806225 ip-10-0-143-54 kubenswrapper[2578]: E0422 18:08:03.806201 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f1d02bd518a588c0d4b00d6f8fb4228dbecd7467a215c1a5664cae29bd99418\": container with ID starting with 5f1d02bd518a588c0d4b00d6f8fb4228dbecd7467a215c1a5664cae29bd99418 not found: ID does not exist" containerID="5f1d02bd518a588c0d4b00d6f8fb4228dbecd7467a215c1a5664cae29bd99418"
Apr 22 18:08:03.806347 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.806227 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f1d02bd518a588c0d4b00d6f8fb4228dbecd7467a215c1a5664cae29bd99418"} err="failed to get container status \"5f1d02bd518a588c0d4b00d6f8fb4228dbecd7467a215c1a5664cae29bd99418\": rpc error: code = NotFound desc = could not find container \"5f1d02bd518a588c0d4b00d6f8fb4228dbecd7467a215c1a5664cae29bd99418\": container with ID starting with 5f1d02bd518a588c0d4b00d6f8fb4228dbecd7467a215c1a5664cae29bd99418 not found: ID does not exist"
Apr 22 18:08:03.957538 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:03.957448 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f63ab0b9-bcf5-49c8-9074-359b4ce7b69e" path="/var/lib/kubelet/pods/f63ab0b9-bcf5-49c8-9074-359b4ce7b69e/volumes"
Apr 22 18:08:04.471338 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.471319 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj"
Apr 22 18:08:04.624384 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.624348 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsnss\" (UniqueName: \"kubernetes.io/projected/de07dc87-da58-4d81-bc63-0bba35f00396-kube-api-access-nsnss\") pod \"de07dc87-da58-4d81-bc63-0bba35f00396\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") "
Apr 22 18:08:04.624384 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.624387 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-tokenizer-tmp\") pod \"de07dc87-da58-4d81-bc63-0bba35f00396\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") "
Apr 22 18:08:04.624640 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.624438 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-kserve-provision-location\") pod \"de07dc87-da58-4d81-bc63-0bba35f00396\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") "
Apr 22 18:08:04.624640 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.624461 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-tokenizer-uds\") pod \"de07dc87-da58-4d81-bc63-0bba35f00396\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") "
Apr 22 18:08:04.624640 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.624574 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/de07dc87-da58-4d81-bc63-0bba35f00396-tls-certs\") pod \"de07dc87-da58-4d81-bc63-0bba35f00396\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") "
Apr 22 18:08:04.624640 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.624611 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-tokenizer-cache\") pod \"de07dc87-da58-4d81-bc63-0bba35f00396\" (UID: \"de07dc87-da58-4d81-bc63-0bba35f00396\") "
Apr 22 18:08:04.624863 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.624715 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "de07dc87-da58-4d81-bc63-0bba35f00396" (UID: "de07dc87-da58-4d81-bc63-0bba35f00396"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:08:04.624863 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.624786 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "de07dc87-da58-4d81-bc63-0bba35f00396" (UID: "de07dc87-da58-4d81-bc63-0bba35f00396"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:08:04.624971 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.624936 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-tokenizer-tmp\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:08:04.625024 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.624992 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-tokenizer-uds\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:08:04.625024 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.625005 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "de07dc87-da58-4d81-bc63-0bba35f00396" (UID: "de07dc87-da58-4d81-bc63-0bba35f00396"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:08:04.625397 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.625371 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "de07dc87-da58-4d81-bc63-0bba35f00396" (UID: "de07dc87-da58-4d81-bc63-0bba35f00396"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:08:04.626520 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.626501 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de07dc87-da58-4d81-bc63-0bba35f00396-kube-api-access-nsnss" (OuterVolumeSpecName: "kube-api-access-nsnss") pod "de07dc87-da58-4d81-bc63-0bba35f00396" (UID: "de07dc87-da58-4d81-bc63-0bba35f00396"). InnerVolumeSpecName "kube-api-access-nsnss". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:08:04.626626 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.626609 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de07dc87-da58-4d81-bc63-0bba35f00396-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "de07dc87-da58-4d81-bc63-0bba35f00396" (UID: "de07dc87-da58-4d81-bc63-0bba35f00396"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:08:04.725497 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.725405 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/de07dc87-da58-4d81-bc63-0bba35f00396-tls-certs\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:08:04.725497 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.725471 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-tokenizer-cache\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:08:04.725497 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.725481 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nsnss\" (UniqueName: \"kubernetes.io/projected/de07dc87-da58-4d81-bc63-0bba35f00396-kube-api-access-nsnss\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:08:04.725497 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.725490 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de07dc87-da58-4d81-bc63-0bba35f00396-kserve-provision-location\") on node \"ip-10-0-143-54.ec2.internal\" DevicePath \"\""
Apr 22 18:08:04.785391 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.785365 2578 generic.go:358] "Generic (PLEG): container finished" podID="de07dc87-da58-4d81-bc63-0bba35f00396" containerID="97a31492ec836ba76a0478431c4a7d0471e4578e0f01a4e3cc7f9a837a1b610f" exitCode=0
Apr 22 18:08:04.785541 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.785457 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj"
Apr 22 18:08:04.785541 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.785462 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" event={"ID":"de07dc87-da58-4d81-bc63-0bba35f00396","Type":"ContainerDied","Data":"97a31492ec836ba76a0478431c4a7d0471e4578e0f01a4e3cc7f9a837a1b610f"}
Apr 22 18:08:04.785541 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.785502 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj" event={"ID":"de07dc87-da58-4d81-bc63-0bba35f00396","Type":"ContainerDied","Data":"83911620b724ccfc0eda9f2800aac27440e7eecd8dc913a71d12609d1d1dfa01"}
Apr 22 18:08:04.785541 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.785520 2578 scope.go:117] "RemoveContainer" containerID="97a31492ec836ba76a0478431c4a7d0471e4578e0f01a4e3cc7f9a837a1b610f"
Apr 22 18:08:04.794104 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.794086 2578 scope.go:117] "RemoveContainer" containerID="f417b58c71f8f60bd987e32ca518b5fc7d9a78dc37360570ec10b69e1a0f2571"
Apr 22 18:08:04.803202 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.803187 2578 scope.go:117] "RemoveContainer" containerID="3edf5b21c150638553585ca6474544549d9cbb30bcd5d28c680ce3f9800da79f"
Apr 22 18:08:04.810287 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.810266 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj"]
Apr 22 18:08:04.811097 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.811082 2578 scope.go:117] "RemoveContainer" containerID="97a31492ec836ba76a0478431c4a7d0471e4578e0f01a4e3cc7f9a837a1b610f"
Apr 22 18:08:04.811354 ip-10-0-143-54 kubenswrapper[2578]: E0422 18:08:04.811334 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97a31492ec836ba76a0478431c4a7d0471e4578e0f01a4e3cc7f9a837a1b610f\": container with ID starting with 97a31492ec836ba76a0478431c4a7d0471e4578e0f01a4e3cc7f9a837a1b610f not found: ID does not exist" containerID="97a31492ec836ba76a0478431c4a7d0471e4578e0f01a4e3cc7f9a837a1b610f"
Apr 22 18:08:04.811403 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.811364 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97a31492ec836ba76a0478431c4a7d0471e4578e0f01a4e3cc7f9a837a1b610f"} err="failed to get container status \"97a31492ec836ba76a0478431c4a7d0471e4578e0f01a4e3cc7f9a837a1b610f\": rpc error: code = NotFound desc = could not find container \"97a31492ec836ba76a0478431c4a7d0471e4578e0f01a4e3cc7f9a837a1b610f\": container with ID starting with 97a31492ec836ba76a0478431c4a7d0471e4578e0f01a4e3cc7f9a837a1b610f not found: ID does not exist"
Apr 22 18:08:04.811403 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.811384 2578 scope.go:117] "RemoveContainer" containerID="f417b58c71f8f60bd987e32ca518b5fc7d9a78dc37360570ec10b69e1a0f2571"
Apr 22 18:08:04.811755 ip-10-0-143-54 kubenswrapper[2578]: E0422 18:08:04.811738 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound
desc = could not find container \"f417b58c71f8f60bd987e32ca518b5fc7d9a78dc37360570ec10b69e1a0f2571\": container with ID starting with f417b58c71f8f60bd987e32ca518b5fc7d9a78dc37360570ec10b69e1a0f2571 not found: ID does not exist" containerID="f417b58c71f8f60bd987e32ca518b5fc7d9a78dc37360570ec10b69e1a0f2571" Apr 22 18:08:04.811807 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.811759 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f417b58c71f8f60bd987e32ca518b5fc7d9a78dc37360570ec10b69e1a0f2571"} err="failed to get container status \"f417b58c71f8f60bd987e32ca518b5fc7d9a78dc37360570ec10b69e1a0f2571\": rpc error: code = NotFound desc = could not find container \"f417b58c71f8f60bd987e32ca518b5fc7d9a78dc37360570ec10b69e1a0f2571\": container with ID starting with f417b58c71f8f60bd987e32ca518b5fc7d9a78dc37360570ec10b69e1a0f2571 not found: ID does not exist" Apr 22 18:08:04.811807 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.811775 2578 scope.go:117] "RemoveContainer" containerID="3edf5b21c150638553585ca6474544549d9cbb30bcd5d28c680ce3f9800da79f" Apr 22 18:08:04.811988 ip-10-0-143-54 kubenswrapper[2578]: E0422 18:08:04.811969 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3edf5b21c150638553585ca6474544549d9cbb30bcd5d28c680ce3f9800da79f\": container with ID starting with 3edf5b21c150638553585ca6474544549d9cbb30bcd5d28c680ce3f9800da79f not found: ID does not exist" containerID="3edf5b21c150638553585ca6474544549d9cbb30bcd5d28c680ce3f9800da79f" Apr 22 18:08:04.812042 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.811997 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3edf5b21c150638553585ca6474544549d9cbb30bcd5d28c680ce3f9800da79f"} err="failed to get container status \"3edf5b21c150638553585ca6474544549d9cbb30bcd5d28c680ce3f9800da79f\": rpc error: code = NotFound desc = could not 
find container \"3edf5b21c150638553585ca6474544549d9cbb30bcd5d28c680ce3f9800da79f\": container with ID starting with 3edf5b21c150638553585ca6474544549d9cbb30bcd5d28c680ce3f9800da79f not found: ID does not exist" Apr 22 18:08:04.817044 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:04.817026 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7b455nndnj"] Apr 22 18:08:05.955966 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:05.955933 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de07dc87-da58-4d81-bc63-0bba35f00396" path="/var/lib/kubelet/pods/de07dc87-da58-4d81-bc63-0bba35f00396/volumes" Apr 22 18:08:31.791087 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:31.791011 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-f9jrl_47da7e04-76fe-40d1-ae27-6d0cd06117e7/istio-proxy/0.log" Apr 22 18:08:31.817692 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:31.817673 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-79c986ff86-dmsc9_aa98bb13-b13a-434c-8c1e-4bdce39c5b4a/router/0.log" Apr 22 18:08:32.581335 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:32.581298 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-f9jrl_47da7e04-76fe-40d1-ae27-6d0cd06117e7/istio-proxy/0.log" Apr 22 18:08:32.598844 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:32.598814 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-79c986ff86-dmsc9_aa98bb13-b13a-434c-8c1e-4bdce39c5b4a/router/0.log" Apr 22 18:08:33.325695 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:33.325666 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-8sj8h_66200c04-ae3c-49cb-b6ed-9a4136c0bd46/authorino/0.log" Apr 
22 18:08:33.350473 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:33.350452 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-7prvb_c5620ed5-9eba-4be3-a0c8-2f2510b07893/manager/0.log" Apr 22 18:08:33.361598 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:33.361580 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-zfhk4_5f5c8748-e846-47b6-9305-a3dd7cf0ccb1/kuadrant-console-plugin/0.log" Apr 22 18:08:38.619937 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:38.619904 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-s4xq4_8086e0dd-7b2c-4fb8-bb9c-e3554698418a/global-pull-secret-syncer/0.log" Apr 22 18:08:38.769914 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:38.769887 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-jb9vl_6f56fd06-5b7e-4a95-b2ba-3604f018b678/konnectivity-agent/0.log" Apr 22 18:08:38.841261 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:38.841234 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-54.ec2.internal_1346330c188515bccf6ba5d0ec452e1d/haproxy/0.log" Apr 22 18:08:43.245385 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:43.245359 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-8sj8h_66200c04-ae3c-49cb-b6ed-9a4136c0bd46/authorino/0.log" Apr 22 18:08:43.295645 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:43.295612 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-7prvb_c5620ed5-9eba-4be3-a0c8-2f2510b07893/manager/0.log" Apr 22 18:08:43.324703 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:43.324682 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-zfhk4_5f5c8748-e846-47b6-9305-a3dd7cf0ccb1/kuadrant-console-plugin/0.log" Apr 22 18:08:44.730479 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:44.730447 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-kkhkg_f4641e6a-9871-4f14-bec6-666f704d5f1d/kube-state-metrics/0.log" Apr 22 18:08:44.749191 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:44.749168 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-kkhkg_f4641e6a-9871-4f14-bec6-666f704d5f1d/kube-rbac-proxy-main/0.log" Apr 22 18:08:44.768195 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:44.768177 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-kkhkg_f4641e6a-9871-4f14-bec6-666f704d5f1d/kube-rbac-proxy-self/0.log" Apr 22 18:08:44.795794 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:44.795773 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-59fbcf48df-f4l4b_05b049bd-566e-4b9b-b453-ee120360ea21/metrics-server/0.log" Apr 22 18:08:44.820042 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:44.820021 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-tcwpq_67c1a3d6-0553-48aa-997e-5b95f3a097b7/monitoring-plugin/0.log" Apr 22 18:08:44.914844 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:44.914823 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cf2sn_aaa1eb03-977d-435e-99be-1036b84441ec/node-exporter/0.log" Apr 22 18:08:44.938061 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:44.938039 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cf2sn_aaa1eb03-977d-435e-99be-1036b84441ec/kube-rbac-proxy/0.log" Apr 22 18:08:44.965046 ip-10-0-143-54 
kubenswrapper[2578]: I0422 18:08:44.965023 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cf2sn_aaa1eb03-977d-435e-99be-1036b84441ec/init-textfile/0.log" Apr 22 18:08:45.256371 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:45.256326 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_117ccd15-9480-4395-b6fd-e329836c6891/prometheus/0.log" Apr 22 18:08:45.292241 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:45.292220 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_117ccd15-9480-4395-b6fd-e329836c6891/config-reloader/0.log" Apr 22 18:08:45.327199 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:45.327173 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_117ccd15-9480-4395-b6fd-e329836c6891/thanos-sidecar/0.log" Apr 22 18:08:45.360105 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:45.360089 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_117ccd15-9480-4395-b6fd-e329836c6891/kube-rbac-proxy-web/0.log" Apr 22 18:08:45.387087 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:45.387063 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_117ccd15-9480-4395-b6fd-e329836c6891/kube-rbac-proxy/0.log" Apr 22 18:08:45.420000 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:45.419983 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_117ccd15-9480-4395-b6fd-e329836c6891/kube-rbac-proxy-thanos/0.log" Apr 22 18:08:45.445277 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:45.445262 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_117ccd15-9480-4395-b6fd-e329836c6891/init-config-reloader/0.log" Apr 22 18:08:45.484451 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:45.484434 
2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-ql4f8_a7f668ec-6531-4017-aa81-e044998d431f/prometheus-operator/0.log" Apr 22 18:08:45.502365 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:45.502346 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-ql4f8_a7f668ec-6531-4017-aa81-e044998d431f/kube-rbac-proxy/0.log" Apr 22 18:08:45.525259 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:45.525239 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-9pr5w_ca1b1673-b38d-41ee-ab4e-cb1e70bd3168/prometheus-operator-admission-webhook/0.log" Apr 22 18:08:47.404274 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.404239 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w"] Apr 22 18:08:47.404659 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.404582 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de07dc87-da58-4d81-bc63-0bba35f00396" containerName="storage-initializer" Apr 22 18:08:47.404659 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.404593 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="de07dc87-da58-4d81-bc63-0bba35f00396" containerName="storage-initializer" Apr 22 18:08:47.404659 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.404603 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de07dc87-da58-4d81-bc63-0bba35f00396" containerName="main" Apr 22 18:08:47.404659 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.404609 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="de07dc87-da58-4d81-bc63-0bba35f00396" containerName="main" Apr 22 18:08:47.404659 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.404615 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f63ab0b9-bcf5-49c8-9074-359b4ce7b69e" containerName="main" Apr 22 18:08:47.404659 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.404621 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63ab0b9-bcf5-49c8-9074-359b4ce7b69e" containerName="main" Apr 22 18:08:47.404659 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.404628 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10de64b5-00e7-4b3a-92c8-e8ae2c483188" containerName="storage-initializer" Apr 22 18:08:47.404659 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.404633 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="10de64b5-00e7-4b3a-92c8-e8ae2c483188" containerName="storage-initializer" Apr 22 18:08:47.404659 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.404644 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f63ab0b9-bcf5-49c8-9074-359b4ce7b69e" containerName="storage-initializer" Apr 22 18:08:47.404659 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.404649 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63ab0b9-bcf5-49c8-9074-359b4ce7b69e" containerName="storage-initializer" Apr 22 18:08:47.404659 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.404660 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de07dc87-da58-4d81-bc63-0bba35f00396" containerName="tokenizer" Apr 22 18:08:47.404659 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.404666 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="de07dc87-da58-4d81-bc63-0bba35f00396" containerName="tokenizer" Apr 22 18:08:47.405159 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.404683 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10de64b5-00e7-4b3a-92c8-e8ae2c483188" containerName="main" Apr 22 18:08:47.405159 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.404691 2578 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="10de64b5-00e7-4b3a-92c8-e8ae2c483188" containerName="main" Apr 22 18:08:47.405159 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.404777 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="de07dc87-da58-4d81-bc63-0bba35f00396" containerName="tokenizer" Apr 22 18:08:47.405159 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.404787 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="de07dc87-da58-4d81-bc63-0bba35f00396" containerName="main" Apr 22 18:08:47.405159 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.404794 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f63ab0b9-bcf5-49c8-9074-359b4ce7b69e" containerName="main" Apr 22 18:08:47.405159 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.404800 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="10de64b5-00e7-4b3a-92c8-e8ae2c483188" containerName="main" Apr 22 18:08:47.407762 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.407739 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" Apr 22 18:08:47.410039 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.410017 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h8t2w\"/\"openshift-service-ca.crt\"" Apr 22 18:08:47.410131 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.410065 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-h8t2w\"/\"default-dockercfg-r2vhp\"" Apr 22 18:08:47.411048 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.411034 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h8t2w\"/\"kube-root-ca.crt\"" Apr 22 18:08:47.414946 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.414635 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w"] Apr 22 18:08:47.468538 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.468506 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e93f386f-7907-45de-b549-bd2362fe6d4d-lib-modules\") pod \"perf-node-gather-daemonset-r4p2w\" (UID: \"e93f386f-7907-45de-b549-bd2362fe6d4d\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" Apr 22 18:08:47.468700 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.468547 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6tr5\" (UniqueName: \"kubernetes.io/projected/e93f386f-7907-45de-b549-bd2362fe6d4d-kube-api-access-h6tr5\") pod \"perf-node-gather-daemonset-r4p2w\" (UID: \"e93f386f-7907-45de-b549-bd2362fe6d4d\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" Apr 22 18:08:47.468700 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.468574 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e93f386f-7907-45de-b549-bd2362fe6d4d-proc\") pod \"perf-node-gather-daemonset-r4p2w\" (UID: \"e93f386f-7907-45de-b549-bd2362fe6d4d\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" Apr 22 18:08:47.468700 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.468649 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e93f386f-7907-45de-b549-bd2362fe6d4d-podres\") pod \"perf-node-gather-daemonset-r4p2w\" (UID: \"e93f386f-7907-45de-b549-bd2362fe6d4d\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" Apr 22 18:08:47.468809 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.468699 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e93f386f-7907-45de-b549-bd2362fe6d4d-sys\") pod \"perf-node-gather-daemonset-r4p2w\" (UID: \"e93f386f-7907-45de-b549-bd2362fe6d4d\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" Apr 22 18:08:47.569251 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.569210 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e93f386f-7907-45de-b549-bd2362fe6d4d-podres\") pod \"perf-node-gather-daemonset-r4p2w\" (UID: \"e93f386f-7907-45de-b549-bd2362fe6d4d\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" Apr 22 18:08:47.569462 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.569264 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e93f386f-7907-45de-b549-bd2362fe6d4d-sys\") pod \"perf-node-gather-daemonset-r4p2w\" (UID: \"e93f386f-7907-45de-b549-bd2362fe6d4d\") " 
pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" Apr 22 18:08:47.569462 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.569308 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e93f386f-7907-45de-b549-bd2362fe6d4d-lib-modules\") pod \"perf-node-gather-daemonset-r4p2w\" (UID: \"e93f386f-7907-45de-b549-bd2362fe6d4d\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" Apr 22 18:08:47.569462 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.569337 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6tr5\" (UniqueName: \"kubernetes.io/projected/e93f386f-7907-45de-b549-bd2362fe6d4d-kube-api-access-h6tr5\") pod \"perf-node-gather-daemonset-r4p2w\" (UID: \"e93f386f-7907-45de-b549-bd2362fe6d4d\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" Apr 22 18:08:47.569462 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.569389 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e93f386f-7907-45de-b549-bd2362fe6d4d-podres\") pod \"perf-node-gather-daemonset-r4p2w\" (UID: \"e93f386f-7907-45de-b549-bd2362fe6d4d\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" Apr 22 18:08:47.569686 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.569479 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e93f386f-7907-45de-b549-bd2362fe6d4d-proc\") pod \"perf-node-gather-daemonset-r4p2w\" (UID: \"e93f386f-7907-45de-b549-bd2362fe6d4d\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" Apr 22 18:08:47.569686 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.569520 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/e93f386f-7907-45de-b549-bd2362fe6d4d-proc\") pod \"perf-node-gather-daemonset-r4p2w\" (UID: \"e93f386f-7907-45de-b549-bd2362fe6d4d\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" Apr 22 18:08:47.569686 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.569484 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e93f386f-7907-45de-b549-bd2362fe6d4d-lib-modules\") pod \"perf-node-gather-daemonset-r4p2w\" (UID: \"e93f386f-7907-45de-b549-bd2362fe6d4d\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" Apr 22 18:08:47.569686 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.569484 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e93f386f-7907-45de-b549-bd2362fe6d4d-sys\") pod \"perf-node-gather-daemonset-r4p2w\" (UID: \"e93f386f-7907-45de-b549-bd2362fe6d4d\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" Apr 22 18:08:47.577686 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.577662 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6tr5\" (UniqueName: \"kubernetes.io/projected/e93f386f-7907-45de-b549-bd2362fe6d4d-kube-api-access-h6tr5\") pod \"perf-node-gather-daemonset-r4p2w\" (UID: \"e93f386f-7907-45de-b549-bd2362fe6d4d\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" Apr 22 18:08:47.718695 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.718608 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" Apr 22 18:08:47.836073 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.836047 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w"] Apr 22 18:08:47.839022 ip-10-0-143-54 kubenswrapper[2578]: W0422 18:08:47.838999 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode93f386f_7907_45de_b549_bd2362fe6d4d.slice/crio-592956c0adca8867b86af94af6e4f45e6f1ffa500eb6ff6fe7e381e86222f4a1 WatchSource:0}: Error finding container 592956c0adca8867b86af94af6e4f45e6f1ffa500eb6ff6fe7e381e86222f4a1: Status 404 returned error can't find the container with id 592956c0adca8867b86af94af6e4f45e6f1ffa500eb6ff6fe7e381e86222f4a1 Apr 22 18:08:47.931114 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.931090 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" event={"ID":"e93f386f-7907-45de-b549-bd2362fe6d4d","Type":"ContainerStarted","Data":"e9ba6ba87098d4dd49d9638e1a55f2760e68bfe96dd7097389a0923dfb5738b9"} Apr 22 18:08:47.931224 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.931120 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" event={"ID":"e93f386f-7907-45de-b549-bd2362fe6d4d","Type":"ContainerStarted","Data":"592956c0adca8867b86af94af6e4f45e6f1ffa500eb6ff6fe7e381e86222f4a1"} Apr 22 18:08:47.931224 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.931143 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" Apr 22 18:08:47.948225 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:47.948182 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w" 
podStartSLOduration=0.94817111 podStartE2EDuration="948.17111ms" podCreationTimestamp="2026-04-22 18:08:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:08:47.945694038 +0000 UTC m=+2076.561786310" watchObservedRunningTime="2026-04-22 18:08:47.94817111 +0000 UTC m=+2076.564263417" Apr 22 18:08:48.334702 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:48.334672 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-c2n2f_c891add1-515f-45ef-bfb6-7d42a222721b/volume-data-source-validator/0.log" Apr 22 18:08:49.038630 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:49.038603 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jnl2w_03e17850-8d7a-4344-ad3f-eeff8ff1097d/dns/0.log" Apr 22 18:08:49.059391 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:49.059357 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jnl2w_03e17850-8d7a-4344-ad3f-eeff8ff1097d/kube-rbac-proxy/0.log" Apr 22 18:08:49.149170 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:49.149148 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cznmj_5b0d2e46-ad7d-48ff-9531-6e09d8c07bc4/dns-node-resolver/0.log" Apr 22 18:08:49.587628 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:49.587602 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-875dc5ff8-svc6l_549cc9d6-4255-4c2c-afbf-754b2d91dcb4/registry/0.log" Apr 22 18:08:49.649826 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:49.649804 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hspjm_26bfba8d-71c2-4440-8bc9-7e5759e58f9f/node-ca/0.log" Apr 22 18:08:50.466845 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:50.466817 2578 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-f9jrl_47da7e04-76fe-40d1-ae27-6d0cd06117e7/istio-proxy/0.log"
Apr 22 18:08:50.490238 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:50.490213 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-79c986ff86-dmsc9_aa98bb13-b13a-434c-8c1e-4bdce39c5b4a/router/0.log"
Apr 22 18:08:50.938111 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:50.938080 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-cbffr_06b47e11-143b-4f36-b4a4-c16daaed8856/serve-healthcheck-canary/0.log"
Apr 22 18:08:51.459140 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:51.459112 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-b68fk_51a32cd1-e242-4714-a778-917c371dfecb/kube-rbac-proxy/0.log"
Apr 22 18:08:51.479271 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:51.479242 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-b68fk_51a32cd1-e242-4714-a778-917c371dfecb/exporter/0.log"
Apr 22 18:08:51.498700 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:51.498679 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-b68fk_51a32cd1-e242-4714-a778-917c371dfecb/extractor/0.log"
Apr 22 18:08:53.944224 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:53.944195 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-r4p2w"
Apr 22 18:08:54.077927 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:54.077899 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-959c974c-b9kdg_47424af5-704f-4afe-9c94-1c3dcbd3c65d/manager/0.log"
Apr 22 18:08:54.619627 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:54.619587 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-84ffddfb66-gj9hs_990984ef-49ee-48d9-b76d-90212467c4aa/manager/0.log"
Apr 22 18:08:54.702799 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:54.702769 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-2wgbj_a8db03a9-d980-46d9-a984-6c51d78956f2/server/0.log"
Apr 22 18:08:59.823518 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:59.823471 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-4gpk5_de483a7f-5ee9-4932-9834-cd4e6a512d00/kube-storage-version-migrator-operator/1.log"
Apr 22 18:08:59.824256 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:08:59.824242 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-4gpk5_de483a7f-5ee9-4932-9834-cd4e6a512d00/kube-storage-version-migrator-operator/0.log"
Apr 22 18:09:01.083881 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:09:01.083853 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z45jm_9db6f1f0-3473-4256-b588-7220e67210f2/kube-multus-additional-cni-plugins/0.log"
Apr 22 18:09:01.122041 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:09:01.122011 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z45jm_9db6f1f0-3473-4256-b588-7220e67210f2/egress-router-binary-copy/0.log"
Apr 22 18:09:01.145218 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:09:01.145194 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z45jm_9db6f1f0-3473-4256-b588-7220e67210f2/cni-plugins/0.log"
Apr 22 18:09:01.165081 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:09:01.165063 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z45jm_9db6f1f0-3473-4256-b588-7220e67210f2/bond-cni-plugin/0.log"
Apr 22 18:09:01.184929 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:09:01.184911 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z45jm_9db6f1f0-3473-4256-b588-7220e67210f2/routeoverride-cni/0.log"
Apr 22 18:09:01.205063 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:09:01.205040 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z45jm_9db6f1f0-3473-4256-b588-7220e67210f2/whereabouts-cni-bincopy/0.log"
Apr 22 18:09:01.230493 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:09:01.230474 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z45jm_9db6f1f0-3473-4256-b588-7220e67210f2/whereabouts-cni/0.log"
Apr 22 18:09:01.332567 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:09:01.332541 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tg2s7_6c5f2f61-7b2a-45e4-b641-854237b19df4/kube-multus/0.log"
Apr 22 18:09:01.437462 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:09:01.437394 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fd4w6_81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8/network-metrics-daemon/0.log"
Apr 22 18:09:01.457366 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:09:01.457346 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fd4w6_81a52de8-56c0-4e61-b0af-a6b6fa4bbfe8/kube-rbac-proxy/0.log"
Apr 22 18:09:02.231192 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:09:02.231165 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovn-controller/0.log"
Apr 22 18:09:02.247331 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:09:02.247310 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovn-acl-logging/0.log"
Apr 22 18:09:02.256159 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:09:02.256141 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovn-acl-logging/1.log"
Apr 22 18:09:02.274741 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:09:02.274724 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/kube-rbac-proxy-node/0.log"
Apr 22 18:09:02.293923 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:09:02.293902 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 18:09:02.309880 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:09:02.309864 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/northd/0.log"
Apr 22 18:09:02.329268 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:09:02.329248 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/nbdb/0.log"
Apr 22 18:09:02.348585 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:09:02.348564 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/sbdb/0.log"
Apr 22 18:09:02.444597 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:09:02.444566 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b8mb_3ca8f0dc-e6b3-42f9-bbfe-f1e0f1d31307/ovnkube-controller/0.log"
Apr 22 18:09:04.125218 ip-10-0-143-54 kubenswrapper[2578]: I0422 18:09:04.125192 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-69d82_8d8d4bc2-3b24-4f90-96c4-3b43bade1eb6/network-check-target-container/0.log"