Apr 24 21:25:21.173219 ip-10-0-142-242 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 21:25:21.173229 ip-10-0-142-242 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 21:25:21.173236 ip-10-0-142-242 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 21:25:21.173450 ip-10-0-142-242 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 21:25:31.266944 ip-10-0-142-242 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 21:25:31.266959 ip-10-0-142-242 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 208a7b6591e84eb2872f1b2b02aa28a5 --
Apr 24 21:27:41.797888 ip-10-0-142-242 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:27:42.184818 ip-10-0-142-242 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:42.184818 ip-10-0-142-242 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:27:42.184818 ip-10-0-142-242 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:42.184818 ip-10-0-142-242 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:27:42.184818 ip-10-0-142-242 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:42.186454 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.186362    2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:27:42.188684 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188669    2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:42.188684 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188684    2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:42.188745 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188688    2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:42.188745 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188691    2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:42.188745 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188695    2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:42.188745 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188697    2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:42.188745 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188700    2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:42.188745 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188704    2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:42.188745 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188707    2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:42.188745 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188710    2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:42.188745 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188713    2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:42.188745 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188716    2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:42.188745 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188720    2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:42.188745 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188723    2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:42.188745 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188725    2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:42.188745 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188728    2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:42.188745 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188731    2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:42.188745 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188734    2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:42.188745 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188736    2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:42.188745 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188739    2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:42.188745 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188741    2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:42.188745 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188744    2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:42.189217 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188747    2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:42.189217 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188750    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:42.189217 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188753    2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:42.189217 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188756    2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:42.189217 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188759    2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:42.189217 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188761    2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:42.189217 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188764    2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:42.189217 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188767    2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:42.189217 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188769    2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:42.189217 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188772    2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:42.189217 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188774    2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:42.189217 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188776    2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:42.189217 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188781    2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:42.189217 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188784    2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:42.189217 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188787    2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:42.189217 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188790    2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:42.189217 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188793    2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:42.189217 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188795    2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:42.189217 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188798    2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:42.189217 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188801    2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:42.189716 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188804    2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:42.189716 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188807    2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:42.189716 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188811    2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:42.189716 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188815    2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:42.189716 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188822    2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:42.189716 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188825    2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:42.189716 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188827    2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:42.189716 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188830    2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:42.189716 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188833    2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:42.189716 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188836    2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:42.189716 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188838    2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:42.189716 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188841    2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:42.189716 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188843    2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:42.189716 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188846    2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:42.189716 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188849    2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:42.189716 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188852    2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:42.189716 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188855    2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:42.189716 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188857    2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:42.189716 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188860    2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:42.189716 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188863    2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:42.190201 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188865    2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:42.190201 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188868    2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:42.190201 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188870    2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:42.190201 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188873    2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:42.190201 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188876    2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:42.190201 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188879    2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:42.190201 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188882    2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:42.190201 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188886    2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:42.190201 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188888    2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:42.190201 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188891    2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:42.190201 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188893    2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:42.190201 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188896    2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:42.190201 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188899    2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:42.190201 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188902    2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:42.190201 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188904    2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:42.190201 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188907    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:42.190201 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188909    2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:42.190201 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188912    2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:42.190201 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188915    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:42.190201 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188918    2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:42.190694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188920    2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:42.190694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188923    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:42.190694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188925    2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:42.190694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.188928    2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:42.190694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189280    2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:42.190694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189285    2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:42.190694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189288    2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:42.190694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189291    2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:42.190694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189294    2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:42.190694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189297    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:42.190694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189299    2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:42.190694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189302    2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:42.190694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189305    2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:42.190694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189307    2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:42.190694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189310    2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:42.190694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189312    2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:42.190694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189315    2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:42.190694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189318    2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:42.190694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189321    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:42.191200 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189323    2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:42.191200 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189326    2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:42.191200 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189329    2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:42.191200 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189331    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:42.191200 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189334    2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:42.191200 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189336    2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:42.191200 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189339    2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:42.191200 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189342    2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:42.191200 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189344    2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:42.191200 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189347    2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:42.191200 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189349    2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:42.191200 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189352    2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:42.191200 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189355    2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:42.191200 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189358    2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:42.191200 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189361    2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:42.191200 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189364    2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:42.191200 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189367    2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:42.191200 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189370    2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:42.191200 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189373    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:42.191200 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189375    2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:42.191703 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189378    2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:42.191703 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189380    2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:42.191703 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189383    2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:42.191703 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189386    2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:42.191703 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189388    2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:42.191703 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189398    2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:42.191703 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189400    2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:42.191703 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189403    2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:42.191703 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189406    2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:42.191703 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189409    2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:42.191703 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189412    2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:42.191703 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189414    2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:42.191703 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189417    2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:42.191703 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189420    2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:42.191703 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189422    2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:42.191703 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189425    2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:42.191703 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189428    2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:42.191703 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189431    2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:42.191703 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189435    2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:42.191703 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189439    2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:42.192196 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189442    2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:42.192196 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189445    2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:42.192196 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189448    2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:42.192196 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189451    2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:42.192196 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189453    2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:42.192196 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189456    2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:42.192196 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189459    2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:42.192196 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189461    2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:42.192196 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189464    2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:42.192196 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189467    2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:42.192196 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189470    2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:42.192196 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189474    2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:42.192196 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189477    2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:42.192196 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189479    2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:42.192196 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189482    2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:42.192196 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189484    2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:42.192196 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189487    2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:42.192196 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189489    2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:42.192196 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189492    2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189494    2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189497    2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189499    2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189502    2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189508    2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189510    2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189513    2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189515    2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189518    2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189520    2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189523    2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.189525    2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189613    2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189620    2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189625    2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189630    2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189634    2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189637    2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189642    2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189646    2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:27:42.192673 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189649    2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189652    2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189656    2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189659    2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189662    2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189665    2567 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189668    2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189671    2567 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189674    2567 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189677    2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189680    2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189685    2567 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189687    2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189690    2567 flags.go:64] FLAG: --config-dir=""
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189693    2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189697    2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189701    2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189704    2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189707    2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189711    2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189718    2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189721    2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189724    2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189727    2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189730    2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:27:42.193179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189735    2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189738    2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189741    2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189744    2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189747    2567 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189750    2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189754    2567 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189757    2567 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189760    2567 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189764    2567 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189767    2567 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189771    2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189774    2567 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189777    2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189780    2567 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189783    2567 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:27:42.193797
ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189786 2567 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189789 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189792 2567 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189794 2567 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189797 2567 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189800 2567 flags.go:64] FLAG: --feature-gates="" Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189804 2567 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189807 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189810 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:27:42.193797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189813 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189816 2567 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189820 2567 flags.go:64] FLAG: --help="false" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189823 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-142-242.ec2.internal" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189826 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189829 2567 flags.go:64] 
FLAG: --http-check-frequency="20s" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189832 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189836 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189841 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189844 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189846 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189849 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189852 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189855 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189858 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189861 2567 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189864 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189867 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189870 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:27:42.194491 ip-10-0-142-242 
kubenswrapper[2567]: I0424 21:27:42.189886 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189890 2567 flags.go:64] FLAG: --lock-file="" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189893 2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189897 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189900 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:27:42.194491 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189906 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189909 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189912 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189915 2567 flags.go:64] FLAG: --logging-format="text" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189918 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189921 2567 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189924 2567 flags.go:64] FLAG: --manifest-url="" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189927 2567 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189931 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189934 2567 flags.go:64] FLAG: 
--max-open-files="1000000" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189939 2567 flags.go:64] FLAG: --max-pods="110" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189942 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189945 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189948 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189951 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189954 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189958 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189961 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189968 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189971 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189974 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189977 2567 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:27:42.195139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189980 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:27:42.195139 ip-10-0-142-242 
kubenswrapper[2567]: I0424 21:27:42.189986 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189989 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189992 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189995 2567 flags.go:64] FLAG: --port="10250" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.189998 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190001 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c48bb9bccc41e923" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190004 2567 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190007 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190010 2567 flags.go:64] FLAG: --register-node="true" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190013 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190015 2567 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190019 2567 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190022 2567 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190024 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190027 2567 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 
21:27:42.190031 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190034 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190037 2567 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190040 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190043 2567 flags.go:64] FLAG: --runonce="false" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190046 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190049 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190052 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190055 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190058 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190062 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:27:42.195715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190065 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190069 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190071 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190074 2567 flags.go:64] FLAG: 
--storage-driver-table="stats" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190077 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190080 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190083 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190086 2567 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190089 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190094 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190097 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190100 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190104 2567 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190107 2567 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190110 2567 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190113 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190116 2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190119 2567 flags.go:64] FLAG: --v="2" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 
21:27:42.190123 2567 flags.go:64] FLAG: --version="false" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190126 2567 flags.go:64] FLAG: --vmodule="" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190130 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190134 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190221 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190226 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:42.196366 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190230 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:42.196957 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190233 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:42.196957 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190236 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:42.196957 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190239 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:42.196957 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190242 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:42.196957 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190245 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:42.196957 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190247 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:42.196957 ip-10-0-142-242 
kubenswrapper[2567]: W0424 21:27:42.190251 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:42.196957 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190254 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:42.196957 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190256 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:42.196957 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190259 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:42.196957 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190261 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:42.196957 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190264 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:42.196957 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190267 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:42.196957 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190284 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:42.196957 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190288 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:42.196957 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190292 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:42.196957 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190296 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:27:42.196957 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190300 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:42.196957 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190303 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:42.196957 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190306 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:42.197467 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190309 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:42.197467 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190312 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:42.197467 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190315 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:42.197467 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190317 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:42.197467 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190320 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:42.197467 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190323 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:42.197467 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190325 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:42.197467 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190328 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:42.197467 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190330 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:42.197467 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190334 2567 
feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:42.197467 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190337 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:42.197467 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190339 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:42.197467 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190342 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:42.197467 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190345 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:42.197467 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190348 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:42.197467 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190351 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:42.197467 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190354 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:42.197467 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190356 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:42.197467 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190361 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:42.197944 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190364 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:42.197944 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190366 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:42.197944 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190369 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:42.197944 ip-10-0-142-242 
kubenswrapper[2567]: W0424 21:27:42.190371 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:42.197944 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190374 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:42.197944 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190376 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:42.197944 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190380 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:27:42.197944 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190383 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:42.197944 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190386 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:42.197944 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190389 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:42.197944 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190392 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:42.197944 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190394 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:42.197944 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190397 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:42.197944 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190400 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:42.197944 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190403 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:42.197944 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190406 2567 feature_gate.go:328] 
unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:42.197944 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190409 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:42.197944 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190412 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:42.197944 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190414 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:42.197944 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190417 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:42.198432 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190420 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:42.198432 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190422 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:42.198432 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190426 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:42.198432 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190429 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:42.198432 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190431 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:42.198432 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190434 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:42.198432 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190436 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:42.198432 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190439 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:42.198432 
ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190442 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:27:42.198432 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190445 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:42.198432 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190447 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:42.198432 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190452 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:42.198432 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190454 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:42.198432 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190457 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:42.198432 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190460 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:42.198432 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190462 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:42.198432 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190465 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:42.198432 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190467 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:42.198432 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190470 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:42.198432 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190472 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:42.198978 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190475 2567 feature_gate.go:328] unrecognized feature gate: 
ConsolePluginContentSecurityPolicy Apr 24 21:27:42.198978 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190477 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:42.198978 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190480 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:42.198978 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.190483 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:27:42.198978 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.190490 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:27:42.198978 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.196785 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 21:27:42.198978 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.196895 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 21:27:42.198978 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.196942 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:42.198978 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.196947 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:42.198978 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.196951 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:42.198978 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.196954 2567 
feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:42.198978 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.196958 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:42.198978 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.196962 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:42.198978 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.196964 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:42.198978 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.196967 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:42.199349 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.196970 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:42.199349 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.196973 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:42.199349 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.196975 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:42.199349 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.196978 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:42.199349 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.196981 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:42.199349 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.196983 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:42.199349 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.196986 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:42.199349 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.196989 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 
21:27:42.199349 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.196991 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:42.199349 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.196994 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:42.199349 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.196997 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:42.199349 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.196999 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:42.199349 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197002 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:42.199349 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197004 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:42.199349 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197007 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:42.199349 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197009 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:42.199349 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197012 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:42.199349 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197015 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:42.199349 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197017 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:42.199862 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197022 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:27:42.199862 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197026 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:42.199862 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197029 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:42.199862 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197032 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:42.199862 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197037 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:42.199862 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197040 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:42.199862 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197042 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:42.199862 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197045 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:42.199862 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197047 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:42.199862 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197050 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:42.199862 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197053 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:42.199862 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197055 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:42.199862 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197058 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:42.199862 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197060 2567 
feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:42.199862 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197063 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:42.199862 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197065 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:42.199862 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197068 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:27:42.199862 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197070 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:42.199862 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197073 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:42.199862 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197076 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:27:42.200353 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197078 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:42.200353 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197081 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:42.200353 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197083 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:42.200353 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197086 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:42.200353 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197088 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:42.200353 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197091 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:42.200353 ip-10-0-142-242 
kubenswrapper[2567]: W0424 21:27:42.197093 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:42.200353 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197096 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:42.200353 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197099 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:42.200353 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197101 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:42.200353 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197104 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:42.200353 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197106 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:42.200353 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197109 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:42.200353 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197111 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:42.200353 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197114 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:42.200353 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197117 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:42.200353 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197120 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:42.200353 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197123 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:42.200353 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197127 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Apr 24 21:27:42.200353 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197131 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:42.200861 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197134 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:42.200861 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197136 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:42.200861 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197139 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:42.200861 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197142 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:42.200861 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197144 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:42.200861 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197147 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:42.200861 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197149 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:42.200861 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197152 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:42.200861 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197154 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:42.200861 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197157 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:42.200861 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197159 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:42.200861 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197162 2567 
feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:42.200861 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197164 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:42.200861 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197167 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:42.200861 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197169 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:42.200861 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197172 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:42.200861 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197174 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:42.200861 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197176 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:42.200861 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197179 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:42.201316 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.197184 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:27:42.201316 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197280 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:42.201316 ip-10-0-142-242 
kubenswrapper[2567]: W0424 21:27:42.197284 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:42.201316 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197287 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:42.201316 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197290 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:42.201316 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197293 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:42.201316 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197296 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:42.201316 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197299 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:42.201316 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197302 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:42.201316 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197305 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:42.201316 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197308 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:42.201316 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197311 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:27:42.201316 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197314 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:42.201316 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197317 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:42.201316 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197320 2567 feature_gate.go:328] 
unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:42.201694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197322 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:42.201694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197325 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:42.201694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197327 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:42.201694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197330 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:42.201694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197333 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:42.201694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197335 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:42.201694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197338 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:42.201694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197340 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:42.201694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197343 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:42.201694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197345 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:42.201694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197348 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:42.201694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197350 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:42.201694 ip-10-0-142-242 
kubenswrapper[2567]: W0424 21:27:42.197353 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:42.201694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197355 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:42.201694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197358 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:42.201694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197360 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:42.201694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197363 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:42.201694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197365 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:42.201694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197368 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:42.201694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197370 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:42.201694 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197373 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:42.202212 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197375 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:42.202212 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197378 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:42.202212 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197380 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:42.202212 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197383 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 
21:27:42.202212 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197385 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:42.202212 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197388 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:42.202212 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197391 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:42.202212 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197393 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:42.202212 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197396 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:42.202212 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197399 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:42.202212 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197401 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:42.202212 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197404 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:42.202212 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197406 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:42.202212 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197409 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:42.202212 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197411 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:42.202212 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197414 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:42.202212 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197417 2567 
feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:42.202212 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197419 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:42.202212 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197422 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:42.202759 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197424 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:42.202759 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197427 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:42.202759 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197429 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:42.202759 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197432 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:42.202759 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197434 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:42.202759 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197437 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:42.202759 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197439 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:27:42.202759 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197442 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:42.202759 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197444 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:42.202759 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197447 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:42.202759 
ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197449 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:42.202759 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197452 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:42.202759 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197454 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:42.202759 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197458 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:27:42.202759 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197461 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:42.202759 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197464 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:42.202759 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197467 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:42.202759 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197470 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:42.202759 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197473 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:42.203247 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197476 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:42.203247 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197480 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:27:42.203247 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197483 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:42.203247 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197486 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:42.203247 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197489 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:42.203247 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197491 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:42.203247 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197494 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:42.203247 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197496 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:42.203247 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197499 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:42.203247 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197502 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:42.203247 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197504 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:42.203247 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197506 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:42.203247 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:42.197509 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:42.203247 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.197514 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false 
MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:27:42.203247 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.198169 2567 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 21:27:42.203625 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.200048 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 21:27:42.203625 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.200915 2567 server.go:1019] "Starting client certificate rotation" Apr 24 21:27:42.203625 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.201009 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 21:27:42.203625 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.202247 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 21:27:42.224254 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.224236 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 21:27:42.228758 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.228735 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 21:27:42.240216 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.240198 2567 log.go:25] "Validated CRI v1 runtime API" Apr 24 21:27:42.245403 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.245388 2567 log.go:25] "Validated CRI v1 image API" Apr 24 21:27:42.248595 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.248570 2567 server.go:1452] "Using 
cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 21:27:42.253283 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.253263 2567 fs.go:135] Filesystem UUIDs: map[383543e0-a402-42c0-aad5-ebf024da1908:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 a9ed7545-431f-4a8f-8c3d-2ced131485d5:/dev/nvme0n1p3] Apr 24 21:27:42.253369 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.253282 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 21:27:42.258705 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.258596 2567 manager.go:217] Machine: {Timestamp:2026-04-24 21:27:42.256866918 +0000 UTC m=+0.355279422 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3102673 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec201a94601d2f2e9613dd9cc54f1cc0 SystemUUID:ec201a94-601d-2f2e-9613-dd9cc54f1cc0 BootID:208a7b65-91e8-4eb2-872f-1b2b02aa28a5 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 
Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e2:3b:b2:09:41 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e2:3b:b2:09:41 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1e:04:bd:d3:d7:a4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 21:27:42.258705 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.258696 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 24 21:27:42.258826 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.258809 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:27:42.259723 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.259698 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:27:42.259864 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.259726 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-242.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 21:27:42.259917 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.259873 2567 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 21:27:42.259917 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.259886 2567 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 21:27:42.259917 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.259913 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 21:27:42.260041 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.260025 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:27:42.260616 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.260602 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 21:27:42.261492 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.261482 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 21:27:42.261611 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.261602 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 21:27:42.263942 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.263931 2567 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 21:27:42.263978 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.263950 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 21:27:42.263978 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.263962 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 21:27:42.263978 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.263970 2567 kubelet.go:397] "Adding apiserver pod source"
Apr 24 21:27:42.264090 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.263980 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 21:27:42.264885 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.264873 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 21:27:42.264931 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.264894 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 21:27:42.267910 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.267886 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 21:27:42.269843 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.269825 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 21:27:42.271457 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.271441 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 21:27:42.271527 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.271464 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 21:27:42.271527 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.271471 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 21:27:42.271527 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.271476 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 21:27:42.271527 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.271482 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24 21:27:42.271527 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.271489 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 24 21:27:42.271527 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.271498 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 24 21:27:42.271527 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.271506 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 24 21:27:42.271527 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.271521 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 24 21:27:42.271527 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.271527 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 24 21:27:42.271796 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.271545 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 24 21:27:42.271796 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.271554 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 24 21:27:42.272997 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.272983 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 24 21:27:42.272997 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.272993 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 24 21:27:42.276424 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.276408 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 24 21:27:42.276532 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.276443 2567 server.go:1295] "Started kubelet"
Apr 24 21:27:42.276604 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.276530 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 21:27:42.276604 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.276537 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 24 21:27:42.276702 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.276609 2567 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 24 21:27:42.277204 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.277181 2567 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-242.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:27:42.277296 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:42.277256 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-242.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 21:27:42.277280 ip-10-0-142-242 systemd[1]: Started Kubernetes Kubelet.
Apr 24 21:27:42.277438 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:42.277312 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 24 21:27:42.277829 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.277685 2567 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 21:27:42.277829 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.277824 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 21:27:42.282778 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:42.281972 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-242.ec2.internal.18a9682b18636609 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-242.ec2.internal,UID:ip-10-0-142-242.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-242.ec2.internal,},FirstTimestamp:2026-04-24 21:27:42.276421129 +0000 UTC m=+0.374833633,LastTimestamp:2026-04-24 21:27:42.276421129 +0000 UTC m=+0.374833633,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-242.ec2.internal,}"
Apr 24 21:27:42.283053 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.283024 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 21:27:42.283473 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.283452 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 21:27:42.285499 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.285479 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 21:27:42.285653 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.285637 2567 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 21:27:42.285713 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.285646 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gbfsn"
Apr 24 21:27:42.285713 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.285660 2567 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 21:27:42.285713 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:42.285645 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-242.ec2.internal\" not found"
Apr 24 21:27:42.285847 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.285748 2567 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 21:27:42.285847 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.285756 2567 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 21:27:42.286007 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.285985 2567 factory.go:55] Registering systemd factory
Apr 24 21:27:42.286104 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.286062 2567 factory.go:223] Registration of the systemd container factory successfully
Apr 24 21:27:42.286273 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:42.286247 2567 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 21:27:42.286549 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.286537 2567 factory.go:153] Registering CRI-O factory
Apr 24 21:27:42.286627 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.286553 2567 factory.go:223] Registration of the crio container factory successfully
Apr 24 21:27:42.286627 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.286618 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 21:27:42.286737 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.286648 2567 factory.go:103] Registering Raw factory
Apr 24 21:27:42.286737 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.286664 2567 manager.go:1196] Started watching for new ooms in manager
Apr 24 21:27:42.287506 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.287492 2567 manager.go:319] Starting recovery of all containers
Apr 24 21:27:42.295088 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.295019 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gbfsn"
Apr 24 21:27:42.298369 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:42.297884 2567 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-142-242.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 24 21:27:42.298369 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:42.298024 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 24 21:27:42.299812 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.299793 2567 manager.go:324] Recovery completed
Apr 24 21:27:42.303906 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.303893 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:27:42.308601 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.308569 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-242.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:27:42.308700 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.308608 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-242.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:27:42.308700 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.308619 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-242.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:27:42.309124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.309110 2567 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 21:27:42.309124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.309122 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 21:27:42.309232 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.309138 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 21:27:42.311327 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.311315 2567 policy_none.go:49] "None policy: Start"
Apr 24 21:27:42.311368 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.311331 2567 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 21:27:42.311368 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.311341 2567 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 21:27:42.356111 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.356098 2567 manager.go:341] "Starting Device Plugin manager"
Apr 24 21:27:42.356216 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:42.356135 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 21:27:42.356216 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.356148 2567 server.go:85] "Starting device plugin registration server"
Apr 24 21:27:42.356391 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.356379 2567 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 21:27:42.356426 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.356392 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 21:27:42.356513 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.356496 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 21:27:42.356600 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.356590 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 21:27:42.356655 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.356603 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 21:27:42.357068 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:42.357047 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 21:27:42.357148 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:42.357089 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-242.ec2.internal\" not found"
Apr 24 21:27:42.410886 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.410863 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 21:27:42.412062 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.412041 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 21:27:42.412136 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.412067 2567 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 21:27:42.412136 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.412086 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 21:27:42.412136 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.412093 2567 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 21:27:42.412256 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:42.412174 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 21:27:42.415065 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.415043 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:27:42.456781 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.456745 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:27:42.457519 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.457495 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-242.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:27:42.457626 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.457526 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-242.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:27:42.457626 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.457536 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-242.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:27:42.457626 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.457559 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-242.ec2.internal"
Apr 24 21:27:42.465624 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.465610 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-242.ec2.internal"
Apr 24 21:27:42.465709 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:42.465631 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-242.ec2.internal\": node \"ip-10-0-142-242.ec2.internal\" not found"
Apr 24 21:27:42.477817 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:42.477802 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-242.ec2.internal\" not found"
Apr 24 21:27:42.513214 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.513183 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-242.ec2.internal"]
Apr 24 21:27:42.513309 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.513258 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:27:42.514038 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.514024 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-242.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:27:42.514109 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.514049 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-242.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:27:42.514109 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.514059 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-242.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:27:42.515281 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.515269 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:27:42.515428 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.515415 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal"
Apr 24 21:27:42.515471 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.515442 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:27:42.515971 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.515948 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-242.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:27:42.515971 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.515951 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-242.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:27:42.516096 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.515994 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-242.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:27:42.516096 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.516005 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-242.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:27:42.516096 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.515973 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-242.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:27:42.516096 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.516061 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-242.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:27:42.517298 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.517284 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-242.ec2.internal"
Apr 24 21:27:42.517373 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.517314 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:27:42.517977 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.517961 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-242.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:27:42.518049 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.517987 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-242.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:27:42.518049 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.517998 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-242.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:27:42.541153 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:42.541131 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-242.ec2.internal\" not found" node="ip-10-0-142-242.ec2.internal"
Apr 24 21:27:42.545380 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:42.545364 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-242.ec2.internal\" not found" node="ip-10-0-142-242.ec2.internal"
Apr 24 21:27:42.578528 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:42.578510 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-242.ec2.internal\" not found"
Apr 24 21:27:42.588162 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.588144 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/74155cf91f76647e831a5222c2f5f3cc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal\" (UID: \"74155cf91f76647e831a5222c2f5f3cc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal"
Apr 24 21:27:42.588237 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.588169 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74155cf91f76647e831a5222c2f5f3cc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal\" (UID: \"74155cf91f76647e831a5222c2f5f3cc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal"
Apr 24 21:27:42.588237 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.588185 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2b8950f4a211b4f3789a7a4ccb32fafe-config\") pod \"kube-apiserver-proxy-ip-10-0-142-242.ec2.internal\" (UID: \"2b8950f4a211b4f3789a7a4ccb32fafe\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-242.ec2.internal"
Apr 24 21:27:42.679142 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:42.679103 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-242.ec2.internal\" not found"
Apr 24 21:27:42.688600 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.688551 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/74155cf91f76647e831a5222c2f5f3cc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal\" (UID: \"74155cf91f76647e831a5222c2f5f3cc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal"
Apr 24 21:27:42.688718 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.688614 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74155cf91f76647e831a5222c2f5f3cc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal\" (UID: \"74155cf91f76647e831a5222c2f5f3cc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal"
Apr 24 21:27:42.688718 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.688652 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/74155cf91f76647e831a5222c2f5f3cc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal\" (UID: \"74155cf91f76647e831a5222c2f5f3cc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal"
Apr 24 21:27:42.688718 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.688655 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74155cf91f76647e831a5222c2f5f3cc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal\" (UID: \"74155cf91f76647e831a5222c2f5f3cc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal"
Apr 24 21:27:42.688718 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.688701 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2b8950f4a211b4f3789a7a4ccb32fafe-config\") pod \"kube-apiserver-proxy-ip-10-0-142-242.ec2.internal\" (UID: \"2b8950f4a211b4f3789a7a4ccb32fafe\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-242.ec2.internal"
Apr 24 21:27:42.688718 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.688658 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2b8950f4a211b4f3789a7a4ccb32fafe-config\") pod \"kube-apiserver-proxy-ip-10-0-142-242.ec2.internal\" (UID: \"2b8950f4a211b4f3789a7a4ccb32fafe\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-242.ec2.internal"
Apr 24 21:27:42.780025 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:42.779955 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-242.ec2.internal\" not found"
Apr 24 21:27:42.842509 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.842482 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal"
Apr 24 21:27:42.848107 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:42.848088 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-242.ec2.internal"
Apr 24 21:27:42.880633 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:42.880604 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-242.ec2.internal\" not found"
Apr 24 21:27:42.981157 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:42.981107 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-242.ec2.internal\" not found"
Apr 24 21:27:43.081861 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:43.081776 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-242.ec2.internal\" not found"
Apr 24 21:27:43.182475 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:43.182433 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-242.ec2.internal\" not found"
Apr 24 21:27:43.200971 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:43.200943 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 21:27:43.201522 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:43.201079 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:27:43.282768 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:43.282616 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-242.ec2.internal\" not found"
Apr 24 21:27:43.283859 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:43.283842 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 21:27:43.297520 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:43.297499 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:27:43.299169 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:43.299135 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:22:42 +0000 UTC" deadline="2027-11-05 02:57:45.627500503 +0000 UTC"
Apr 24 21:27:43.299169 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:43.299168 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13421h30m2.328334916s"
Apr 24 21:27:43.318211 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:43.318188 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-kzlj8"
Apr 24 21:27:43.319267 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:43.319238 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74155cf91f76647e831a5222c2f5f3cc.slice/crio-3ccdecb6fad53bbd520dc1e6e1dc97d6806a66a971db26a49f571f5676813e45 WatchSource:0}: Error finding container 3ccdecb6fad53bbd520dc1e6e1dc97d6806a66a971db26a49f571f5676813e45: Status 404 returned error can't find the container with id 3ccdecb6fad53bbd520dc1e6e1dc97d6806a66a971db26a49f571f5676813e45
Apr 24 21:27:43.319789 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:43.319766 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b8950f4a211b4f3789a7a4ccb32fafe.slice/crio-a96848b99f82f571a261342134a5c65d096ffab61bb06a3420acd06985c213aa WatchSource:0}: Error finding container a96848b99f82f571a261342134a5c65d096ffab61bb06a3420acd06985c213aa: Status 404 returned error can't find the container with id a96848b99f82f571a261342134a5c65d096ffab61bb06a3420acd06985c213aa
Apr 24 21:27:43.323736 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:43.323719 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:27:43.326720 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:43.326701 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-kzlj8"
Apr 24 21:27:43.357467 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:43.357412 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:27:43.383402 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:43.383382 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-242.ec2.internal\" not found"
Apr 24 21:27:43.414546 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:43.414508 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal" event={"ID":"74155cf91f76647e831a5222c2f5f3cc","Type":"ContainerStarted","Data":"3ccdecb6fad53bbd520dc1e6e1dc97d6806a66a971db26a49f571f5676813e45"}
Apr 24 21:27:43.415449 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:43.415424
2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-242.ec2.internal" event={"ID":"2b8950f4a211b4f3789a7a4ccb32fafe","Type":"ContainerStarted","Data":"a96848b99f82f571a261342134a5c65d096ffab61bb06a3420acd06985c213aa"} Apr 24 21:27:43.483643 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:43.483620 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-242.ec2.internal\" not found" Apr 24 21:27:43.507527 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:43.507512 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:43.584536 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:43.584513 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-242.ec2.internal" Apr 24 21:27:43.594379 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:43.594360 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:27:43.595473 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:43.595462 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal" Apr 24 21:27:43.613103 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:43.613027 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:27:43.859913 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:43.859886 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:44.039796 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.039759 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" 
reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:44.264695 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.264660 2567 apiserver.go:52] "Watching apiserver" Apr 24 21:27:44.271026 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.271000 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 21:27:44.271354 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.271329 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf","openshift-cluster-node-tuning-operator/tuned-xh68j","openshift-image-registry/node-ca-whp9g","openshift-multus/multus-2f989","openshift-multus/multus-additional-cni-plugins-x5xkp","openshift-network-diagnostics/network-check-target-jjnzv","kube-system/kube-apiserver-proxy-ip-10-0-142-242.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal","openshift-multus/network-metrics-daemon-x6s8x","openshift-network-operator/iptables-alerter-dhdqg","openshift-ovn-kubernetes/ovnkube-node-mxnmf","kube-system/konnectivity-agent-dzcxv"] Apr 24 21:27:44.273510 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.273489 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf" Apr 24 21:27:44.275830 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.275810 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.276120 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.276083 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:27:44.276321 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.276296 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:27:44.276692 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.276665 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-zgmsz\"" Apr 24 21:27:44.276784 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.276698 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:27:44.277169 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.277153 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-whp9g" Apr 24 21:27:44.277308 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.277291 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-2f989" Apr 24 21:27:44.278094 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.278077 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:44.278182 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.278170 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dcxfz\"" Apr 24 21:27:44.278475 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.278456 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:44.278629 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.278608 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x5xkp" Apr 24 21:27:44.279045 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.279028 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:27:44.279251 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.279218 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 21:27:44.279466 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.279445 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-sp2qp\"" Apr 24 21:27:44.280098 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.279763 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:27:44.280098 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.279945 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 21:27:44.280098 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.279942 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 21:27:44.280324 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.280216 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jb9zv\"" Apr 24 21:27:44.280324 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.280236 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 21:27:44.280324 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.280315 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:27:44.280724 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.280702 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:27:44.280929 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.280912 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:27:44.281342 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.281166 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-5gw7j\"" Apr 24 21:27:44.281342 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.281235 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jjnzv" Apr 24 21:27:44.281342 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:44.281296 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jjnzv" podUID="63ad21c9-3529-456b-b932-b0cb7555c6a5" Apr 24 21:27:44.282554 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.282480 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6s8x" Apr 24 21:27:44.282652 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:44.282552 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6s8x" podUID="52f8223b-f29e-4bac-bf1e-475d1a24a90c" Apr 24 21:27:44.282699 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.282655 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dhdqg" Apr 24 21:27:44.284125 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.284108 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.284728 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.284710 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:44.284811 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.284802 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 21:27:44.284879 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.284817 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-s8f2n\"" Apr 24 21:27:44.285054 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.285027 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:44.285696 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.285678 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-dzcxv" Apr 24 21:27:44.286520 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.286469 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 21:27:44.286646 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.286535 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 21:27:44.287368 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.287275 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 21:27:44.287531 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.287515 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 21:27:44.288066 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.288044 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-ncv4l\"" Apr 24 21:27:44.288266 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.288222 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 21:27:44.288821 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.288803 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:27:44.288994 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.288976 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 21:27:44.289097 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.289082 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 21:27:44.289256 
ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.289240 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wv7f2\"" Apr 24 21:27:44.290071 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.289858 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 21:27:44.297448 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.297422 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-modprobe-d\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.297542 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.297455 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-kubernetes\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.297542 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.297481 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-run\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.297542 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.297507 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-multus-cni-dir\") pod \"multus-2f989\" (UID: 
\"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989" Apr 24 21:27:44.297797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.297662 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-cni-netd\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.297797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.297701 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-host\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.297797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.297731 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-host-run-k8s-cni-cncf-io\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989" Apr 24 21:27:44.297797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.297757 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-node-log\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.297797 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.297781 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4dbd268-c323-40be-8c10-6478beb5cecc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf" Apr 24 21:27:44.298021 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.297805 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-cnibin\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp" Apr 24 21:27:44.298021 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.297829 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44dxm\" (UniqueName: \"kubernetes.io/projected/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-kube-api-access-44dxm\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp" Apr 24 21:27:44.298021 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.297851 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d61d5422-402f-4ecf-8f78-effcda482ce0-host-slash\") pod \"iptables-alerter-dhdqg\" (UID: \"d61d5422-402f-4ecf-8f78-effcda482ce0\") " pod="openshift-network-operator/iptables-alerter-dhdqg" Apr 24 21:27:44.298021 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.297874 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-run-ovn\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.298021 ip-10-0-142-242 
kubenswrapper[2567]: I0424 21:27:44.297895 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-sysconfig\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.298021 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.297914 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-sysctl-d\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.298021 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.297931 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-sysctl-conf\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.298021 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.297953 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-tuned\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.298021 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.297987 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs\") pod \"network-metrics-daemon-x6s8x\" (UID: 
\"52f8223b-f29e-4bac-bf1e-475d1a24a90c\") " pod="openshift-multus/network-metrics-daemon-x6s8x" Apr 24 21:27:44.298021 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298017 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thpb8\" (UniqueName: \"kubernetes.io/projected/52f8223b-f29e-4bac-bf1e-475d1a24a90c-kube-api-access-thpb8\") pod \"network-metrics-daemon-x6s8x\" (UID: \"52f8223b-f29e-4bac-bf1e-475d1a24a90c\") " pod="openshift-multus/network-metrics-daemon-x6s8x" Apr 24 21:27:44.298457 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298042 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-etc-openvswitch\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.298457 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298062 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-cni-bin\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.298457 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298079 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5dbf1909-aeaf-429a-9020-a9384e13a292-ovnkube-config\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.298457 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298120 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-systemd\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.298457 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298152 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-system-cni-dir\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp" Apr 24 21:27:44.298457 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298184 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp" Apr 24 21:27:44.298457 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298209 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f4dbd268-c323-40be-8c10-6478beb5cecc-device-dir\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf" Apr 24 21:27:44.298457 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298233 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b888a29e-e580-4114-8441-9109c5db53fd-serviceca\") pod \"node-ca-whp9g\" (UID: \"b888a29e-e580-4114-8441-9109c5db53fd\") " pod="openshift-image-registry/node-ca-whp9g" Apr 24 21:27:44.298457 
ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298265 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-cni-binary-copy\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.298457 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298299 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-multus-conf-dir\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.298457 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298344 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-systemd-units\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.298457 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298383 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsg2r\" (UniqueName: \"kubernetes.io/projected/f4dbd268-c323-40be-8c10-6478beb5cecc-kube-api-access-dsg2r\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf"
Apr 24 21:27:44.298457 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298406 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-sys\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j"
Apr 24 21:27:44.298457 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298433 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-os-release\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp"
Apr 24 21:27:44.299107 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298489 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.299107 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298525 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f4dbd268-c323-40be-8c10-6478beb5cecc-etc-selinux\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf"
Apr 24 21:27:44.299107 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298558 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-cnibin\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.299107 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298604 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-os-release\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.299107 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298628 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-log-socket\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.299107 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298649 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-lib-modules\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j"
Apr 24 21:27:44.299107 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298671 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-var-lib-kubelet\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j"
Apr 24 21:27:44.299107 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298694 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-tmp\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j"
Apr 24 21:27:44.299107 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298717 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp"
Apr 24 21:27:44.299107 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298744 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-host-var-lib-kubelet\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.299107 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298768 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f4dbd268-c323-40be-8c10-6478beb5cecc-sys-fs\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf"
Apr 24 21:27:44.299107 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298790 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf7sw\" (UniqueName: \"kubernetes.io/projected/b888a29e-e580-4114-8441-9109c5db53fd-kube-api-access-bf7sw\") pod \"node-ca-whp9g\" (UID: \"b888a29e-e580-4114-8441-9109c5db53fd\") " pod="openshift-image-registry/node-ca-whp9g"
Apr 24 21:27:44.299107 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298834 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-host-run-netns\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.299107 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298860 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-host-var-lib-cni-bin\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.299107 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298881 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-run-openvswitch\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.299107 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298905 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-run-ovn-kubernetes\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.299757 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298940 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdzwv\" (UniqueName: \"kubernetes.io/projected/5dbf1909-aeaf-429a-9020-a9384e13a292-kube-api-access-fdzwv\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.299757 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298974 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpsk5\" (UniqueName: \"kubernetes.io/projected/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-kube-api-access-wpsk5\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j"
Apr 24 21:27:44.299757 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.298995 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-cni-binary-copy\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp"
Apr 24 21:27:44.299757 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299009 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b888a29e-e580-4114-8441-9109c5db53fd-host\") pod \"node-ca-whp9g\" (UID: \"b888a29e-e580-4114-8441-9109c5db53fd\") " pod="openshift-image-registry/node-ca-whp9g"
Apr 24 21:27:44.299757 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299027 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-multus-daemon-config\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.299757 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299058 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-etc-kubernetes\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.299757 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299089 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdtz2\" (UniqueName: \"kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2\") pod \"network-check-target-jjnzv\" (UID: \"63ad21c9-3529-456b-b932-b0cb7555c6a5\") " pod="openshift-network-diagnostics/network-check-target-jjnzv"
Apr 24 21:27:44.299757 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299113 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-var-lib-openvswitch\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.299757 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299139 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp"
Apr 24 21:27:44.299757 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299165 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-system-cni-dir\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.299757 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299186 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-hostroot\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.299757 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299220 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-host-run-multus-certs\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.299757 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299243 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-slash\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.299757 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299269 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvm2q\" (UniqueName: \"kubernetes.io/projected/d61d5422-402f-4ecf-8f78-effcda482ce0-kube-api-access-nvm2q\") pod \"iptables-alerter-dhdqg\" (UID: \"d61d5422-402f-4ecf-8f78-effcda482ce0\") " pod="openshift-network-operator/iptables-alerter-dhdqg"
Apr 24 21:27:44.299757 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299297 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f4dbd268-c323-40be-8c10-6478beb5cecc-socket-dir\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf"
Apr 24 21:27:44.299757 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299321 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5dbf1909-aeaf-429a-9020-a9384e13a292-ovnkube-script-lib\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.300298 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299339 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/54da82a7-17e0-4f28-b08b-90fd402af6ec-agent-certs\") pod \"konnectivity-agent-dzcxv\" (UID: \"54da82a7-17e0-4f28-b08b-90fd402af6ec\") " pod="kube-system/konnectivity-agent-dzcxv"
Apr 24 21:27:44.300298 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299374 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f4dbd268-c323-40be-8c10-6478beb5cecc-registration-dir\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf"
Apr 24 21:27:44.300298 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299406 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g2vl\" (UniqueName: \"kubernetes.io/projected/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-kube-api-access-7g2vl\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.300298 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299434 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d61d5422-402f-4ecf-8f78-effcda482ce0-iptables-alerter-script\") pod \"iptables-alerter-dhdqg\" (UID: \"d61d5422-402f-4ecf-8f78-effcda482ce0\") " pod="openshift-network-operator/iptables-alerter-dhdqg"
Apr 24 21:27:44.300298 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299458 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-multus-socket-dir-parent\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.300298 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299483 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-host-var-lib-cni-multus\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.300298 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299522 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-kubelet\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.300298 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299557 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/54da82a7-17e0-4f28-b08b-90fd402af6ec-konnectivity-ca\") pod \"konnectivity-agent-dzcxv\" (UID: \"54da82a7-17e0-4f28-b08b-90fd402af6ec\") " pod="kube-system/konnectivity-agent-dzcxv"
Apr 24 21:27:44.300298 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299615 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-run-netns\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.300298 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299639 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-run-systemd\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.300298 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299662 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5dbf1909-aeaf-429a-9020-a9384e13a292-env-overrides\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.300298 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.299685 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5dbf1909-aeaf-429a-9020-a9384e13a292-ovn-node-metrics-cert\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.327769 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.327730 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:43 +0000 UTC" deadline="2027-10-24 18:32:12.260714672 +0000 UTC"
Apr 24 21:27:44.327769 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.327768 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13149h4m27.932948539s"
Apr 24 21:27:44.400840 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.400801 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.401004 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.400846 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f4dbd268-c323-40be-8c10-6478beb5cecc-etc-selinux\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf"
Apr 24 21:27:44.401004 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.400873 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-cnibin\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.401004 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.400898 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-os-release\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.401004 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.400925 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-log-socket\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.401004 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.400948 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-lib-modules\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j"
Apr 24 21:27:44.401004 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.400964 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-cnibin\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.401004 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.400983 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-var-lib-kubelet\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j"
Apr 24 21:27:44.401004 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.400992 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-log-socket\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.401004 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.401006 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-tmp\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j"
Apr 24 21:27:44.401441 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.401028 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f4dbd268-c323-40be-8c10-6478beb5cecc-etc-selinux\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf"
Apr 24 21:27:44.401441 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.401041 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-os-release\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.401441 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.401064 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.401441 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.401092 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp"
Apr 24 21:27:44.401441 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.401119 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-lib-modules\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j"
Apr 24 21:27:44.401441 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.401125 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-host-var-lib-kubelet\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.401441 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.401152 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f4dbd268-c323-40be-8c10-6478beb5cecc-sys-fs\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf"
Apr 24 21:27:44.401441 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.401178 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bf7sw\" (UniqueName: \"kubernetes.io/projected/b888a29e-e580-4114-8441-9109c5db53fd-kube-api-access-bf7sw\") pod \"node-ca-whp9g\" (UID: \"b888a29e-e580-4114-8441-9109c5db53fd\") " pod="openshift-image-registry/node-ca-whp9g"
Apr 24 21:27:44.401441 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.401198 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-host-run-netns\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.401441 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.401214 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-host-var-lib-kubelet\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.401441 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.401219 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-host-var-lib-cni-bin\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.401441 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.401328 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 21:27:44.401441 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.401265 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-host-var-lib-cni-bin\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.402061 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.401572 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-var-lib-kubelet\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j"
Apr 24 21:27:44.402061 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.401632 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp"
Apr 24 21:27:44.402061 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.401874 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f4dbd268-c323-40be-8c10-6478beb5cecc-sys-fs\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf"
Apr 24 21:27:44.402061 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.401944 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-host-run-netns\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.402061 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.401951 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-run-openvswitch\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.402061 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.401981 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-run-ovn-kubernetes\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.402061 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402009 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdzwv\" (UniqueName: \"kubernetes.io/projected/5dbf1909-aeaf-429a-9020-a9384e13a292-kube-api-access-fdzwv\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.402061 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402032 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpsk5\" (UniqueName: \"kubernetes.io/projected/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-kube-api-access-wpsk5\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j"
Apr 24 21:27:44.402061 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402035 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-run-openvswitch\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.402061 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402056 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-cni-binary-copy\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp"
Apr 24 21:27:44.402530 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402080 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b888a29e-e580-4114-8441-9109c5db53fd-host\") pod \"node-ca-whp9g\" (UID: \"b888a29e-e580-4114-8441-9109c5db53fd\") " pod="openshift-image-registry/node-ca-whp9g"
Apr 24 21:27:44.402530 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402107 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-run-ovn-kubernetes\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.402530 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402117 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-multus-daemon-config\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.402530 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402150 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-etc-kubernetes\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.402530 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402193 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdtz2\" (UniqueName: \"kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2\") pod \"network-check-target-jjnzv\" (UID: \"63ad21c9-3529-456b-b932-b0cb7555c6a5\") " pod="openshift-network-diagnostics/network-check-target-jjnzv"
Apr 24 21:27:44.402530 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402221 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-var-lib-openvswitch\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.402530 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402252 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp"
Apr 24 21:27:44.402530 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402280 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-system-cni-dir\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.402530 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402307 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-hostroot\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.402530 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402338 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-host-run-multus-certs\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.402530 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402362 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-slash\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.402530 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402393 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvm2q\" (UniqueName: \"kubernetes.io/projected/d61d5422-402f-4ecf-8f78-effcda482ce0-kube-api-access-nvm2q\") pod \"iptables-alerter-dhdqg\" (UID: \"d61d5422-402f-4ecf-8f78-effcda482ce0\") " pod="openshift-network-operator/iptables-alerter-dhdqg"
Apr 24 21:27:44.402530 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402418 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f4dbd268-c323-40be-8c10-6478beb5cecc-socket-dir\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf"
Apr 24 21:27:44.402530 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402444 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b888a29e-e580-4114-8441-9109c5db53fd-host\") pod \"node-ca-whp9g\" (UID: \"b888a29e-e580-4114-8441-9109c5db53fd\") " pod="openshift-image-registry/node-ca-whp9g"
Apr 24 21:27:44.402530 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402451 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5dbf1909-aeaf-429a-9020-a9384e13a292-ovnkube-script-lib\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.402530 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402496 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/54da82a7-17e0-4f28-b08b-90fd402af6ec-agent-certs\") pod \"konnectivity-agent-dzcxv\" (UID: \"54da82a7-17e0-4f28-b08b-90fd402af6ec\") " pod="kube-system/konnectivity-agent-dzcxv"
Apr 24 21:27:44.402530 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402523 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f4dbd268-c323-40be-8c10-6478beb5cecc-registration-dir\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf"
Apr 24
21:27:44.403312 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402549 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7g2vl\" (UniqueName: \"kubernetes.io/projected/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-kube-api-access-7g2vl\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989" Apr 24 21:27:44.403312 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402576 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d61d5422-402f-4ecf-8f78-effcda482ce0-iptables-alerter-script\") pod \"iptables-alerter-dhdqg\" (UID: \"d61d5422-402f-4ecf-8f78-effcda482ce0\") " pod="openshift-network-operator/iptables-alerter-dhdqg" Apr 24 21:27:44.403312 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402643 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-multus-socket-dir-parent\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989" Apr 24 21:27:44.403312 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402656 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-cni-binary-copy\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp" Apr 24 21:27:44.403312 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402668 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-host-var-lib-cni-multus\") pod \"multus-2f989\" (UID: 
\"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989" Apr 24 21:27:44.403312 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402688 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-multus-daemon-config\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989" Apr 24 21:27:44.403312 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402701 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-kubelet\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.403312 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402738 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/54da82a7-17e0-4f28-b08b-90fd402af6ec-konnectivity-ca\") pod \"konnectivity-agent-dzcxv\" (UID: \"54da82a7-17e0-4f28-b08b-90fd402af6ec\") " pod="kube-system/konnectivity-agent-dzcxv" Apr 24 21:27:44.403312 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402739 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-kubelet\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.403312 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402792 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-system-cni-dir\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " 
pod="openshift-multus/multus-2f989" Apr 24 21:27:44.403312 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402829 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-run-netns\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.403312 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402858 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-run-systemd\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.403312 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402887 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5dbf1909-aeaf-429a-9020-a9384e13a292-env-overrides\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.403312 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402915 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5dbf1909-aeaf-429a-9020-a9384e13a292-ovn-node-metrics-cert\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.403312 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402955 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-modprobe-d\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " 
pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.403312 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403002 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-slash\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.403312 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403016 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5dbf1909-aeaf-429a-9020-a9384e13a292-ovnkube-script-lib\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.404124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403053 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-kubernetes\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.404124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403066 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-etc-kubernetes\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989" Apr 24 21:27:44.404124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403088 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-host-run-multus-certs\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " 
pod="openshift-multus/multus-2f989" Apr 24 21:27:44.404124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403001 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-kubernetes\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.404124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403094 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-hostroot\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989" Apr 24 21:27:44.404124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.402703 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-host-var-lib-cni-multus\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989" Apr 24 21:27:44.404124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403143 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-var-lib-openvswitch\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.404124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403185 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-run-systemd\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 
21:27:44.404124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403188 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/54da82a7-17e0-4f28-b08b-90fd402af6ec-konnectivity-ca\") pod \"konnectivity-agent-dzcxv\" (UID: \"54da82a7-17e0-4f28-b08b-90fd402af6ec\") " pod="kube-system/konnectivity-agent-dzcxv" Apr 24 21:27:44.404124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403188 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f4dbd268-c323-40be-8c10-6478beb5cecc-socket-dir\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf" Apr 24 21:27:44.404124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403190 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-multus-socket-dir-parent\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989" Apr 24 21:27:44.404124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403256 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f4dbd268-c323-40be-8c10-6478beb5cecc-registration-dir\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf" Apr 24 21:27:44.404124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403296 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-modprobe-d\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " 
pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.404124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403329 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-run-netns\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.404124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403352 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-run\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.404124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403385 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-multus-cni-dir\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989" Apr 24 21:27:44.404124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403436 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-cni-netd\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.404124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403462 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-host\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" 
Apr 24 21:27:44.404954 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403486 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-host-run-k8s-cni-cncf-io\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989" Apr 24 21:27:44.404954 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403520 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-node-log\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.404954 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403530 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-host-run-k8s-cni-cncf-io\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989" Apr 24 21:27:44.404954 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403550 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4dbd268-c323-40be-8c10-6478beb5cecc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf" Apr 24 21:27:44.404954 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403556 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5dbf1909-aeaf-429a-9020-a9384e13a292-env-overrides\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.404954 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403574 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-cnibin\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp" Apr 24 21:27:44.404954 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403602 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-node-log\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.404954 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403607 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-run\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.404954 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403633 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-cnibin\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp" Apr 24 21:27:44.404954 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403639 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4dbd268-c323-40be-8c10-6478beb5cecc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf" Apr 24 21:27:44.404954 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403662 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-multus-cni-dir\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989" Apr 24 21:27:44.404954 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403651 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-host\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.404954 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403684 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44dxm\" (UniqueName: \"kubernetes.io/projected/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-kube-api-access-44dxm\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp" Apr 24 21:27:44.404954 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403710 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-cni-netd\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.404954 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403712 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d61d5422-402f-4ecf-8f78-effcda482ce0-host-slash\") pod \"iptables-alerter-dhdqg\" (UID: \"d61d5422-402f-4ecf-8f78-effcda482ce0\") 
" pod="openshift-network-operator/iptables-alerter-dhdqg" Apr 24 21:27:44.404954 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403742 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-run-ovn\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.404954 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403777 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-sysconfig\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.404954 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403795 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d61d5422-402f-4ecf-8f78-effcda482ce0-host-slash\") pod \"iptables-alerter-dhdqg\" (UID: \"d61d5422-402f-4ecf-8f78-effcda482ce0\") " pod="openshift-network-operator/iptables-alerter-dhdqg" Apr 24 21:27:44.405789 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403800 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-sysctl-d\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.405789 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403818 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-run-ovn\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.405789 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403839 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-sysctl-conf\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.405789 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403859 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-tuned\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.405789 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403864 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-sysconfig\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.405789 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403892 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-sysctl-d\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.405789 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403920 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs\") pod \"network-metrics-daemon-x6s8x\" (UID: \"52f8223b-f29e-4bac-bf1e-475d1a24a90c\") " 
pod="openshift-multus/network-metrics-daemon-x6s8x" Apr 24 21:27:44.405789 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403946 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thpb8\" (UniqueName: \"kubernetes.io/projected/52f8223b-f29e-4bac-bf1e-475d1a24a90c-kube-api-access-thpb8\") pod \"network-metrics-daemon-x6s8x\" (UID: \"52f8223b-f29e-4bac-bf1e-475d1a24a90c\") " pod="openshift-multus/network-metrics-daemon-x6s8x" Apr 24 21:27:44.405789 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403995 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-etc-openvswitch\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.405789 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404021 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-cni-bin\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.405789 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:44.404037 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:44.405789 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404044 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5dbf1909-aeaf-429a-9020-a9384e13a292-ovnkube-config\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" Apr 24 21:27:44.405789 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404068 
2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-systemd\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.405789 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:44.404115 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs podName:52f8223b-f29e-4bac-bf1e-475d1a24a90c nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.904076917 +0000 UTC m=+3.002489432 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs") pod "network-metrics-daemon-x6s8x" (UID: "52f8223b-f29e-4bac-bf1e-475d1a24a90c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:44.405789 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404114 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp" Apr 24 21:27:44.405789 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404126 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-systemd\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j" Apr 24 21:27:44.405789 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404134 2567 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-system-cni-dir\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp"
Apr 24 21:27:44.406565 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.403965 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-sysctl-conf\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j"
Apr 24 21:27:44.406565 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404161 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp"
Apr 24 21:27:44.406565 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404179 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-host-cni-bin\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.406565 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404248 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-system-cni-dir\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp"
Apr 24 21:27:44.406565 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404279 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp"
Apr 24 21:27:44.406565 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404259 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f4dbd268-c323-40be-8c10-6478beb5cecc-device-dir\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf"
Apr 24 21:27:44.406565 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404217 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-etc-openvswitch\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.406565 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404315 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b888a29e-e580-4114-8441-9109c5db53fd-serviceca\") pod \"node-ca-whp9g\" (UID: \"b888a29e-e580-4114-8441-9109c5db53fd\") " pod="openshift-image-registry/node-ca-whp9g"
Apr 24 21:27:44.406565 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404340 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-cni-binary-copy\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.406565 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404315 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f4dbd268-c323-40be-8c10-6478beb5cecc-device-dir\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf"
Apr 24 21:27:44.406565 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404363 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-multus-conf-dir\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.406565 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404402 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-multus-conf-dir\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.406565 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404431 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-systemd-units\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.406565 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404458 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsg2r\" (UniqueName: \"kubernetes.io/projected/f4dbd268-c323-40be-8c10-6478beb5cecc-kube-api-access-dsg2r\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf"
Apr 24 21:27:44.406565 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404483 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-sys\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j"
Apr 24 21:27:44.406565 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404506 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-os-release\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp"
Apr 24 21:27:44.406565 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404604 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-os-release\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp"
Apr 24 21:27:44.407395 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404643 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5dbf1909-aeaf-429a-9020-a9384e13a292-systemd-units\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.407395 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404691 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d61d5422-402f-4ecf-8f78-effcda482ce0-iptables-alerter-script\") pod \"iptables-alerter-dhdqg\" (UID: \"d61d5422-402f-4ecf-8f78-effcda482ce0\") " pod="openshift-network-operator/iptables-alerter-dhdqg"
Apr 24 21:27:44.407395 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404715 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b888a29e-e580-4114-8441-9109c5db53fd-serviceca\") pod \"node-ca-whp9g\" (UID: \"b888a29e-e580-4114-8441-9109c5db53fd\") " pod="openshift-image-registry/node-ca-whp9g"
Apr 24 21:27:44.407395 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404752 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-sys\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j"
Apr 24 21:27:44.407395 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404751 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5dbf1909-aeaf-429a-9020-a9384e13a292-ovnkube-config\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.407395 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.404824 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-cni-binary-copy\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.407395 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.405311 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-tmp\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j"
Apr 24 21:27:44.407395 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.406530 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5dbf1909-aeaf-429a-9020-a9384e13a292-ovn-node-metrics-cert\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.407395 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.407304 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/54da82a7-17e0-4f28-b08b-90fd402af6ec-agent-certs\") pod \"konnectivity-agent-dzcxv\" (UID: \"54da82a7-17e0-4f28-b08b-90fd402af6ec\") " pod="kube-system/konnectivity-agent-dzcxv"
Apr 24 21:27:44.407395 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.407372 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-etc-tuned\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j"
Apr 24 21:27:44.410971 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.410946 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf7sw\" (UniqueName: \"kubernetes.io/projected/b888a29e-e580-4114-8441-9109c5db53fd-kube-api-access-bf7sw\") pod \"node-ca-whp9g\" (UID: \"b888a29e-e580-4114-8441-9109c5db53fd\") " pod="openshift-image-registry/node-ca-whp9g"
Apr 24 21:27:44.411086 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:44.411075 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:44.411145 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:44.411101 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:44.411145 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:44.411115 2567 projected.go:194] Error preparing data for projected volume kube-api-access-mdtz2 for pod openshift-network-diagnostics/network-check-target-jjnzv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:44.411238 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:44.411183 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2 podName:63ad21c9-3529-456b-b932-b0cb7555c6a5 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.911164781 +0000 UTC m=+3.009577284 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-mdtz2" (UniqueName: "kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2") pod "network-check-target-jjnzv" (UID: "63ad21c9-3529-456b-b932-b0cb7555c6a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:44.413003 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.411606 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpsk5\" (UniqueName: \"kubernetes.io/projected/7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0-kube-api-access-wpsk5\") pod \"tuned-xh68j\" (UID: \"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0\") " pod="openshift-cluster-node-tuning-operator/tuned-xh68j"
Apr 24 21:27:44.413339 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.413316 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g2vl\" (UniqueName: \"kubernetes.io/projected/7e34f57d-6789-43e3-8a4f-f5b55dd1ace1-kube-api-access-7g2vl\") pod \"multus-2f989\" (UID: \"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1\") " pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.414850 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.414823 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thpb8\" (UniqueName: \"kubernetes.io/projected/52f8223b-f29e-4bac-bf1e-475d1a24a90c-kube-api-access-thpb8\") pod \"network-metrics-daemon-x6s8x\" (UID: \"52f8223b-f29e-4bac-bf1e-475d1a24a90c\") " pod="openshift-multus/network-metrics-daemon-x6s8x"
Apr 24 21:27:44.415786 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.415280 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdzwv\" (UniqueName: \"kubernetes.io/projected/5dbf1909-aeaf-429a-9020-a9384e13a292-kube-api-access-fdzwv\") pod \"ovnkube-node-mxnmf\" (UID: \"5dbf1909-aeaf-429a-9020-a9384e13a292\") " pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.415786 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.415621 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsg2r\" (UniqueName: \"kubernetes.io/projected/f4dbd268-c323-40be-8c10-6478beb5cecc-kube-api-access-dsg2r\") pod \"aws-ebs-csi-driver-node-wplmf\" (UID: \"f4dbd268-c323-40be-8c10-6478beb5cecc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf"
Apr 24 21:27:44.415786 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.415722 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvm2q\" (UniqueName: \"kubernetes.io/projected/d61d5422-402f-4ecf-8f78-effcda482ce0-kube-api-access-nvm2q\") pod \"iptables-alerter-dhdqg\" (UID: \"d61d5422-402f-4ecf-8f78-effcda482ce0\") " pod="openshift-network-operator/iptables-alerter-dhdqg"
Apr 24 21:27:44.417715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.417692 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44dxm\" (UniqueName: \"kubernetes.io/projected/99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae-kube-api-access-44dxm\") pod \"multus-additional-cni-plugins-x5xkp\" (UID: \"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae\") " pod="openshift-multus/multus-additional-cni-plugins-x5xkp"
Apr 24 21:27:44.585269 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.585170 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf"
Apr 24 21:27:44.594287 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.594258 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xh68j"
Apr 24 21:27:44.602779 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.602759 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-whp9g"
Apr 24 21:27:44.607425 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.607405 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2f989"
Apr 24 21:27:44.612975 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.612957 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x5xkp"
Apr 24 21:27:44.620506 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.620487 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dhdqg"
Apr 24 21:27:44.626107 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.626089 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:27:44.631659 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.631642 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dzcxv"
Apr 24 21:27:44.907572 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:44.907543 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs\") pod \"network-metrics-daemon-x6s8x\" (UID: \"52f8223b-f29e-4bac-bf1e-475d1a24a90c\") " pod="openshift-multus/network-metrics-daemon-x6s8x"
Apr 24 21:27:44.907740 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:44.907690 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:44.907793 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:44.907764 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs podName:52f8223b-f29e-4bac-bf1e-475d1a24a90c nodeName:}" failed. No retries permitted until 2026-04-24 21:27:45.907742176 +0000 UTC m=+4.006154690 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs") pod "network-metrics-daemon-x6s8x" (UID: "52f8223b-f29e-4bac-bf1e-475d1a24a90c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:44.972707 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:44.972679 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb888a29e_e580_4114_8441_9109c5db53fd.slice/crio-07680c9761d050a0742f61a49b494be918c4b6d6b5122440a63df7daab89fae2 WatchSource:0}: Error finding container 07680c9761d050a0742f61a49b494be918c4b6d6b5122440a63df7daab89fae2: Status 404 returned error can't find the container with id 07680c9761d050a0742f61a49b494be918c4b6d6b5122440a63df7daab89fae2
Apr 24 21:27:44.973834 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:44.973807 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4dbd268_c323_40be_8c10_6478beb5cecc.slice/crio-42eb9baef57cbfb310a832969b7782f9f10915561b89eef73b08aa0c90311b34 WatchSource:0}: Error finding container 42eb9baef57cbfb310a832969b7782f9f10915561b89eef73b08aa0c90311b34: Status 404 returned error can't find the container with id 42eb9baef57cbfb310a832969b7782f9f10915561b89eef73b08aa0c90311b34
Apr 24 21:27:44.975230 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:44.974668 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54da82a7_17e0_4f28_b08b_90fd402af6ec.slice/crio-e014d4679ce923fee96cd93b9fb484c774dd4d344aa93d6038782bfcfb3d6e87 WatchSource:0}: Error finding container e014d4679ce923fee96cd93b9fb484c774dd4d344aa93d6038782bfcfb3d6e87: Status 404 returned error can't find the container with id e014d4679ce923fee96cd93b9fb484c774dd4d344aa93d6038782bfcfb3d6e87
Apr 24 21:27:44.977370 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:44.977345 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd61d5422_402f_4ecf_8f78_effcda482ce0.slice/crio-70f2a3a79acbff2fa6af4d614b5bc409ec5e7c786db7f4ef514e1e3eab54232f WatchSource:0}: Error finding container 70f2a3a79acbff2fa6af4d614b5bc409ec5e7c786db7f4ef514e1e3eab54232f: Status 404 returned error can't find the container with id 70f2a3a79acbff2fa6af4d614b5bc409ec5e7c786db7f4ef514e1e3eab54232f
Apr 24 21:27:44.979055 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:44.979033 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e34f57d_6789_43e3_8a4f_f5b55dd1ace1.slice/crio-3dbc309f94f9328061f4840d74752d4fe025e77670fa3c356f468c6948123030 WatchSource:0}: Error finding container 3dbc309f94f9328061f4840d74752d4fe025e77670fa3c356f468c6948123030: Status 404 returned error can't find the container with id 3dbc309f94f9328061f4840d74752d4fe025e77670fa3c356f468c6948123030
Apr 24 21:27:44.980686 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:44.980638 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fce4fa5_eb1b_4b79_a704_d0ce7f7a5cc0.slice/crio-a3b818ebb41df0924ea902750d3df89672d11c513dbbfe2b67c70454e79f8e21 WatchSource:0}: Error finding container a3b818ebb41df0924ea902750d3df89672d11c513dbbfe2b67c70454e79f8e21: Status 404 returned error can't find the container with id a3b818ebb41df0924ea902750d3df89672d11c513dbbfe2b67c70454e79f8e21
Apr 24 21:27:44.981474 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:44.981389 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99a3f1ef_3d99_4a08_ab2a_bf05ceca8dae.slice/crio-d86e1bc7a9896b23337395899ce97620876f6b7c2aa9b7b6a0a943189c80f2dc WatchSource:0}: Error finding container d86e1bc7a9896b23337395899ce97620876f6b7c2aa9b7b6a0a943189c80f2dc: Status 404 returned error can't find the container with id d86e1bc7a9896b23337395899ce97620876f6b7c2aa9b7b6a0a943189c80f2dc
Apr 24 21:27:44.982491 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:27:44.982416 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dbf1909_aeaf_429a_9020_a9384e13a292.slice/crio-ee5bd3adee911100890f056792babb8ddf8b2b977139351ba200a13aae10ba84 WatchSource:0}: Error finding container ee5bd3adee911100890f056792babb8ddf8b2b977139351ba200a13aae10ba84: Status 404 returned error can't find the container with id ee5bd3adee911100890f056792babb8ddf8b2b977139351ba200a13aae10ba84
Apr 24 21:27:45.008392 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:45.008365 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdtz2\" (UniqueName: \"kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2\") pod \"network-check-target-jjnzv\" (UID: \"63ad21c9-3529-456b-b932-b0cb7555c6a5\") " pod="openshift-network-diagnostics/network-check-target-jjnzv"
Apr 24 21:27:45.008513 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:45.008496 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:45.008575 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:45.008521 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:45.008575 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:45.008534 2567 projected.go:194] Error preparing data for projected volume kube-api-access-mdtz2 for pod openshift-network-diagnostics/network-check-target-jjnzv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:45.008682 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:45.008609 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2 podName:63ad21c9-3529-456b-b932-b0cb7555c6a5 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:46.008574387 +0000 UTC m=+4.106986881 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-mdtz2" (UniqueName: "kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2") pod "network-check-target-jjnzv" (UID: "63ad21c9-3529-456b-b932-b0cb7555c6a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:45.328413 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:45.328323 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:43 +0000 UTC" deadline="2027-12-29 02:11:44.454644139 +0000 UTC"
Apr 24 21:27:45.328413 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:45.328366 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14716h43m59.126281177s"
Apr 24 21:27:45.429497 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:45.428774 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-242.ec2.internal" event={"ID":"2b8950f4a211b4f3789a7a4ccb32fafe","Type":"ContainerStarted","Data":"c0845ab9006c766f0178a879df8c668efb169bda2307e8749d6055eb489cfdc7"}
Apr 24 21:27:45.434040 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:45.433979 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" event={"ID":"5dbf1909-aeaf-429a-9020-a9384e13a292","Type":"ContainerStarted","Data":"ee5bd3adee911100890f056792babb8ddf8b2b977139351ba200a13aae10ba84"}
Apr 24 21:27:45.436295 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:45.436270 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5xkp" event={"ID":"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae","Type":"ContainerStarted","Data":"d86e1bc7a9896b23337395899ce97620876f6b7c2aa9b7b6a0a943189c80f2dc"}
Apr 24 21:27:45.443566 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:45.443539 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2f989" event={"ID":"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1","Type":"ContainerStarted","Data":"3dbc309f94f9328061f4840d74752d4fe025e77670fa3c356f468c6948123030"}
Apr 24 21:27:45.456643 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:45.456617 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dhdqg" event={"ID":"d61d5422-402f-4ecf-8f78-effcda482ce0","Type":"ContainerStarted","Data":"70f2a3a79acbff2fa6af4d614b5bc409ec5e7c786db7f4ef514e1e3eab54232f"}
Apr 24 21:27:45.460436 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:45.460406 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dzcxv" event={"ID":"54da82a7-17e0-4f28-b08b-90fd402af6ec","Type":"ContainerStarted","Data":"e014d4679ce923fee96cd93b9fb484c774dd4d344aa93d6038782bfcfb3d6e87"}
Apr 24 21:27:45.468203 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:45.468179 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-whp9g" event={"ID":"b888a29e-e580-4114-8441-9109c5db53fd","Type":"ContainerStarted","Data":"07680c9761d050a0742f61a49b494be918c4b6d6b5122440a63df7daab89fae2"}
Apr 24 21:27:45.476045 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:45.476017 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xh68j" event={"ID":"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0","Type":"ContainerStarted","Data":"a3b818ebb41df0924ea902750d3df89672d11c513dbbfe2b67c70454e79f8e21"}
Apr 24 21:27:45.479747 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:45.479721 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf" event={"ID":"f4dbd268-c323-40be-8c10-6478beb5cecc","Type":"ContainerStarted","Data":"42eb9baef57cbfb310a832969b7782f9f10915561b89eef73b08aa0c90311b34"}
Apr 24 21:27:45.917786 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:45.917755 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs\") pod \"network-metrics-daemon-x6s8x\" (UID: \"52f8223b-f29e-4bac-bf1e-475d1a24a90c\") " pod="openshift-multus/network-metrics-daemon-x6s8x"
Apr 24 21:27:45.917947 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:45.917925 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:45.918022 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:45.918009 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs podName:52f8223b-f29e-4bac-bf1e-475d1a24a90c nodeName:}" failed. No retries permitted until 2026-04-24 21:27:47.917987504 +0000 UTC m=+6.016400017 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs") pod "network-metrics-daemon-x6s8x" (UID: "52f8223b-f29e-4bac-bf1e-475d1a24a90c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:46.019471 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:46.018844 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdtz2\" (UniqueName: \"kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2\") pod \"network-check-target-jjnzv\" (UID: \"63ad21c9-3529-456b-b932-b0cb7555c6a5\") " pod="openshift-network-diagnostics/network-check-target-jjnzv"
Apr 24 21:27:46.019471 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:46.019005 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:46.019471 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:46.019027 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:46.019471 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:46.019040 2567 projected.go:194] Error preparing data for projected volume kube-api-access-mdtz2 for pod openshift-network-diagnostics/network-check-target-jjnzv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:46.019471 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:46.019097 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2 podName:63ad21c9-3529-456b-b932-b0cb7555c6a5 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:48.019079078 +0000 UTC m=+6.117491583 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-mdtz2" (UniqueName: "kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2") pod "network-check-target-jjnzv" (UID: "63ad21c9-3529-456b-b932-b0cb7555c6a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:46.413768 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:46.413739 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6s8x"
Apr 24 21:27:46.414297 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:46.413909 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6s8x" podUID="52f8223b-f29e-4bac-bf1e-475d1a24a90c"
Apr 24 21:27:46.416030 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:46.416010 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jjnzv"
Apr 24 21:27:46.416155 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:46.416107 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jjnzv" podUID="63ad21c9-3529-456b-b932-b0cb7555c6a5"
Apr 24 21:27:46.498867 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:46.498827 2567 generic.go:358] "Generic (PLEG): container finished" podID="74155cf91f76647e831a5222c2f5f3cc" containerID="d60d25059502096578c17e850072593e654e7309e0d7e43616fb825ec6414ee5" exitCode=0
Apr 24 21:27:46.499038 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:46.498939 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal" event={"ID":"74155cf91f76647e831a5222c2f5f3cc","Type":"ContainerDied","Data":"d60d25059502096578c17e850072593e654e7309e0d7e43616fb825ec6414ee5"}
Apr 24 21:27:46.514523 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:46.514477 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-242.ec2.internal" podStartSLOduration=3.514461109 podStartE2EDuration="3.514461109s" podCreationTimestamp="2026-04-24 21:27:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:45.443418108 +0000 UTC m=+3.541830630" watchObservedRunningTime="2026-04-24 21:27:46.514461109 +0000 UTC m=+4.612873645"
Apr 24 21:27:47.504609 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:47.503920 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal" event={"ID":"74155cf91f76647e831a5222c2f5f3cc","Type":"ContainerStarted","Data":"dc7a22840805fbf998d6046a53bb198408be8196e8e2f2343e29c10b65467733"}
Apr 24 21:27:47.520329 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:47.519917 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-242.ec2.internal" podStartSLOduration=4.519903799 podStartE2EDuration="4.519903799s" podCreationTimestamp="2026-04-24 21:27:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:47.519108684 +0000 UTC m=+5.617521199" watchObservedRunningTime="2026-04-24 21:27:47.519903799 +0000 UTC m=+5.618316312"
Apr 24 21:27:47.935055 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:47.935017 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs\") pod \"network-metrics-daemon-x6s8x\" (UID: \"52f8223b-f29e-4bac-bf1e-475d1a24a90c\") " pod="openshift-multus/network-metrics-daemon-x6s8x"
Apr 24 21:27:47.935255 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:47.935183 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:47.935255 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:47.935248 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs podName:52f8223b-f29e-4bac-bf1e-475d1a24a90c nodeName:}" failed. No retries permitted until 2026-04-24 21:27:51.935229889 +0000 UTC m=+10.033642393 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs") pod "network-metrics-daemon-x6s8x" (UID: "52f8223b-f29e-4bac-bf1e-475d1a24a90c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:48.035676 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:48.035547 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdtz2\" (UniqueName: \"kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2\") pod \"network-check-target-jjnzv\" (UID: \"63ad21c9-3529-456b-b932-b0cb7555c6a5\") " pod="openshift-network-diagnostics/network-check-target-jjnzv" Apr 24 21:27:48.035835 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:48.035742 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:48.035835 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:48.035762 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:48.035835 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:48.035775 2567 projected.go:194] Error preparing data for projected volume kube-api-access-mdtz2 for pod openshift-network-diagnostics/network-check-target-jjnzv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:48.035835 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:48.035828 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2 podName:63ad21c9-3529-456b-b932-b0cb7555c6a5 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:27:52.035811711 +0000 UTC m=+10.134224202 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-mdtz2" (UniqueName: "kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2") pod "network-check-target-jjnzv" (UID: "63ad21c9-3529-456b-b932-b0cb7555c6a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:48.414102 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:48.414072 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6s8x" Apr 24 21:27:48.414272 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:48.414072 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jjnzv" Apr 24 21:27:48.414272 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:48.414218 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6s8x" podUID="52f8223b-f29e-4bac-bf1e-475d1a24a90c" Apr 24 21:27:48.414368 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:48.414271 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jjnzv" podUID="63ad21c9-3529-456b-b932-b0cb7555c6a5" Apr 24 21:27:50.413036 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:50.412984 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6s8x" Apr 24 21:27:50.413685 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:50.413029 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jjnzv" Apr 24 21:27:50.413685 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:50.413635 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6s8x" podUID="52f8223b-f29e-4bac-bf1e-475d1a24a90c" Apr 24 21:27:50.413827 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:50.413745 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jjnzv" podUID="63ad21c9-3529-456b-b932-b0cb7555c6a5" Apr 24 21:27:51.970038 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:51.969989 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs\") pod \"network-metrics-daemon-x6s8x\" (UID: \"52f8223b-f29e-4bac-bf1e-475d1a24a90c\") " pod="openshift-multus/network-metrics-daemon-x6s8x" Apr 24 21:27:51.970498 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:51.970160 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:51.970498 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:51.970243 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs podName:52f8223b-f29e-4bac-bf1e-475d1a24a90c nodeName:}" failed. No retries permitted until 2026-04-24 21:27:59.970222924 +0000 UTC m=+18.068635430 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs") pod "network-metrics-daemon-x6s8x" (UID: "52f8223b-f29e-4bac-bf1e-475d1a24a90c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:52.071069 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:52.071029 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdtz2\" (UniqueName: \"kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2\") pod \"network-check-target-jjnzv\" (UID: \"63ad21c9-3529-456b-b932-b0cb7555c6a5\") " pod="openshift-network-diagnostics/network-check-target-jjnzv" Apr 24 21:27:52.071239 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:52.071214 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:52.071312 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:52.071240 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:52.071312 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:52.071255 2567 projected.go:194] Error preparing data for projected volume kube-api-access-mdtz2 for pod openshift-network-diagnostics/network-check-target-jjnzv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:52.071406 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:52.071318 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2 podName:63ad21c9-3529-456b-b932-b0cb7555c6a5 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:28:00.071299508 +0000 UTC m=+18.169712005 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-mdtz2" (UniqueName: "kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2") pod "network-check-target-jjnzv" (UID: "63ad21c9-3529-456b-b932-b0cb7555c6a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:52.413448 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:52.413368 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6s8x" Apr 24 21:27:52.413683 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:52.413502 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6s8x" podUID="52f8223b-f29e-4bac-bf1e-475d1a24a90c" Apr 24 21:27:52.413683 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:52.413553 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jjnzv" Apr 24 21:27:52.413683 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:52.413633 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jjnzv" podUID="63ad21c9-3529-456b-b932-b0cb7555c6a5" Apr 24 21:27:54.412714 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:54.412674 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jjnzv" Apr 24 21:27:54.413190 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:54.412715 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6s8x" Apr 24 21:27:54.413190 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:54.412812 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jjnzv" podUID="63ad21c9-3529-456b-b932-b0cb7555c6a5" Apr 24 21:27:54.413190 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:54.412957 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6s8x" podUID="52f8223b-f29e-4bac-bf1e-475d1a24a90c" Apr 24 21:27:56.251978 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:56.251937 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-vwv9h"] Apr 24 21:27:56.268995 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:56.268967 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vwv9h" Apr 24 21:27:56.269155 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:56.269046 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vwv9h" podUID="d846199e-de26-4ab9-80f8-977e44e27d81" Apr 24 21:27:56.402659 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:56.402625 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d846199e-de26-4ab9-80f8-977e44e27d81-kubelet-config\") pod \"global-pull-secret-syncer-vwv9h\" (UID: \"d846199e-de26-4ab9-80f8-977e44e27d81\") " pod="kube-system/global-pull-secret-syncer-vwv9h" Apr 24 21:27:56.402818 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:56.402677 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret\") pod \"global-pull-secret-syncer-vwv9h\" (UID: \"d846199e-de26-4ab9-80f8-977e44e27d81\") " pod="kube-system/global-pull-secret-syncer-vwv9h" Apr 24 21:27:56.402818 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:56.402763 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d846199e-de26-4ab9-80f8-977e44e27d81-dbus\") pod \"global-pull-secret-syncer-vwv9h\" (UID: \"d846199e-de26-4ab9-80f8-977e44e27d81\") " pod="kube-system/global-pull-secret-syncer-vwv9h" Apr 24 21:27:56.412763 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:56.412729 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jjnzv" Apr 24 21:27:56.412888 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:56.412849 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jjnzv" podUID="63ad21c9-3529-456b-b932-b0cb7555c6a5" Apr 24 21:27:56.412888 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:56.412857 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6s8x" Apr 24 21:27:56.412990 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:56.412954 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x6s8x" podUID="52f8223b-f29e-4bac-bf1e-475d1a24a90c" Apr 24 21:27:56.503872 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:56.503794 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d846199e-de26-4ab9-80f8-977e44e27d81-dbus\") pod \"global-pull-secret-syncer-vwv9h\" (UID: \"d846199e-de26-4ab9-80f8-977e44e27d81\") " pod="kube-system/global-pull-secret-syncer-vwv9h" Apr 24 21:27:56.504017 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:56.503897 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d846199e-de26-4ab9-80f8-977e44e27d81-kubelet-config\") pod \"global-pull-secret-syncer-vwv9h\" (UID: \"d846199e-de26-4ab9-80f8-977e44e27d81\") " pod="kube-system/global-pull-secret-syncer-vwv9h" Apr 24 21:27:56.504017 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:56.503918 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret\") pod \"global-pull-secret-syncer-vwv9h\" (UID: \"d846199e-de26-4ab9-80f8-977e44e27d81\") " pod="kube-system/global-pull-secret-syncer-vwv9h" Apr 24 21:27:56.504135 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:56.504014 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:56.504135 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:56.504038 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d846199e-de26-4ab9-80f8-977e44e27d81-kubelet-config\") pod \"global-pull-secret-syncer-vwv9h\" (UID: \"d846199e-de26-4ab9-80f8-977e44e27d81\") " pod="kube-system/global-pull-secret-syncer-vwv9h" Apr 24 21:27:56.504135 ip-10-0-142-242 
kubenswrapper[2567]: I0424 21:27:56.504008 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d846199e-de26-4ab9-80f8-977e44e27d81-dbus\") pod \"global-pull-secret-syncer-vwv9h\" (UID: \"d846199e-de26-4ab9-80f8-977e44e27d81\") " pod="kube-system/global-pull-secret-syncer-vwv9h" Apr 24 21:27:56.504135 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:56.504070 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret podName:d846199e-de26-4ab9-80f8-977e44e27d81 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:57.004056948 +0000 UTC m=+15.102469441 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret") pod "global-pull-secret-syncer-vwv9h" (UID: "d846199e-de26-4ab9-80f8-977e44e27d81") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:57.007625 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:57.007599 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret\") pod \"global-pull-secret-syncer-vwv9h\" (UID: \"d846199e-de26-4ab9-80f8-977e44e27d81\") " pod="kube-system/global-pull-secret-syncer-vwv9h" Apr 24 21:27:57.007708 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:57.007677 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:57.007749 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:57.007733 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret podName:d846199e-de26-4ab9-80f8-977e44e27d81 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:27:58.00772035 +0000 UTC m=+16.106132846 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret") pod "global-pull-secret-syncer-vwv9h" (UID: "d846199e-de26-4ab9-80f8-977e44e27d81") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:58.015959 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:58.015930 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret\") pod \"global-pull-secret-syncer-vwv9h\" (UID: \"d846199e-de26-4ab9-80f8-977e44e27d81\") " pod="kube-system/global-pull-secret-syncer-vwv9h" Apr 24 21:27:58.016514 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:58.016030 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:58.016514 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:58.016079 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret podName:d846199e-de26-4ab9-80f8-977e44e27d81 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:00.01606575 +0000 UTC m=+18.114478246 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret") pod "global-pull-secret-syncer-vwv9h" (UID: "d846199e-de26-4ab9-80f8-977e44e27d81") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:58.412516 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:58.412444 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6s8x" Apr 24 21:27:58.412516 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:58.412484 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vwv9h" Apr 24 21:27:58.412775 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:27:58.412447 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jjnzv" Apr 24 21:27:58.412775 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:58.412576 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6s8x" podUID="52f8223b-f29e-4bac-bf1e-475d1a24a90c" Apr 24 21:27:58.412775 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:58.412658 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jjnzv" podUID="63ad21c9-3529-456b-b932-b0cb7555c6a5" Apr 24 21:27:58.412775 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:27:58.412736 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-vwv9h" podUID="d846199e-de26-4ab9-80f8-977e44e27d81" Apr 24 21:28:00.031116 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:00.031070 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs\") pod \"network-metrics-daemon-x6s8x\" (UID: \"52f8223b-f29e-4bac-bf1e-475d1a24a90c\") " pod="openshift-multus/network-metrics-daemon-x6s8x" Apr 24 21:28:00.031636 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:00.031241 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret\") pod \"global-pull-secret-syncer-vwv9h\" (UID: \"d846199e-de26-4ab9-80f8-977e44e27d81\") " pod="kube-system/global-pull-secret-syncer-vwv9h" Apr 24 21:28:00.031636 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:00.031243 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:00.031636 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:00.031305 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:28:00.031636 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:00.031350 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs podName:52f8223b-f29e-4bac-bf1e-475d1a24a90c nodeName:}" failed. No retries permitted until 2026-04-24 21:28:16.031326704 +0000 UTC m=+34.129739198 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs") pod "network-metrics-daemon-x6s8x" (UID: "52f8223b-f29e-4bac-bf1e-475d1a24a90c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:00.031636 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:00.031371 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret podName:d846199e-de26-4ab9-80f8-977e44e27d81 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:04.031360924 +0000 UTC m=+22.129773416 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret") pod "global-pull-secret-syncer-vwv9h" (UID: "d846199e-de26-4ab9-80f8-977e44e27d81") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:28:00.132444 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:00.132405 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdtz2\" (UniqueName: \"kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2\") pod \"network-check-target-jjnzv\" (UID: \"63ad21c9-3529-456b-b932-b0cb7555c6a5\") " pod="openshift-network-diagnostics/network-check-target-jjnzv" Apr 24 21:28:00.132626 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:00.132596 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:28:00.132626 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:00.132632 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:28:00.132724 ip-10-0-142-242 
kubenswrapper[2567]: E0424 21:28:00.132642 2567 projected.go:194] Error preparing data for projected volume kube-api-access-mdtz2 for pod openshift-network-diagnostics/network-check-target-jjnzv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:28:00.132724 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:00.132701 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2 podName:63ad21c9-3529-456b-b932-b0cb7555c6a5 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:16.132684612 +0000 UTC m=+34.231097106 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-mdtz2" (UniqueName: "kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2") pod "network-check-target-jjnzv" (UID: "63ad21c9-3529-456b-b932-b0cb7555c6a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:28:00.412912 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:00.412872 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vwv9h"
Apr 24 21:28:00.413072 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:00.412872 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jjnzv"
Apr 24 21:28:00.413072 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:00.412999 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vwv9h" podUID="d846199e-de26-4ab9-80f8-977e44e27d81"
Apr 24 21:28:00.413072 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:00.413054 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jjnzv" podUID="63ad21c9-3529-456b-b932-b0cb7555c6a5"
Apr 24 21:28:00.413072 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:00.412880 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6s8x"
Apr 24 21:28:00.413282 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:00.413142 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6s8x" podUID="52f8223b-f29e-4bac-bf1e-475d1a24a90c"
Apr 24 21:28:01.629611 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:01.629553 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-nzxhn"]
Apr 24 21:28:01.644686 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:01.643226 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-nzxhn"
Apr 24 21:28:01.655224 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:01.650474 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 21:28:01.655224 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:01.650877 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 21:28:01.655224 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:01.651797 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wwmb2\""
Apr 24 21:28:01.745755 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:01.745498 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6c3b2011-26ac-4109-9898-ad37c3d322dd-hosts-file\") pod \"node-resolver-nzxhn\" (UID: \"6c3b2011-26ac-4109-9898-ad37c3d322dd\") " pod="openshift-dns/node-resolver-nzxhn"
Apr 24 21:28:01.745850 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:01.745819 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6c3b2011-26ac-4109-9898-ad37c3d322dd-tmp-dir\") pod \"node-resolver-nzxhn\" (UID: \"6c3b2011-26ac-4109-9898-ad37c3d322dd\") " pod="openshift-dns/node-resolver-nzxhn"
Apr 24 21:28:01.745920 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:01.745856 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9hxn\" (UniqueName: \"kubernetes.io/projected/6c3b2011-26ac-4109-9898-ad37c3d322dd-kube-api-access-z9hxn\") pod \"node-resolver-nzxhn\" (UID: \"6c3b2011-26ac-4109-9898-ad37c3d322dd\") " pod="openshift-dns/node-resolver-nzxhn"
Apr 24 21:28:01.846237 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:01.846210 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6c3b2011-26ac-4109-9898-ad37c3d322dd-tmp-dir\") pod \"node-resolver-nzxhn\" (UID: \"6c3b2011-26ac-4109-9898-ad37c3d322dd\") " pod="openshift-dns/node-resolver-nzxhn"
Apr 24 21:28:01.846339 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:01.846257 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9hxn\" (UniqueName: \"kubernetes.io/projected/6c3b2011-26ac-4109-9898-ad37c3d322dd-kube-api-access-z9hxn\") pod \"node-resolver-nzxhn\" (UID: \"6c3b2011-26ac-4109-9898-ad37c3d322dd\") " pod="openshift-dns/node-resolver-nzxhn"
Apr 24 21:28:01.846339 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:01.846312 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6c3b2011-26ac-4109-9898-ad37c3d322dd-hosts-file\") pod \"node-resolver-nzxhn\" (UID: \"6c3b2011-26ac-4109-9898-ad37c3d322dd\") " pod="openshift-dns/node-resolver-nzxhn"
Apr 24 21:28:01.846444 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:01.846387 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6c3b2011-26ac-4109-9898-ad37c3d322dd-hosts-file\") pod \"node-resolver-nzxhn\" (UID: \"6c3b2011-26ac-4109-9898-ad37c3d322dd\") " pod="openshift-dns/node-resolver-nzxhn"
Apr 24 21:28:01.846574 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:01.846551 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6c3b2011-26ac-4109-9898-ad37c3d322dd-tmp-dir\") pod \"node-resolver-nzxhn\" (UID: \"6c3b2011-26ac-4109-9898-ad37c3d322dd\") " pod="openshift-dns/node-resolver-nzxhn"
Apr 24 21:28:01.860428 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:01.860266 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9hxn\" (UniqueName: \"kubernetes.io/projected/6c3b2011-26ac-4109-9898-ad37c3d322dd-kube-api-access-z9hxn\") pod \"node-resolver-nzxhn\" (UID: \"6c3b2011-26ac-4109-9898-ad37c3d322dd\") " pod="openshift-dns/node-resolver-nzxhn"
Apr 24 21:28:01.967654 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:01.967548 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-nzxhn"
Apr 24 21:28:02.014743 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:28:02.014704 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c3b2011_26ac_4109_9898_ad37c3d322dd.slice/crio-7098f38ad61c4835a77c50e31765badd860cdb137a3674542c78eb75ffd6f3b0 WatchSource:0}: Error finding container 7098f38ad61c4835a77c50e31765badd860cdb137a3674542c78eb75ffd6f3b0: Status 404 returned error can't find the container with id 7098f38ad61c4835a77c50e31765badd860cdb137a3674542c78eb75ffd6f3b0
Apr 24 21:28:02.413921 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.413890 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jjnzv"
Apr 24 21:28:02.414097 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.413933 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vwv9h"
Apr 24 21:28:02.414097 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:02.413972 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jjnzv" podUID="63ad21c9-3529-456b-b932-b0cb7555c6a5"
Apr 24 21:28:02.414097 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:02.414020 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vwv9h" podUID="d846199e-de26-4ab9-80f8-977e44e27d81"
Apr 24 21:28:02.414097 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.414068 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6s8x"
Apr 24 21:28:02.414301 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:02.414151 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6s8x" podUID="52f8223b-f29e-4bac-bf1e-475d1a24a90c"
Apr 24 21:28:02.527944 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.527906 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xh68j" event={"ID":"7fce4fa5-eb1b-4b79-a704-d0ce7f7a5cc0","Type":"ContainerStarted","Data":"275849c2b83896de80cb853f0185902a6d2543e74e06c0bbe3cd36c1ec35c737"}
Apr 24 21:28:02.529320 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.529290 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf" event={"ID":"f4dbd268-c323-40be-8c10-6478beb5cecc","Type":"ContainerStarted","Data":"3f5df93ef776974ec495ef1b2e674cd88aa34cd72fe2dae3f2d80db8609ec6e0"}
Apr 24 21:28:02.530575 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.530553 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nzxhn" event={"ID":"6c3b2011-26ac-4109-9898-ad37c3d322dd","Type":"ContainerStarted","Data":"7b9b7100d3e61717709fe2d6c56105af3534df99ae4e2690819efd084e5025c7"}
Apr 24 21:28:02.530686 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.530600 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nzxhn" event={"ID":"6c3b2011-26ac-4109-9898-ad37c3d322dd","Type":"ContainerStarted","Data":"7098f38ad61c4835a77c50e31765badd860cdb137a3674542c78eb75ffd6f3b0"}
Apr 24 21:28:02.533755 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.533720 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" event={"ID":"5dbf1909-aeaf-429a-9020-a9384e13a292","Type":"ContainerStarted","Data":"bae307f230f5c8d910e756501a31825368eaeec12f710d35f7bbf0a779a7d913"}
Apr 24 21:28:02.533755 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.533748 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" event={"ID":"5dbf1909-aeaf-429a-9020-a9384e13a292","Type":"ContainerStarted","Data":"5a3bc35eadcec24deca923b2b13589a511e72889f13cd3675407b142b0783c2c"}
Apr 24 21:28:02.533904 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.533762 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" event={"ID":"5dbf1909-aeaf-429a-9020-a9384e13a292","Type":"ContainerStarted","Data":"7fd9d689d89399b150f93803a02f26ab1c10a3852bc9948e3e283a3b1f937edd"}
Apr 24 21:28:02.533904 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.533776 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" event={"ID":"5dbf1909-aeaf-429a-9020-a9384e13a292","Type":"ContainerStarted","Data":"646925a76bb4c6608f5a681a297daf171d995e47499006b6fb9affaa11a5c6d6"}
Apr 24 21:28:02.533904 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.533788 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" event={"ID":"5dbf1909-aeaf-429a-9020-a9384e13a292","Type":"ContainerStarted","Data":"d7d09888a1ad396addcf9c4a624d147f228568cf4e32afe5e449001a243a4011"}
Apr 24 21:28:02.533904 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.533800 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" event={"ID":"5dbf1909-aeaf-429a-9020-a9384e13a292","Type":"ContainerStarted","Data":"ce8ff19707f614b258d74cf7d6dd714492e79ebda91f3c5d97f4bfa947b2b1e5"}
Apr 24 21:28:02.535175 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.535151 2567 generic.go:358] "Generic (PLEG): container finished" podID="99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae" containerID="efb4d2db4348c9848cae2ee9bbc0a0c05091b322f580db42c28674a5f90b92b2" exitCode=0
Apr 24 21:28:02.535271 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.535217 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5xkp" event={"ID":"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae","Type":"ContainerDied","Data":"efb4d2db4348c9848cae2ee9bbc0a0c05091b322f580db42c28674a5f90b92b2"}
Apr 24 21:28:02.536731 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.536706 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2f989" event={"ID":"7e34f57d-6789-43e3-8a4f-f5b55dd1ace1","Type":"ContainerStarted","Data":"d7a6d0e6a3c640776f9b8acbe59ad0da3a1b425d16957a2575a6b08a94c3b1c0"}
Apr 24 21:28:02.538211 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.538188 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dzcxv" event={"ID":"54da82a7-17e0-4f28-b08b-90fd402af6ec","Type":"ContainerStarted","Data":"d8ff43e5cffc537b4f1607c50b1e474d09c95b3946a01c36c5a55be280c5bd6f"}
Apr 24 21:28:02.539518 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.539496 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-whp9g" event={"ID":"b888a29e-e580-4114-8441-9109c5db53fd","Type":"ContainerStarted","Data":"8d772d39b2938171965b3818a9a31bb88775db1da4861661a147fe9a86574563"}
Apr 24 21:28:02.547443 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.547407 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-xh68j" podStartSLOduration=4.077770501 podStartE2EDuration="20.547398234s" podCreationTimestamp="2026-04-24 21:27:42 +0000 UTC" firstStartedPulling="2026-04-24 21:27:44.982679837 +0000 UTC m=+3.081092334" lastFinishedPulling="2026-04-24 21:28:01.452307573 +0000 UTC m=+19.550720067" observedRunningTime="2026-04-24 21:28:02.54720275 +0000 UTC m=+20.645615276" watchObservedRunningTime="2026-04-24 21:28:02.547398234 +0000 UTC m=+20.645810746"
Apr 24 21:28:02.580020 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.579985 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-nzxhn" podStartSLOduration=1.579973622 podStartE2EDuration="1.579973622s" podCreationTimestamp="2026-04-24 21:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:02.563453504 +0000 UTC m=+20.661866017" watchObservedRunningTime="2026-04-24 21:28:02.579973622 +0000 UTC m=+20.678386117"
Apr 24 21:28:02.605706 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.605672 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-whp9g" podStartSLOduration=4.102838664 podStartE2EDuration="20.605661135s" podCreationTimestamp="2026-04-24 21:27:42 +0000 UTC" firstStartedPulling="2026-04-24 21:27:44.9751838 +0000 UTC m=+3.073596305" lastFinishedPulling="2026-04-24 21:28:01.478006276 +0000 UTC m=+19.576418776" observedRunningTime="2026-04-24 21:28:02.580429419 +0000 UTC m=+20.678841933" watchObservedRunningTime="2026-04-24 21:28:02.605661135 +0000 UTC m=+20.704073647"
Apr 24 21:28:02.605786 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.605736 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2f989" podStartSLOduration=4.0778525 podStartE2EDuration="20.605733171s" podCreationTimestamp="2026-04-24 21:27:42 +0000 UTC" firstStartedPulling="2026-04-24 21:27:44.980830592 +0000 UTC m=+3.079243085" lastFinishedPulling="2026-04-24 21:28:01.508711261 +0000 UTC m=+19.607123756" observedRunningTime="2026-04-24 21:28:02.605636191 +0000 UTC m=+20.704048703" watchObservedRunningTime="2026-04-24 21:28:02.605733171 +0000 UTC m=+20.704145683"
Apr 24 21:28:02.622206 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.622169 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-dzcxv" podStartSLOduration=4.146336169 podStartE2EDuration="20.622157777s" podCreationTimestamp="2026-04-24 21:27:42 +0000 UTC" firstStartedPulling="2026-04-24 21:27:44.976450879 +0000 UTC m=+3.074863371" lastFinishedPulling="2026-04-24 21:28:01.452272479 +0000 UTC m=+19.550684979" observedRunningTime="2026-04-24 21:28:02.62168431 +0000 UTC m=+20.720096823" watchObservedRunningTime="2026-04-24 21:28:02.622157777 +0000 UTC m=+20.720570287"
Apr 24 21:28:02.626507 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:02.626487 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 21:28:03.368764 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:03.368656 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:28:02.62650502Z","UUID":"2dfaafe5-cab0-436c-9b9b-1be7ac756756","Handler":null,"Name":"","Endpoint":""}
Apr 24 21:28:03.371333 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:03.371307 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 21:28:03.371471 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:03.371345 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 21:28:03.542976 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:03.542942 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dhdqg" event={"ID":"d61d5422-402f-4ecf-8f78-effcda482ce0","Type":"ContainerStarted","Data":"ca0b1d54f72b12df07b42cb3ecec92c5c9ec391b507a1fa106bffb6bedd8284a"}
Apr 24 21:28:03.545284 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:03.545243 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf" event={"ID":"f4dbd268-c323-40be-8c10-6478beb5cecc","Type":"ContainerStarted","Data":"2d487de6de3b4bba33779e64e1fd1bba40342c37629779659e52e27baa1e0302"}
Apr 24 21:28:03.545284 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:03.545278 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf" event={"ID":"f4dbd268-c323-40be-8c10-6478beb5cecc","Type":"ContainerStarted","Data":"4eb4f7226520a28da0ad135542656bba80a4836f37ae74aca1b33be355dd07d6"}
Apr 24 21:28:03.559657 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:03.559615 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-dhdqg" podStartSLOduration=5.060695233 podStartE2EDuration="21.559603254s" podCreationTimestamp="2026-04-24 21:27:42 +0000 UTC" firstStartedPulling="2026-04-24 21:27:44.979040698 +0000 UTC m=+3.077453192" lastFinishedPulling="2026-04-24 21:28:01.477948718 +0000 UTC m=+19.576361213" observedRunningTime="2026-04-24 21:28:03.559156336 +0000 UTC m=+21.657568849" watchObservedRunningTime="2026-04-24 21:28:03.559603254 +0000 UTC m=+21.658015761"
Apr 24 21:28:03.578460 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:03.578350 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wplmf" podStartSLOduration=3.128671808 podStartE2EDuration="21.578334356s" podCreationTimestamp="2026-04-24 21:27:42 +0000 UTC" firstStartedPulling="2026-04-24 21:27:44.976245396 +0000 UTC m=+3.074657894" lastFinishedPulling="2026-04-24 21:28:03.425907941 +0000 UTC m=+21.524320442" observedRunningTime="2026-04-24 21:28:03.577435617 +0000 UTC m=+21.675848164" watchObservedRunningTime="2026-04-24 21:28:03.578334356 +0000 UTC m=+21.676746870"
Apr 24 21:28:04.066904 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:04.066872 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret\") pod \"global-pull-secret-syncer-vwv9h\" (UID: \"d846199e-de26-4ab9-80f8-977e44e27d81\") " pod="kube-system/global-pull-secret-syncer-vwv9h"
Apr 24 21:28:04.067108 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:04.067031 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:28:04.067108 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:04.067098 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret podName:d846199e-de26-4ab9-80f8-977e44e27d81 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:12.067078452 +0000 UTC m=+30.165490952 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret") pod "global-pull-secret-syncer-vwv9h" (UID: "d846199e-de26-4ab9-80f8-977e44e27d81") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:28:04.413215 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:04.413131 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6s8x"
Apr 24 21:28:04.413717 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:04.413260 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6s8x" podUID="52f8223b-f29e-4bac-bf1e-475d1a24a90c"
Apr 24 21:28:04.413717 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:04.413131 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vwv9h"
Apr 24 21:28:04.413717 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:04.413340 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vwv9h" podUID="d846199e-de26-4ab9-80f8-977e44e27d81"
Apr 24 21:28:04.413717 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:04.413131 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jjnzv"
Apr 24 21:28:04.413717 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:04.413395 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jjnzv" podUID="63ad21c9-3529-456b-b932-b0cb7555c6a5"
Apr 24 21:28:04.550730 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:04.550692 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" event={"ID":"5dbf1909-aeaf-429a-9020-a9384e13a292","Type":"ContainerStarted","Data":"cb9b5403fcf1c30f2b2c126cbb8df3d24ca1d345e3bd77f8c949a8378e898676"}
Apr 24 21:28:06.412593 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:06.412461 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jjnzv"
Apr 24 21:28:06.412983 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:06.412508 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vwv9h"
Apr 24 21:28:06.412983 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:06.412689 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jjnzv" podUID="63ad21c9-3529-456b-b932-b0cb7555c6a5"
Apr 24 21:28:06.412983 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:06.412542 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6s8x"
Apr 24 21:28:06.412983 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:06.412791 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vwv9h" podUID="d846199e-de26-4ab9-80f8-977e44e27d81"
Apr 24 21:28:06.412983 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:06.412873 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6s8x" podUID="52f8223b-f29e-4bac-bf1e-475d1a24a90c"
Apr 24 21:28:06.558030 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:06.557695 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" event={"ID":"5dbf1909-aeaf-429a-9020-a9384e13a292","Type":"ContainerStarted","Data":"fa193f5244bcb24a88dd2b50b830622270af02d635b9a810a32032a89aec5588"}
Apr 24 21:28:06.559073 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:06.558092 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:28:06.559073 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:06.558121 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:28:06.572945 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:06.572902 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:28:06.601943 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:06.601893 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf" podStartSLOduration=7.863166507 podStartE2EDuration="24.601880347s" podCreationTimestamp="2026-04-24 21:27:42 +0000 UTC" firstStartedPulling="2026-04-24 21:27:44.98446852 +0000 UTC m=+3.082881016" lastFinishedPulling="2026-04-24 21:28:01.723182362 +0000 UTC m=+19.821594856" observedRunningTime="2026-04-24 21:28:06.601359219 +0000 UTC m=+24.699771765" watchObservedRunningTime="2026-04-24 21:28:06.601880347 +0000 UTC m=+24.700292860"
Apr 24 21:28:07.452869 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:07.452845 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-dzcxv"
Apr 24 21:28:07.453388 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:07.453373 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-dzcxv"
Apr 24 21:28:07.560522 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:07.560493 2567 generic.go:358] "Generic (PLEG): container finished" podID="99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae" containerID="b82880d9aacef72c68bca0b8289959428cc76d736d62b0b51d0507570021d077" exitCode=0
Apr 24 21:28:07.560663 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:07.560574 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5xkp" event={"ID":"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae","Type":"ContainerDied","Data":"b82880d9aacef72c68bca0b8289959428cc76d736d62b0b51d0507570021d077"}
Apr 24 21:28:07.560830 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:07.560814 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-dzcxv"
Apr 24 21:28:07.561173 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:07.561149 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:28:07.561346 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:07.561333 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-dzcxv"
Apr 24 21:28:07.575712 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:07.575551 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:28:08.412512 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:08.412478 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6s8x"
Apr 24 21:28:08.412693 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:08.412489 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jjnzv"
Apr 24 21:28:08.412693 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:08.412478 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vwv9h"
Apr 24 21:28:08.412693 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:08.412621 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6s8x" podUID="52f8223b-f29e-4bac-bf1e-475d1a24a90c"
Apr 24 21:28:08.412693 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:08.412688 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jjnzv" podUID="63ad21c9-3529-456b-b932-b0cb7555c6a5"
Apr 24 21:28:08.412829 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:08.412775 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vwv9h" podUID="d846199e-de26-4ab9-80f8-977e44e27d81"
Apr 24 21:28:08.516514 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:08.516482 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vwv9h"]
Apr 24 21:28:08.519115 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:08.519082 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jjnzv"]
Apr 24 21:28:08.519655 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:08.519636 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x6s8x"]
Apr 24 21:28:08.564607 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:08.564559 2567 generic.go:358] "Generic (PLEG): container finished" podID="99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae" containerID="c19d3b10d77d970dbd8e154184db7119bce54cc6b8d00950dd0f80adb08b708b" exitCode=0
Apr 24 21:28:08.564751 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:08.564614 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5xkp" event={"ID":"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae","Type":"ContainerDied","Data":"c19d3b10d77d970dbd8e154184db7119bce54cc6b8d00950dd0f80adb08b708b"}
Apr 24 21:28:08.564751 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:08.564667 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jjnzv"
Apr 24 21:28:08.564751 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:08.564667 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vwv9h"
Apr 24 21:28:08.564855 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:08.564753 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6s8x"
Apr 24 21:28:08.564855 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:08.564834 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vwv9h" podUID="d846199e-de26-4ab9-80f8-977e44e27d81"
Apr 24 21:28:08.564970 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:08.564926 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6s8x" podUID="52f8223b-f29e-4bac-bf1e-475d1a24a90c"
Apr 24 21:28:08.565005 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:08.564986 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jjnzv" podUID="63ad21c9-3529-456b-b932-b0cb7555c6a5"
Apr 24 21:28:09.571084 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:09.571004 2567 generic.go:358] "Generic (PLEG): container finished" podID="99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae" containerID="e42d0b13e185d1dd8ddf8981f229b1a3755b1fb907c02ab9cd22ca4441396e8c" exitCode=0
Apr 24 21:28:09.571427 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:09.571095 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5xkp" event={"ID":"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae","Type":"ContainerDied","Data":"e42d0b13e185d1dd8ddf8981f229b1a3755b1fb907c02ab9cd22ca4441396e8c"}
Apr 24 21:28:10.412703 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:10.412674 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vwv9h"
Apr 24 21:28:10.412810 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:10.412746 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6s8x"
Apr 24 21:28:10.412908 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:10.412875 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jjnzv"
Apr 24 21:28:10.413033 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:10.412892 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vwv9h" podUID="d846199e-de26-4ab9-80f8-977e44e27d81"
Apr 24 21:28:10.413033 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:10.412990 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jjnzv" podUID="63ad21c9-3529-456b-b932-b0cb7555c6a5"
Apr 24 21:28:10.413147 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:10.413066 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6s8x" podUID="52f8223b-f29e-4bac-bf1e-475d1a24a90c"
Apr 24 21:28:12.131866 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:12.131652 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret\") pod \"global-pull-secret-syncer-vwv9h\" (UID: \"d846199e-de26-4ab9-80f8-977e44e27d81\") " pod="kube-system/global-pull-secret-syncer-vwv9h"
Apr 24 21:28:12.132235 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:12.131758 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:28:12.132235 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:12.131998 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret podName:d846199e-de26-4ab9-80f8-977e44e27d81 nodeName:}" failed.
No retries permitted until 2026-04-24 21:28:28.131975259 +0000 UTC m=+46.230387765 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret") pod "global-pull-secret-syncer-vwv9h" (UID: "d846199e-de26-4ab9-80f8-977e44e27d81") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:28:12.414026 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:12.413991 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jjnzv" Apr 24 21:28:12.414187 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:12.414103 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jjnzv" podUID="63ad21c9-3529-456b-b932-b0cb7555c6a5" Apr 24 21:28:12.414187 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:12.414157 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vwv9h" Apr 24 21:28:12.414269 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:12.414233 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vwv9h" podUID="d846199e-de26-4ab9-80f8-977e44e27d81" Apr 24 21:28:12.414374 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:12.414353 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6s8x" Apr 24 21:28:12.414468 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:12.414448 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6s8x" podUID="52f8223b-f29e-4bac-bf1e-475d1a24a90c" Apr 24 21:28:14.412534 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.412502 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jjnzv" Apr 24 21:28:14.412989 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.412508 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vwv9h" Apr 24 21:28:14.412989 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:14.412636 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jjnzv" podUID="63ad21c9-3529-456b-b932-b0cb7555c6a5" Apr 24 21:28:14.412989 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:14.412710 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-vwv9h" podUID="d846199e-de26-4ab9-80f8-977e44e27d81" Apr 24 21:28:14.412989 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.412517 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6s8x" Apr 24 21:28:14.412989 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:14.412832 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6s8x" podUID="52f8223b-f29e-4bac-bf1e-475d1a24a90c" Apr 24 21:28:14.762542 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.762514 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-242.ec2.internal" event="NodeReady" Apr 24 21:28:14.762723 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.762671 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:28:14.801384 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.798681 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-79d7dbdd8d-zcbl7"] Apr 24 21:28:14.830101 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.830061 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jhpmf"] Apr 24 21:28:14.830267 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.830188 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:14.833031 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.833008 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6crsc\"" Apr 24 21:28:14.833147 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.833053 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 21:28:14.833227 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.833007 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 21:28:14.833288 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.833256 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 21:28:14.839118 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.838629 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 21:28:14.848279 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.848261 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xk55r"] Apr 24 21:28:14.848425 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.848401 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-jhpmf" Apr 24 21:28:14.850976 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.850819 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:28:14.850976 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.850836 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:28:14.851122 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.850982 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xsnkf\"" Apr 24 21:28:14.863050 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.863031 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-79d7dbdd8d-zcbl7"] Apr 24 21:28:14.863125 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.863057 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jhpmf"] Apr 24 21:28:14.863125 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.863069 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xk55r"] Apr 24 21:28:14.863222 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.863161 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xk55r" Apr 24 21:28:14.865322 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.865304 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ssv7d\"" Apr 24 21:28:14.865416 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.865339 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:28:14.865416 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.865311 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:28:14.865671 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.865648 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:28:14.948963 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.948914 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg4nr\" (UniqueName: \"kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-kube-api-access-sg4nr\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:14.949128 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.949018 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-tmp-dir\") pod \"dns-default-jhpmf\" (UID: \"30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e\") " pod="openshift-dns/dns-default-jhpmf" Apr 24 21:28:14.949128 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.949056 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-z9wqt\" (UniqueName: \"kubernetes.io/projected/caa2f14d-7161-4066-9e78-ae036846d1b5-kube-api-access-z9wqt\") pod \"ingress-canary-xk55r\" (UID: \"caa2f14d-7161-4066-9e78-ae036846d1b5\") " pod="openshift-ingress-canary/ingress-canary-xk55r" Apr 24 21:28:14.949128 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.949081 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d52bab75-adb9-452b-a696-d309c8c9d8c3-installation-pull-secrets\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:14.949128 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.949113 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caa2f14d-7161-4066-9e78-ae036846d1b5-cert\") pod \"ingress-canary-xk55r\" (UID: \"caa2f14d-7161-4066-9e78-ae036846d1b5\") " pod="openshift-ingress-canary/ingress-canary-xk55r" Apr 24 21:28:14.949337 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.949234 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d52bab75-adb9-452b-a696-d309c8c9d8c3-image-registry-private-configuration\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:14.949337 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.949270 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-certificates\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: 
\"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:14.949337 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.949299 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-bound-sa-token\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:14.949337 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.949331 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l47nb\" (UniqueName: \"kubernetes.io/projected/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-kube-api-access-l47nb\") pod \"dns-default-jhpmf\" (UID: \"30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e\") " pod="openshift-dns/dns-default-jhpmf" Apr 24 21:28:14.949493 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.949418 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-config-volume\") pod \"dns-default-jhpmf\" (UID: \"30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e\") " pod="openshift-dns/dns-default-jhpmf" Apr 24 21:28:14.949493 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.949447 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:14.949493 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.949470 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-metrics-tls\") pod \"dns-default-jhpmf\" (UID: \"30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e\") " pod="openshift-dns/dns-default-jhpmf" Apr 24 21:28:14.949493 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.949487 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d52bab75-adb9-452b-a696-d309c8c9d8c3-ca-trust-extracted\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:14.949692 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:14.949510 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d52bab75-adb9-452b-a696-d309c8c9d8c3-trusted-ca\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:15.050507 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.050435 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d52bab75-adb9-452b-a696-d309c8c9d8c3-trusted-ca\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:15.050507 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.050470 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sg4nr\" (UniqueName: \"kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-kube-api-access-sg4nr\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 
21:28:15.050507 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.050499 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-tmp-dir\") pod \"dns-default-jhpmf\" (UID: \"30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e\") " pod="openshift-dns/dns-default-jhpmf" Apr 24 21:28:15.050802 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.050515 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9wqt\" (UniqueName: \"kubernetes.io/projected/caa2f14d-7161-4066-9e78-ae036846d1b5-kube-api-access-z9wqt\") pod \"ingress-canary-xk55r\" (UID: \"caa2f14d-7161-4066-9e78-ae036846d1b5\") " pod="openshift-ingress-canary/ingress-canary-xk55r" Apr 24 21:28:15.050802 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.050531 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d52bab75-adb9-452b-a696-d309c8c9d8c3-installation-pull-secrets\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:15.050802 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.050549 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caa2f14d-7161-4066-9e78-ae036846d1b5-cert\") pod \"ingress-canary-xk55r\" (UID: \"caa2f14d-7161-4066-9e78-ae036846d1b5\") " pod="openshift-ingress-canary/ingress-canary-xk55r" Apr 24 21:28:15.050802 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.050612 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d52bab75-adb9-452b-a696-d309c8c9d8c3-image-registry-private-configuration\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: 
\"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:15.050802 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.050630 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-certificates\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:15.050802 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.050645 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-bound-sa-token\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:15.050802 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.050673 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l47nb\" (UniqueName: \"kubernetes.io/projected/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-kube-api-access-l47nb\") pod \"dns-default-jhpmf\" (UID: \"30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e\") " pod="openshift-dns/dns-default-jhpmf" Apr 24 21:28:15.050802 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.050780 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-config-volume\") pod \"dns-default-jhpmf\" (UID: \"30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e\") " pod="openshift-dns/dns-default-jhpmf" Apr 24 21:28:15.051053 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.050820 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:15.051053 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:15.050855 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:15.051053 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.050851 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-metrics-tls\") pod \"dns-default-jhpmf\" (UID: \"30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e\") " pod="openshift-dns/dns-default-jhpmf" Apr 24 21:28:15.051053 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:15.050934 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa2f14d-7161-4066-9e78-ae036846d1b5-cert podName:caa2f14d-7161-4066-9e78-ae036846d1b5 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:15.550902939 +0000 UTC m=+33.649315438 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/caa2f14d-7161-4066-9e78-ae036846d1b5-cert") pod "ingress-canary-xk55r" (UID: "caa2f14d-7161-4066-9e78-ae036846d1b5") : secret "canary-serving-cert" not found Apr 24 21:28:15.051053 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.050959 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d52bab75-adb9-452b-a696-d309c8c9d8c3-ca-trust-extracted\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:15.051053 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:15.050979 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:28:15.051053 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:15.051038 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-metrics-tls podName:30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:15.551020649 +0000 UTC m=+33.649433159 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-metrics-tls") pod "dns-default-jhpmf" (UID: "30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e") : secret "dns-default-metrics-tls" not found Apr 24 21:28:15.051279 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:15.051096 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:28:15.051279 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:15.051106 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-79d7dbdd8d-zcbl7: secret "image-registry-tls" not found Apr 24 21:28:15.051279 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:15.051138 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls podName:d52bab75-adb9-452b-a696-d309c8c9d8c3 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:15.551126266 +0000 UTC m=+33.649538784 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls") pod "image-registry-79d7dbdd8d-zcbl7" (UID: "d52bab75-adb9-452b-a696-d309c8c9d8c3") : secret "image-registry-tls" not found Apr 24 21:28:15.051279 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.051217 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-certificates\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:15.051279 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.051249 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d52bab75-adb9-452b-a696-d309c8c9d8c3-ca-trust-extracted\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:15.051576 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.051559 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d52bab75-adb9-452b-a696-d309c8c9d8c3-trusted-ca\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:15.054870 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.054849 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d52bab75-adb9-452b-a696-d309c8c9d8c3-image-registry-private-configuration\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " 
pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:15.054870 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.054861 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d52bab75-adb9-452b-a696-d309c8c9d8c3-installation-pull-secrets\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:15.059761 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.059735 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-tmp-dir\") pod \"dns-default-jhpmf\" (UID: \"30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e\") " pod="openshift-dns/dns-default-jhpmf" Apr 24 21:28:15.060045 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.060023 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-config-volume\") pod \"dns-default-jhpmf\" (UID: \"30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e\") " pod="openshift-dns/dns-default-jhpmf" Apr 24 21:28:15.065094 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.065072 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l47nb\" (UniqueName: \"kubernetes.io/projected/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-kube-api-access-l47nb\") pod \"dns-default-jhpmf\" (UID: \"30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e\") " pod="openshift-dns/dns-default-jhpmf" Apr 24 21:28:15.065163 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.065115 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg4nr\" (UniqueName: \"kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-kube-api-access-sg4nr\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: 
\"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:15.066346 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.066329 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9wqt\" (UniqueName: \"kubernetes.io/projected/caa2f14d-7161-4066-9e78-ae036846d1b5-kube-api-access-z9wqt\") pod \"ingress-canary-xk55r\" (UID: \"caa2f14d-7161-4066-9e78-ae036846d1b5\") " pod="openshift-ingress-canary/ingress-canary-xk55r" Apr 24 21:28:15.072194 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.072169 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-bound-sa-token\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:15.554794 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.554760 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:15.555180 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.554802 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-metrics-tls\") pod \"dns-default-jhpmf\" (UID: \"30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e\") " pod="openshift-dns/dns-default-jhpmf" Apr 24 21:28:15.555180 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:15.554896 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:28:15.555180 ip-10-0-142-242 
kubenswrapper[2567]: E0424 21:28:15.554901 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:28:15.555180 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:15.554931 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-79d7dbdd8d-zcbl7: secret "image-registry-tls" not found Apr 24 21:28:15.555180 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.554941 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caa2f14d-7161-4066-9e78-ae036846d1b5-cert\") pod \"ingress-canary-xk55r\" (UID: \"caa2f14d-7161-4066-9e78-ae036846d1b5\") " pod="openshift-ingress-canary/ingress-canary-xk55r" Apr 24 21:28:15.555180 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:15.554961 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-metrics-tls podName:30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:16.554942581 +0000 UTC m=+34.653355101 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-metrics-tls") pod "dns-default-jhpmf" (UID: "30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e") : secret "dns-default-metrics-tls" not found Apr 24 21:28:15.555180 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:15.554982 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls podName:d52bab75-adb9-452b-a696-d309c8c9d8c3 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:16.554972405 +0000 UTC m=+34.653384905 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls") pod "image-registry-79d7dbdd8d-zcbl7" (UID: "d52bab75-adb9-452b-a696-d309c8c9d8c3") : secret "image-registry-tls" not found Apr 24 21:28:15.555180 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:15.555007 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:15.555180 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:15.555055 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa2f14d-7161-4066-9e78-ae036846d1b5-cert podName:caa2f14d-7161-4066-9e78-ae036846d1b5 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:16.555045961 +0000 UTC m=+34.653458456 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/caa2f14d-7161-4066-9e78-ae036846d1b5-cert") pod "ingress-canary-xk55r" (UID: "caa2f14d-7161-4066-9e78-ae036846d1b5") : secret "canary-serving-cert" not found Apr 24 21:28:15.583921 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.583893 2567 generic.go:358] "Generic (PLEG): container finished" podID="99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae" containerID="ac79dd56441f0c8e9318a53368d03f06789ca496b17e2f84ca0c983973ce658b" exitCode=0 Apr 24 21:28:15.584039 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:15.583935 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5xkp" event={"ID":"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae","Type":"ContainerDied","Data":"ac79dd56441f0c8e9318a53368d03f06789ca496b17e2f84ca0c983973ce658b"} Apr 24 21:28:16.059123 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.059095 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs\") pod 
\"network-metrics-daemon-x6s8x\" (UID: \"52f8223b-f29e-4bac-bf1e-475d1a24a90c\") " pod="openshift-multus/network-metrics-daemon-x6s8x" Apr 24 21:28:16.059295 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:16.059270 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:16.059371 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:16.059360 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs podName:52f8223b-f29e-4bac-bf1e-475d1a24a90c nodeName:}" failed. No retries permitted until 2026-04-24 21:28:48.059339435 +0000 UTC m=+66.157751939 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs") pod "network-metrics-daemon-x6s8x" (UID: "52f8223b-f29e-4bac-bf1e-475d1a24a90c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:16.159855 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.159821 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdtz2\" (UniqueName: \"kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2\") pod \"network-check-target-jjnzv\" (UID: \"63ad21c9-3529-456b-b932-b0cb7555c6a5\") " pod="openshift-network-diagnostics/network-check-target-jjnzv" Apr 24 21:28:16.160020 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:16.159991 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:28:16.160020 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:16.160016 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Apr 24 21:28:16.160092 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:16.160027 2567 projected.go:194] Error preparing data for projected volume kube-api-access-mdtz2 for pod openshift-network-diagnostics/network-check-target-jjnzv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:16.160092 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:16.160082 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2 podName:63ad21c9-3529-456b-b932-b0cb7555c6a5 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:48.160068028 +0000 UTC m=+66.258480519 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-mdtz2" (UniqueName: "kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2") pod "network-check-target-jjnzv" (UID: "63ad21c9-3529-456b-b932-b0cb7555c6a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:16.413239 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.413210 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jjnzv" Apr 24 21:28:16.413399 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.413243 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vwv9h" Apr 24 21:28:16.413399 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.413395 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6s8x" Apr 24 21:28:16.416751 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.416728 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:28:16.416856 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.416754 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-q7cg2\"" Apr 24 21:28:16.417421 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.417403 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:28:16.417760 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.417739 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:28:16.417838 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.417824 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:28:16.418211 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.418196 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7528m\"" Apr 24 21:28:16.562644 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.562618 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caa2f14d-7161-4066-9e78-ae036846d1b5-cert\") pod \"ingress-canary-xk55r\" (UID: \"caa2f14d-7161-4066-9e78-ae036846d1b5\") " pod="openshift-ingress-canary/ingress-canary-xk55r" Apr 24 21:28:16.563072 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.562721 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:16.563072 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.562757 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-metrics-tls\") pod \"dns-default-jhpmf\" (UID: \"30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e\") " pod="openshift-dns/dns-default-jhpmf" Apr 24 21:28:16.563072 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:16.562766 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:16.563072 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:16.562836 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa2f14d-7161-4066-9e78-ae036846d1b5-cert podName:caa2f14d-7161-4066-9e78-ae036846d1b5 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:18.562817301 +0000 UTC m=+36.661229791 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/caa2f14d-7161-4066-9e78-ae036846d1b5-cert") pod "ingress-canary-xk55r" (UID: "caa2f14d-7161-4066-9e78-ae036846d1b5") : secret "canary-serving-cert" not found Apr 24 21:28:16.563072 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:16.562854 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:28:16.563072 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:16.562903 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-metrics-tls podName:30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e nodeName:}" failed. 
No retries permitted until 2026-04-24 21:28:18.562887824 +0000 UTC m=+36.661300332 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-metrics-tls") pod "dns-default-jhpmf" (UID: "30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e") : secret "dns-default-metrics-tls" not found Apr 24 21:28:16.563072 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:16.562945 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:28:16.563072 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:16.562957 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-79d7dbdd8d-zcbl7: secret "image-registry-tls" not found Apr 24 21:28:16.563072 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:16.562989 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls podName:d52bab75-adb9-452b-a696-d309c8c9d8c3 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:18.562981139 +0000 UTC m=+36.661393635 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls") pod "image-registry-79d7dbdd8d-zcbl7" (UID: "d52bab75-adb9-452b-a696-d309c8c9d8c3") : secret "image-registry-tls" not found Apr 24 21:28:16.587904 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.587878 2567 generic.go:358] "Generic (PLEG): container finished" podID="99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae" containerID="3fe89c1f5517032f6abdf6f7d96a0db2a332b9ce05a004880d1fa8d91726637b" exitCode=0 Apr 24 21:28:16.588037 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.587926 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x5xkp" event={"ID":"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae","Type":"ContainerDied","Data":"3fe89c1f5517032f6abdf6f7d96a0db2a332b9ce05a004880d1fa8d91726637b"} Apr 24 21:28:16.749589 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.749424 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ss8cr"] Apr 24 21:28:16.771842 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.771817 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ss8cr"] Apr 24 21:28:16.771961 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.771911 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ss8cr" Apr 24 21:28:16.774476 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.774457 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 21:28:16.774598 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.774475 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 21:28:16.774718 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.774699 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-cjgzh\"" Apr 24 21:28:16.866510 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.866483 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnlx9\" (UniqueName: \"kubernetes.io/projected/274f17a0-ae47-42da-bc42-494b1f9aeed6-kube-api-access-gnlx9\") pod \"migrator-74bb7799d9-ss8cr\" (UID: \"274f17a0-ae47-42da-bc42-494b1f9aeed6\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ss8cr" Apr 24 21:28:16.967631 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.967553 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnlx9\" (UniqueName: \"kubernetes.io/projected/274f17a0-ae47-42da-bc42-494b1f9aeed6-kube-api-access-gnlx9\") pod \"migrator-74bb7799d9-ss8cr\" (UID: \"274f17a0-ae47-42da-bc42-494b1f9aeed6\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ss8cr" Apr 24 21:28:16.978258 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:16.978230 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnlx9\" (UniqueName: 
\"kubernetes.io/projected/274f17a0-ae47-42da-bc42-494b1f9aeed6-kube-api-access-gnlx9\") pod \"migrator-74bb7799d9-ss8cr\" (UID: \"274f17a0-ae47-42da-bc42-494b1f9aeed6\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ss8cr" Apr 24 21:28:17.080862 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:17.080830 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ss8cr" Apr 24 21:28:17.193325 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:17.193287 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-nzxhn_6c3b2011-26ac-4109-9898-ad37c3d322dd/dns-node-resolver/0.log" Apr 24 21:28:17.259961 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:17.259812 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ss8cr"] Apr 24 21:28:17.262988 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:28:17.262962 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod274f17a0_ae47_42da_bc42_494b1f9aeed6.slice/crio-abc0e8c2f7e7104c94823e74796ddd1a645d6b95879115e493f3507eba5807e7 WatchSource:0}: Error finding container abc0e8c2f7e7104c94823e74796ddd1a645d6b95879115e493f3507eba5807e7: Status 404 returned error can't find the container with id abc0e8c2f7e7104c94823e74796ddd1a645d6b95879115e493f3507eba5807e7 Apr 24 21:28:17.590380 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:17.590298 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ss8cr" event={"ID":"274f17a0-ae47-42da-bc42-494b1f9aeed6","Type":"ContainerStarted","Data":"abc0e8c2f7e7104c94823e74796ddd1a645d6b95879115e493f3507eba5807e7"} Apr 24 21:28:17.592966 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:17.592934 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-x5xkp" event={"ID":"99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae","Type":"ContainerStarted","Data":"4fcaefec43cffeb33383fa3faee4d9adf658fb3d67f282eae09bae20fabaf738"} Apr 24 21:28:17.623543 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:17.623493 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-x5xkp" podStartSLOduration=5.344023114 podStartE2EDuration="35.623475069s" podCreationTimestamp="2026-04-24 21:27:42 +0000 UTC" firstStartedPulling="2026-04-24 21:27:44.983518169 +0000 UTC m=+3.081930668" lastFinishedPulling="2026-04-24 21:28:15.262970119 +0000 UTC m=+33.361382623" observedRunningTime="2026-04-24 21:28:17.623258166 +0000 UTC m=+35.721670691" watchObservedRunningTime="2026-04-24 21:28:17.623475069 +0000 UTC m=+35.721887584" Apr 24 21:28:17.813156 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:17.813127 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-sxbgk"] Apr 24 21:28:17.840845 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:17.840784 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-sxbgk"] Apr 24 21:28:17.840973 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:17.840900 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-sxbgk" Apr 24 21:28:17.843561 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:17.843542 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qk948\"" Apr 24 21:28:17.843701 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:17.843541 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:28:17.843701 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:17.843621 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:28:17.843701 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:17.843682 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:28:17.843853 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:17.843723 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 21:28:17.975354 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:17.975321 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2l9c\" (UniqueName: \"kubernetes.io/projected/dfef123c-0aff-4e29-9992-a01cd35408cb-kube-api-access-v2l9c\") pod \"insights-runtime-extractor-sxbgk\" (UID: \"dfef123c-0aff-4e29-9992-a01cd35408cb\") " pod="openshift-insights/insights-runtime-extractor-sxbgk" Apr 24 21:28:17.975536 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:17.975372 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dfef123c-0aff-4e29-9992-a01cd35408cb-data-volume\") pod \"insights-runtime-extractor-sxbgk\" (UID: 
\"dfef123c-0aff-4e29-9992-a01cd35408cb\") " pod="openshift-insights/insights-runtime-extractor-sxbgk" Apr 24 21:28:17.975536 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:17.975411 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dfef123c-0aff-4e29-9992-a01cd35408cb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sxbgk\" (UID: \"dfef123c-0aff-4e29-9992-a01cd35408cb\") " pod="openshift-insights/insights-runtime-extractor-sxbgk" Apr 24 21:28:17.975536 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:17.975437 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dfef123c-0aff-4e29-9992-a01cd35408cb-crio-socket\") pod \"insights-runtime-extractor-sxbgk\" (UID: \"dfef123c-0aff-4e29-9992-a01cd35408cb\") " pod="openshift-insights/insights-runtime-extractor-sxbgk" Apr 24 21:28:17.975733 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:17.975599 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dfef123c-0aff-4e29-9992-a01cd35408cb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sxbgk\" (UID: \"dfef123c-0aff-4e29-9992-a01cd35408cb\") " pod="openshift-insights/insights-runtime-extractor-sxbgk" Apr 24 21:28:18.076693 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.076663 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dfef123c-0aff-4e29-9992-a01cd35408cb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sxbgk\" (UID: \"dfef123c-0aff-4e29-9992-a01cd35408cb\") " pod="openshift-insights/insights-runtime-extractor-sxbgk" Apr 24 21:28:18.076852 ip-10-0-142-242 kubenswrapper[2567]: I0424 
21:28:18.076701 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dfef123c-0aff-4e29-9992-a01cd35408cb-crio-socket\") pod \"insights-runtime-extractor-sxbgk\" (UID: \"dfef123c-0aff-4e29-9992-a01cd35408cb\") " pod="openshift-insights/insights-runtime-extractor-sxbgk" Apr 24 21:28:18.076852 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.076782 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dfef123c-0aff-4e29-9992-a01cd35408cb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sxbgk\" (UID: \"dfef123c-0aff-4e29-9992-a01cd35408cb\") " pod="openshift-insights/insights-runtime-extractor-sxbgk" Apr 24 21:28:18.076852 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.076810 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2l9c\" (UniqueName: \"kubernetes.io/projected/dfef123c-0aff-4e29-9992-a01cd35408cb-kube-api-access-v2l9c\") pod \"insights-runtime-extractor-sxbgk\" (UID: \"dfef123c-0aff-4e29-9992-a01cd35408cb\") " pod="openshift-insights/insights-runtime-extractor-sxbgk" Apr 24 21:28:18.076852 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:18.076830 2567 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:28:18.077053 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.076859 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dfef123c-0aff-4e29-9992-a01cd35408cb-data-volume\") pod \"insights-runtime-extractor-sxbgk\" (UID: \"dfef123c-0aff-4e29-9992-a01cd35408cb\") " pod="openshift-insights/insights-runtime-extractor-sxbgk" Apr 24 21:28:18.077053 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:18.076904 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/dfef123c-0aff-4e29-9992-a01cd35408cb-insights-runtime-extractor-tls podName:dfef123c-0aff-4e29-9992-a01cd35408cb nodeName:}" failed. No retries permitted until 2026-04-24 21:28:18.576884185 +0000 UTC m=+36.675296691 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/dfef123c-0aff-4e29-9992-a01cd35408cb-insights-runtime-extractor-tls") pod "insights-runtime-extractor-sxbgk" (UID: "dfef123c-0aff-4e29-9992-a01cd35408cb") : secret "insights-runtime-extractor-tls" not found Apr 24 21:28:18.077377 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.077192 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dfef123c-0aff-4e29-9992-a01cd35408cb-crio-socket\") pod \"insights-runtime-extractor-sxbgk\" (UID: \"dfef123c-0aff-4e29-9992-a01cd35408cb\") " pod="openshift-insights/insights-runtime-extractor-sxbgk" Apr 24 21:28:18.077487 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.077203 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dfef123c-0aff-4e29-9992-a01cd35408cb-data-volume\") pod \"insights-runtime-extractor-sxbgk\" (UID: \"dfef123c-0aff-4e29-9992-a01cd35408cb\") " pod="openshift-insights/insights-runtime-extractor-sxbgk" Apr 24 21:28:18.077619 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.077602 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dfef123c-0aff-4e29-9992-a01cd35408cb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sxbgk\" (UID: \"dfef123c-0aff-4e29-9992-a01cd35408cb\") " pod="openshift-insights/insights-runtime-extractor-sxbgk" Apr 24 21:28:18.108808 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.108727 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-v2l9c\" (UniqueName: \"kubernetes.io/projected/dfef123c-0aff-4e29-9992-a01cd35408cb-kube-api-access-v2l9c\") pod \"insights-runtime-extractor-sxbgk\" (UID: \"dfef123c-0aff-4e29-9992-a01cd35408cb\") " pod="openshift-insights/insights-runtime-extractor-sxbgk" Apr 24 21:28:18.380335 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.380265 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-whp9g_b888a29e-e580-4114-8441-9109c5db53fd/node-ca/0.log" Apr 24 21:28:18.434239 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.434204 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-phvc9"] Apr 24 21:28:18.446442 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.446409 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-phvc9"] Apr 24 21:28:18.446627 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.446550 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-phvc9" Apr 24 21:28:18.449220 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.449195 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-c4tsx\"" Apr 24 21:28:18.449343 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.449305 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 21:28:18.449416 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.449355 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 21:28:18.450403 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.450182 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 21:28:18.450403 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.450252 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 21:28:18.580668 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.580637 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/36e66999-e272-4d1a-a072-7afed7299326-signing-cabundle\") pod \"service-ca-865cb79987-phvc9\" (UID: \"36e66999-e272-4d1a-a072-7afed7299326\") " pod="openshift-service-ca/service-ca-865cb79987-phvc9" Apr 24 21:28:18.580847 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.580695 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " 
pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:18.580847 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.580737 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-metrics-tls\") pod \"dns-default-jhpmf\" (UID: \"30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e\") " pod="openshift-dns/dns-default-jhpmf" Apr 24 21:28:18.580847 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.580760 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbcjt\" (UniqueName: \"kubernetes.io/projected/36e66999-e272-4d1a-a072-7afed7299326-kube-api-access-bbcjt\") pod \"service-ca-865cb79987-phvc9\" (UID: \"36e66999-e272-4d1a-a072-7afed7299326\") " pod="openshift-service-ca/service-ca-865cb79987-phvc9" Apr 24 21:28:18.580847 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.580791 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caa2f14d-7161-4066-9e78-ae036846d1b5-cert\") pod \"ingress-canary-xk55r\" (UID: \"caa2f14d-7161-4066-9e78-ae036846d1b5\") " pod="openshift-ingress-canary/ingress-canary-xk55r" Apr 24 21:28:18.580847 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:18.580837 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:28:18.581062 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:18.580858 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-79d7dbdd8d-zcbl7: secret "image-registry-tls" not found Apr 24 21:28:18.581062 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:18.580866 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:28:18.581062 ip-10-0-142-242 
kubenswrapper[2567]: E0424 21:28:18.580926 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-metrics-tls podName:30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:22.580901917 +0000 UTC m=+40.679314408 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-metrics-tls") pod "dns-default-jhpmf" (UID: "30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e") : secret "dns-default-metrics-tls" not found Apr 24 21:28:18.581062 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.580945 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dfef123c-0aff-4e29-9992-a01cd35408cb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sxbgk\" (UID: \"dfef123c-0aff-4e29-9992-a01cd35408cb\") " pod="openshift-insights/insights-runtime-extractor-sxbgk" Apr 24 21:28:18.581062 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:18.580955 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:18.581062 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.580969 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/36e66999-e272-4d1a-a072-7afed7299326-signing-key\") pod \"service-ca-865cb79987-phvc9\" (UID: \"36e66999-e272-4d1a-a072-7afed7299326\") " pod="openshift-service-ca/service-ca-865cb79987-phvc9" Apr 24 21:28:18.581062 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:18.581000 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls podName:d52bab75-adb9-452b-a696-d309c8c9d8c3 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:28:22.580980063 +0000 UTC m=+40.679392557 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls") pod "image-registry-79d7dbdd8d-zcbl7" (UID: "d52bab75-adb9-452b-a696-d309c8c9d8c3") : secret "image-registry-tls" not found Apr 24 21:28:18.581062 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:18.581015 2567 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:28:18.581062 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:18.581019 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa2f14d-7161-4066-9e78-ae036846d1b5-cert podName:caa2f14d-7161-4066-9e78-ae036846d1b5 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:22.581011088 +0000 UTC m=+40.679423580 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/caa2f14d-7161-4066-9e78-ae036846d1b5-cert") pod "ingress-canary-xk55r" (UID: "caa2f14d-7161-4066-9e78-ae036846d1b5") : secret "canary-serving-cert" not found Apr 24 21:28:18.581062 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:18.581056 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfef123c-0aff-4e29-9992-a01cd35408cb-insights-runtime-extractor-tls podName:dfef123c-0aff-4e29-9992-a01cd35408cb nodeName:}" failed. No retries permitted until 2026-04-24 21:28:19.581037478 +0000 UTC m=+37.679449969 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/dfef123c-0aff-4e29-9992-a01cd35408cb-insights-runtime-extractor-tls") pod "insights-runtime-extractor-sxbgk" (UID: "dfef123c-0aff-4e29-9992-a01cd35408cb") : secret "insights-runtime-extractor-tls" not found Apr 24 21:28:18.682141 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.681894 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/36e66999-e272-4d1a-a072-7afed7299326-signing-key\") pod \"service-ca-865cb79987-phvc9\" (UID: \"36e66999-e272-4d1a-a072-7afed7299326\") " pod="openshift-service-ca/service-ca-865cb79987-phvc9" Apr 24 21:28:18.682141 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.681960 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/36e66999-e272-4d1a-a072-7afed7299326-signing-cabundle\") pod \"service-ca-865cb79987-phvc9\" (UID: \"36e66999-e272-4d1a-a072-7afed7299326\") " pod="openshift-service-ca/service-ca-865cb79987-phvc9" Apr 24 21:28:18.682535 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.682292 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbcjt\" (UniqueName: \"kubernetes.io/projected/36e66999-e272-4d1a-a072-7afed7299326-kube-api-access-bbcjt\") pod \"service-ca-865cb79987-phvc9\" (UID: \"36e66999-e272-4d1a-a072-7afed7299326\") " pod="openshift-service-ca/service-ca-865cb79987-phvc9" Apr 24 21:28:18.684850 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.682753 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/36e66999-e272-4d1a-a072-7afed7299326-signing-cabundle\") pod \"service-ca-865cb79987-phvc9\" (UID: \"36e66999-e272-4d1a-a072-7afed7299326\") " pod="openshift-service-ca/service-ca-865cb79987-phvc9" Apr 24 
21:28:18.685692 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.685675 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/36e66999-e272-4d1a-a072-7afed7299326-signing-key\") pod \"service-ca-865cb79987-phvc9\" (UID: \"36e66999-e272-4d1a-a072-7afed7299326\") " pod="openshift-service-ca/service-ca-865cb79987-phvc9" Apr 24 21:28:18.692199 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.692120 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbcjt\" (UniqueName: \"kubernetes.io/projected/36e66999-e272-4d1a-a072-7afed7299326-kube-api-access-bbcjt\") pod \"service-ca-865cb79987-phvc9\" (UID: \"36e66999-e272-4d1a-a072-7afed7299326\") " pod="openshift-service-ca/service-ca-865cb79987-phvc9" Apr 24 21:28:18.758313 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.758292 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-phvc9" Apr 24 21:28:18.893784 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:18.893757 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-phvc9"] Apr 24 21:28:18.897167 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:28:18.897145 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36e66999_e272_4d1a_a072_7afed7299326.slice/crio-1a2bd17133552e0699730d0c095e5d33c24f79d0e718450d7de27e7f9ecb4001 WatchSource:0}: Error finding container 1a2bd17133552e0699730d0c095e5d33c24f79d0e718450d7de27e7f9ecb4001: Status 404 returned error can't find the container with id 1a2bd17133552e0699730d0c095e5d33c24f79d0e718450d7de27e7f9ecb4001 Apr 24 21:28:19.589823 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:19.589786 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/dfef123c-0aff-4e29-9992-a01cd35408cb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sxbgk\" (UID: \"dfef123c-0aff-4e29-9992-a01cd35408cb\") " pod="openshift-insights/insights-runtime-extractor-sxbgk" Apr 24 21:28:19.590037 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:19.589950 2567 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:28:19.590037 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:19.590019 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfef123c-0aff-4e29-9992-a01cd35408cb-insights-runtime-extractor-tls podName:dfef123c-0aff-4e29-9992-a01cd35408cb nodeName:}" failed. No retries permitted until 2026-04-24 21:28:21.589999422 +0000 UTC m=+39.688411931 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/dfef123c-0aff-4e29-9992-a01cd35408cb-insights-runtime-extractor-tls") pod "insights-runtime-extractor-sxbgk" (UID: "dfef123c-0aff-4e29-9992-a01cd35408cb") : secret "insights-runtime-extractor-tls" not found Apr 24 21:28:19.598110 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:19.598080 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ss8cr" event={"ID":"274f17a0-ae47-42da-bc42-494b1f9aeed6","Type":"ContainerStarted","Data":"88d2a2e143ac7294f53f8946ffc507548f9efd5daafe982173d5d5d115ebfadc"} Apr 24 21:28:19.598246 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:19.598116 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ss8cr" event={"ID":"274f17a0-ae47-42da-bc42-494b1f9aeed6","Type":"ContainerStarted","Data":"709f952d0ba290cab58cdf4756f995afb7c12c5d90a98c9159882993d2160082"} Apr 24 21:28:19.599263 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:19.599241 
2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-phvc9" event={"ID":"36e66999-e272-4d1a-a072-7afed7299326","Type":"ContainerStarted","Data":"1a2bd17133552e0699730d0c095e5d33c24f79d0e718450d7de27e7f9ecb4001"} Apr 24 21:28:19.620680 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:19.620627 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ss8cr" podStartSLOduration=2.21145784 podStartE2EDuration="3.62061361s" podCreationTimestamp="2026-04-24 21:28:16 +0000 UTC" firstStartedPulling="2026-04-24 21:28:17.265096987 +0000 UTC m=+35.363509485" lastFinishedPulling="2026-04-24 21:28:18.674252761 +0000 UTC m=+36.772665255" observedRunningTime="2026-04-24 21:28:19.61972262 +0000 UTC m=+37.718135134" watchObservedRunningTime="2026-04-24 21:28:19.62061361 +0000 UTC m=+37.719026122" Apr 24 21:28:21.604786 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:21.604753 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dfef123c-0aff-4e29-9992-a01cd35408cb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sxbgk\" (UID: \"dfef123c-0aff-4e29-9992-a01cd35408cb\") " pod="openshift-insights/insights-runtime-extractor-sxbgk" Apr 24 21:28:21.605205 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:21.604921 2567 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:28:21.605205 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:21.604995 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfef123c-0aff-4e29-9992-a01cd35408cb-insights-runtime-extractor-tls podName:dfef123c-0aff-4e29-9992-a01cd35408cb nodeName:}" failed. 
No retries permitted until 2026-04-24 21:28:25.604975868 +0000 UTC m=+43.703388372 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/dfef123c-0aff-4e29-9992-a01cd35408cb-insights-runtime-extractor-tls") pod "insights-runtime-extractor-sxbgk" (UID: "dfef123c-0aff-4e29-9992-a01cd35408cb") : secret "insights-runtime-extractor-tls" not found Apr 24 21:28:21.605365 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:21.605320 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-phvc9" event={"ID":"36e66999-e272-4d1a-a072-7afed7299326","Type":"ContainerStarted","Data":"c0c02df120ba1f49cfc1a3a3ae933a449ce681d09ad02ab3386b80f7792239af"} Apr 24 21:28:21.624233 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:21.624185 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-phvc9" podStartSLOduration=1.466476021 podStartE2EDuration="3.624172492s" podCreationTimestamp="2026-04-24 21:28:18 +0000 UTC" firstStartedPulling="2026-04-24 21:28:18.899253418 +0000 UTC m=+36.997665913" lastFinishedPulling="2026-04-24 21:28:21.056949878 +0000 UTC m=+39.155362384" observedRunningTime="2026-04-24 21:28:21.62371612 +0000 UTC m=+39.722128632" watchObservedRunningTime="2026-04-24 21:28:21.624172492 +0000 UTC m=+39.722585005" Apr 24 21:28:22.613446 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:22.613416 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:22.613928 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:22.613525 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-metrics-tls\") pod \"dns-default-jhpmf\" (UID: \"30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e\") " pod="openshift-dns/dns-default-jhpmf" Apr 24 21:28:22.613928 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:22.613575 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:28:22.613928 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:22.613612 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-79d7dbdd8d-zcbl7: secret "image-registry-tls" not found Apr 24 21:28:22.613928 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:22.613619 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caa2f14d-7161-4066-9e78-ae036846d1b5-cert\") pod \"ingress-canary-xk55r\" (UID: \"caa2f14d-7161-4066-9e78-ae036846d1b5\") " pod="openshift-ingress-canary/ingress-canary-xk55r" Apr 24 21:28:22.613928 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:22.613671 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls podName:d52bab75-adb9-452b-a696-d309c8c9d8c3 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:30.613650366 +0000 UTC m=+48.712062864 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls") pod "image-registry-79d7dbdd8d-zcbl7" (UID: "d52bab75-adb9-452b-a696-d309c8c9d8c3") : secret "image-registry-tls" not found Apr 24 21:28:22.613928 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:22.613730 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:22.613928 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:22.613782 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa2f14d-7161-4066-9e78-ae036846d1b5-cert podName:caa2f14d-7161-4066-9e78-ae036846d1b5 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:30.61376475 +0000 UTC m=+48.712177241 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/caa2f14d-7161-4066-9e78-ae036846d1b5-cert") pod "ingress-canary-xk55r" (UID: "caa2f14d-7161-4066-9e78-ae036846d1b5") : secret "canary-serving-cert" not found Apr 24 21:28:22.614302 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:22.614023 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:28:22.614302 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:22.614058 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-metrics-tls podName:30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:30.614046635 +0000 UTC m=+48.712459128 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-metrics-tls") pod "dns-default-jhpmf" (UID: "30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e") : secret "dns-default-metrics-tls" not found Apr 24 21:28:25.633746 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:25.633710 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dfef123c-0aff-4e29-9992-a01cd35408cb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sxbgk\" (UID: \"dfef123c-0aff-4e29-9992-a01cd35408cb\") " pod="openshift-insights/insights-runtime-extractor-sxbgk" Apr 24 21:28:25.634107 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:25.633849 2567 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:28:25.634107 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:25.633910 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfef123c-0aff-4e29-9992-a01cd35408cb-insights-runtime-extractor-tls podName:dfef123c-0aff-4e29-9992-a01cd35408cb nodeName:}" failed. No retries permitted until 2026-04-24 21:28:33.633891961 +0000 UTC m=+51.732304453 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/dfef123c-0aff-4e29-9992-a01cd35408cb-insights-runtime-extractor-tls") pod "insights-runtime-extractor-sxbgk" (UID: "dfef123c-0aff-4e29-9992-a01cd35408cb") : secret "insights-runtime-extractor-tls" not found Apr 24 21:28:28.151540 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:28.151506 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret\") pod \"global-pull-secret-syncer-vwv9h\" (UID: \"d846199e-de26-4ab9-80f8-977e44e27d81\") " pod="kube-system/global-pull-secret-syncer-vwv9h" Apr 24 21:28:28.154243 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:28.154218 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d846199e-de26-4ab9-80f8-977e44e27d81-original-pull-secret\") pod \"global-pull-secret-syncer-vwv9h\" (UID: \"d846199e-de26-4ab9-80f8-977e44e27d81\") " pod="kube-system/global-pull-secret-syncer-vwv9h" Apr 24 21:28:28.428714 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:28.428682 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vwv9h" Apr 24 21:28:28.544126 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:28.544096 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vwv9h"] Apr 24 21:28:28.625035 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:28.624992 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vwv9h" event={"ID":"d846199e-de26-4ab9-80f8-977e44e27d81","Type":"ContainerStarted","Data":"b2285fb4b8d5acd6d9a338394fce6765abd19b994ae4b66256273ce97ba9e0fd"} Apr 24 21:28:30.676706 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:30.676666 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caa2f14d-7161-4066-9e78-ae036846d1b5-cert\") pod \"ingress-canary-xk55r\" (UID: \"caa2f14d-7161-4066-9e78-ae036846d1b5\") " pod="openshift-ingress-canary/ingress-canary-xk55r" Apr 24 21:28:30.677184 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:30.676791 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:30.677184 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:30.676821 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-metrics-tls\") pod \"dns-default-jhpmf\" (UID: \"30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e\") " pod="openshift-dns/dns-default-jhpmf" Apr 24 21:28:30.679467 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:30.679437 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e-metrics-tls\") pod \"dns-default-jhpmf\" (UID: \"30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e\") " pod="openshift-dns/dns-default-jhpmf" Apr 24 21:28:30.679618 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:30.679483 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls\") pod \"image-registry-79d7dbdd8d-zcbl7\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") " pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:30.691264 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:30.691205 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caa2f14d-7161-4066-9e78-ae036846d1b5-cert\") pod \"ingress-canary-xk55r\" (UID: \"caa2f14d-7161-4066-9e78-ae036846d1b5\") " pod="openshift-ingress-canary/ingress-canary-xk55r" Apr 24 21:28:30.741773 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:30.741738 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:30.757615 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:30.757576 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jhpmf" Apr 24 21:28:30.772566 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:30.772541 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xk55r" Apr 24 21:28:30.889746 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:30.889719 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-79d7dbdd8d-zcbl7"] Apr 24 21:28:30.906695 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:30.906669 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jhpmf"] Apr 24 21:28:30.936963 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:30.936899 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xk55r"] Apr 24 21:28:32.317750 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:28:32.317706 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd52bab75_adb9_452b_a696_d309c8c9d8c3.slice/crio-b652e7ae5b7fef3c30809bea07053586ecb8247673961834273586bebe19b0c6 WatchSource:0}: Error finding container b652e7ae5b7fef3c30809bea07053586ecb8247673961834273586bebe19b0c6: Status 404 returned error can't find the container with id b652e7ae5b7fef3c30809bea07053586ecb8247673961834273586bebe19b0c6 Apr 24 21:28:32.318688 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:28:32.318665 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30a0f1a2_0d31_4a4e_9ed8_1d20eae0f62e.slice/crio-7e4ee733a922353da95ba2682703a69a3e056f1b95cc7e9fe684ade2b6677443 WatchSource:0}: Error finding container 7e4ee733a922353da95ba2682703a69a3e056f1b95cc7e9fe684ade2b6677443: Status 404 returned error can't find the container with id 7e4ee733a922353da95ba2682703a69a3e056f1b95cc7e9fe684ade2b6677443 Apr 24 21:28:32.319277 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:28:32.319252 2567 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaa2f14d_7161_4066_9e78_ae036846d1b5.slice/crio-230aebae5fb2caaedd082ce55dde2c45d33f6d3f78ccc782eb693c0d17d60cf0 WatchSource:0}: Error finding container 230aebae5fb2caaedd082ce55dde2c45d33f6d3f78ccc782eb693c0d17d60cf0: Status 404 returned error can't find the container with id 230aebae5fb2caaedd082ce55dde2c45d33f6d3f78ccc782eb693c0d17d60cf0 Apr 24 21:28:32.635315 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:32.635231 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xk55r" event={"ID":"caa2f14d-7161-4066-9e78-ae036846d1b5","Type":"ContainerStarted","Data":"230aebae5fb2caaedd082ce55dde2c45d33f6d3f78ccc782eb693c0d17d60cf0"} Apr 24 21:28:32.636569 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:32.636541 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" event={"ID":"d52bab75-adb9-452b-a696-d309c8c9d8c3","Type":"ContainerStarted","Data":"4a80bd74057e0527b1a74c3e38551ea2a431faf1dba721cb60097f3c42e9d20a"} Apr 24 21:28:32.636718 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:32.636574 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" event={"ID":"d52bab75-adb9-452b-a696-d309c8c9d8c3","Type":"ContainerStarted","Data":"b652e7ae5b7fef3c30809bea07053586ecb8247673961834273586bebe19b0c6"} Apr 24 21:28:32.636771 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:32.636722 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" Apr 24 21:28:32.637621 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:32.637602 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jhpmf" 
event={"ID":"30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e","Type":"ContainerStarted","Data":"7e4ee733a922353da95ba2682703a69a3e056f1b95cc7e9fe684ade2b6677443"} Apr 24 21:28:33.641112 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:33.641057 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vwv9h" event={"ID":"d846199e-de26-4ab9-80f8-977e44e27d81","Type":"ContainerStarted","Data":"2b49073077c201939d1220a43e8f8cf9a68b4192d82c18d5932e9861f2e25353"} Apr 24 21:28:33.657839 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:33.657791 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" podStartSLOduration=33.657773195 podStartE2EDuration="33.657773195s" podCreationTimestamp="2026-04-24 21:28:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:32.663205292 +0000 UTC m=+50.761617806" watchObservedRunningTime="2026-04-24 21:28:33.657773195 +0000 UTC m=+51.756185708" Apr 24 21:28:33.658992 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:33.658954 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-vwv9h" podStartSLOduration=33.563707109 podStartE2EDuration="37.658940371s" podCreationTimestamp="2026-04-24 21:27:56 +0000 UTC" firstStartedPulling="2026-04-24 21:28:28.54927478 +0000 UTC m=+46.647687270" lastFinishedPulling="2026-04-24 21:28:32.64450804 +0000 UTC m=+50.742920532" observedRunningTime="2026-04-24 21:28:33.657710255 +0000 UTC m=+51.756122773" watchObservedRunningTime="2026-04-24 21:28:33.658940371 +0000 UTC m=+51.757352883" Apr 24 21:28:33.701435 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:33.701322 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/dfef123c-0aff-4e29-9992-a01cd35408cb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sxbgk\" (UID: \"dfef123c-0aff-4e29-9992-a01cd35408cb\") " pod="openshift-insights/insights-runtime-extractor-sxbgk"
Apr 24 21:28:33.704188 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:33.704163 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dfef123c-0aff-4e29-9992-a01cd35408cb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sxbgk\" (UID: \"dfef123c-0aff-4e29-9992-a01cd35408cb\") " pod="openshift-insights/insights-runtime-extractor-sxbgk"
Apr 24 21:28:33.750316 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:33.750287 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-sxbgk"
Apr 24 21:28:34.410003 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:34.409965 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-sxbgk"]
Apr 24 21:28:34.413205 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:28:34.413173 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfef123c_0aff_4e29_9992_a01cd35408cb.slice/crio-da5783e87a0578a1fa401088f96642d859fd7531575e9bc0e13c181116e22dc2 WatchSource:0}: Error finding container da5783e87a0578a1fa401088f96642d859fd7531575e9bc0e13c181116e22dc2: Status 404 returned error can't find the container with id da5783e87a0578a1fa401088f96642d859fd7531575e9bc0e13c181116e22dc2
Apr 24 21:28:34.645372 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:34.645334 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jhpmf" event={"ID":"30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e","Type":"ContainerStarted","Data":"85762eafcb3cc3b4a9f99074fb1a1826f40c07f946b48cc2b8997ab4be4149c2"}
Apr 24 21:28:34.645372 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:34.645371 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jhpmf" event={"ID":"30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e","Type":"ContainerStarted","Data":"c74da21353e1210a1545af18e3bf9feebf92404a1f4bba3c35065c973e008b74"}
Apr 24 21:28:34.645913 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:34.645437 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-jhpmf"
Apr 24 21:28:34.650116 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:34.650089 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sxbgk" event={"ID":"dfef123c-0aff-4e29-9992-a01cd35408cb","Type":"ContainerStarted","Data":"718cbf4b2fdb97daccabb7e0e42e0370afdec040748b6d5a1567a89d68ff8a8d"}
Apr 24 21:28:34.650246 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:34.650120 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sxbgk" event={"ID":"dfef123c-0aff-4e29-9992-a01cd35408cb","Type":"ContainerStarted","Data":"da5783e87a0578a1fa401088f96642d859fd7531575e9bc0e13c181116e22dc2"}
Apr 24 21:28:34.651293 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:34.651269 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xk55r" event={"ID":"caa2f14d-7161-4066-9e78-ae036846d1b5","Type":"ContainerStarted","Data":"b8384f41b45e5e72c73827b1846b09313216482d0235d95f0fa1f3fe2259e2f7"}
Apr 24 21:28:34.663079 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:34.663042 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jhpmf" podStartSLOduration=18.706379078 podStartE2EDuration="20.663030433s" podCreationTimestamp="2026-04-24 21:28:14 +0000 UTC" firstStartedPulling="2026-04-24 21:28:32.320817682 +0000 UTC m=+50.419230177" lastFinishedPulling="2026-04-24 21:28:34.277469027 +0000 UTC m=+52.375881532" observedRunningTime="2026-04-24 21:28:34.66242976 +0000 UTC m=+52.760842278" watchObservedRunningTime="2026-04-24 21:28:34.663030433 +0000 UTC m=+52.761442945"
Apr 24 21:28:34.683708 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:34.683665 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xk55r" podStartSLOduration=18.722564504 podStartE2EDuration="20.683653753s" podCreationTimestamp="2026-04-24 21:28:14 +0000 UTC" firstStartedPulling="2026-04-24 21:28:32.32123067 +0000 UTC m=+50.419643164" lastFinishedPulling="2026-04-24 21:28:34.282319909 +0000 UTC m=+52.380732413" observedRunningTime="2026-04-24 21:28:34.683036603 +0000 UTC m=+52.781449116" watchObservedRunningTime="2026-04-24 21:28:34.683653753 +0000 UTC m=+52.782066266"
Apr 24 21:28:35.655635 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:35.655595 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sxbgk" event={"ID":"dfef123c-0aff-4e29-9992-a01cd35408cb","Type":"ContainerStarted","Data":"b2345b53ec913fe161ee4ed9e6d05c4b6c90e9fb932b2387925a46797dc80410"}
Apr 24 21:28:36.659067 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:36.659035 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sxbgk" event={"ID":"dfef123c-0aff-4e29-9992-a01cd35408cb","Type":"ContainerStarted","Data":"16fc90e9e5351ea8d38bd9ca8068a1d94ef7a743ebba1a543768f1bc754d66a3"}
Apr 24 21:28:36.680031 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:36.679988 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-sxbgk" podStartSLOduration=17.86183827 podStartE2EDuration="19.679974815s" podCreationTimestamp="2026-04-24 21:28:17 +0000 UTC" firstStartedPulling="2026-04-24 21:28:34.485645503 +0000 UTC m=+52.584057999" lastFinishedPulling="2026-04-24 21:28:36.30378205 +0000 UTC m=+54.402194544" observedRunningTime="2026-04-24 21:28:36.678296764 +0000 UTC m=+54.776709276" watchObservedRunningTime="2026-04-24 21:28:36.679974815 +0000 UTC m=+54.778387328"
Apr 24 21:28:37.276098 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.276063 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-m7fp2"]
Apr 24 21:28:37.279058 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.279042 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7fp2"
Apr 24 21:28:37.288959 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.288940 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 24 21:28:37.289923 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.289904 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-mwqgg\""
Apr 24 21:28:37.296392 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.296364 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-m7fp2"]
Apr 24 21:28:37.298928 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.298903 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 24 21:28:37.335730 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.335701 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/86168a3b-d0b9-42cf-8c50-cfdab95abfc9-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-m7fp2\" (UID: \"86168a3b-d0b9-42cf-8c50-cfdab95abfc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7fp2"
Apr 24 21:28:37.335730 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.335740 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/86168a3b-d0b9-42cf-8c50-cfdab95abfc9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m7fp2\" (UID: \"86168a3b-d0b9-42cf-8c50-cfdab95abfc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7fp2"
Apr 24 21:28:37.363472 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.363438 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-79d7dbdd8d-zcbl7"]
Apr 24 21:28:37.379227 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.379200 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-c6p8l"]
Apr 24 21:28:37.382122 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.382102 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-c6p8l"
Apr 24 21:28:37.391829 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.391808 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-lhpk2\""
Apr 24 21:28:37.392437 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.392424 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 24 21:28:37.392946 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.392934 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 24 21:28:37.398443 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.398422 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-c6p8l"]
Apr 24 21:28:37.437251 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.437208 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mpf4\" (UniqueName: \"kubernetes.io/projected/00358327-c969-49bb-a961-6bbb496280a7-kube-api-access-8mpf4\") pod \"downloads-6bcc868b7-c6p8l\" (UID: \"00358327-c969-49bb-a961-6bbb496280a7\") " pod="openshift-console/downloads-6bcc868b7-c6p8l"
Apr 24 21:28:37.437529 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.437507 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/86168a3b-d0b9-42cf-8c50-cfdab95abfc9-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-m7fp2\" (UID: \"86168a3b-d0b9-42cf-8c50-cfdab95abfc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7fp2"
Apr 24 21:28:37.437690 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.437671 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/86168a3b-d0b9-42cf-8c50-cfdab95abfc9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m7fp2\" (UID: \"86168a3b-d0b9-42cf-8c50-cfdab95abfc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7fp2"
Apr 24 21:28:37.438374 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.438347 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/86168a3b-d0b9-42cf-8c50-cfdab95abfc9-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-m7fp2\" (UID: \"86168a3b-d0b9-42cf-8c50-cfdab95abfc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7fp2"
Apr 24 21:28:37.441371 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.441344 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/86168a3b-d0b9-42cf-8c50-cfdab95abfc9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m7fp2\" (UID: \"86168a3b-d0b9-42cf-8c50-cfdab95abfc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7fp2"
Apr 24 21:28:37.471631 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.471610 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7584f55f55-p88k5"]
Apr 24 21:28:37.474960 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.474944 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.497950 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.497929 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7584f55f55-p88k5"]
Apr 24 21:28:37.538294 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.538238 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mpf4\" (UniqueName: \"kubernetes.io/projected/00358327-c969-49bb-a961-6bbb496280a7-kube-api-access-8mpf4\") pod \"downloads-6bcc868b7-c6p8l\" (UID: \"00358327-c969-49bb-a961-6bbb496280a7\") " pod="openshift-console/downloads-6bcc868b7-c6p8l"
Apr 24 21:28:37.538294 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.538271 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad19df44-5b36-4c1c-a39f-a3779df1e10a-bound-sa-token\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.538427 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.538296 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ad19df44-5b36-4c1c-a39f-a3779df1e10a-image-registry-private-configuration\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.538427 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.538386 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdjt5\" (UniqueName: \"kubernetes.io/projected/ad19df44-5b36-4c1c-a39f-a3779df1e10a-kube-api-access-mdjt5\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.538427 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.538415 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad19df44-5b36-4c1c-a39f-a3779df1e10a-registry-tls\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.538538 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.538440 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad19df44-5b36-4c1c-a39f-a3779df1e10a-registry-certificates\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.538538 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.538457 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad19df44-5b36-4c1c-a39f-a3779df1e10a-ca-trust-extracted\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.538538 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.538523 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad19df44-5b36-4c1c-a39f-a3779df1e10a-trusted-ca\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.538675 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.538565 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad19df44-5b36-4c1c-a39f-a3779df1e10a-installation-pull-secrets\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.575678 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.575655 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mpf4\" (UniqueName: \"kubernetes.io/projected/00358327-c969-49bb-a961-6bbb496280a7-kube-api-access-8mpf4\") pod \"downloads-6bcc868b7-c6p8l\" (UID: \"00358327-c969-49bb-a961-6bbb496280a7\") " pod="openshift-console/downloads-6bcc868b7-c6p8l"
Apr 24 21:28:37.587410 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.587392 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7fp2"
Apr 24 21:28:37.639366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.639330 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad19df44-5b36-4c1c-a39f-a3779df1e10a-bound-sa-token\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.639498 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.639373 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ad19df44-5b36-4c1c-a39f-a3779df1e10a-image-registry-private-configuration\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.639597 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.639557 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdjt5\" (UniqueName: \"kubernetes.io/projected/ad19df44-5b36-4c1c-a39f-a3779df1e10a-kube-api-access-mdjt5\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.639671 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.639618 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad19df44-5b36-4c1c-a39f-a3779df1e10a-registry-tls\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.639723 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.639666 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad19df44-5b36-4c1c-a39f-a3779df1e10a-registry-certificates\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.639723 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.639692 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad19df44-5b36-4c1c-a39f-a3779df1e10a-ca-trust-extracted\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.639723 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.639720 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad19df44-5b36-4c1c-a39f-a3779df1e10a-trusted-ca\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.639867 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.639755 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad19df44-5b36-4c1c-a39f-a3779df1e10a-installation-pull-secrets\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.641385 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.640467 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad19df44-5b36-4c1c-a39f-a3779df1e10a-ca-trust-extracted\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.641385 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.640770 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad19df44-5b36-4c1c-a39f-a3779df1e10a-registry-certificates\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.641385 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.641023 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad19df44-5b36-4c1c-a39f-a3779df1e10a-trusted-ca\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.642308 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.642132 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ad19df44-5b36-4c1c-a39f-a3779df1e10a-image-registry-private-configuration\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.643143 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.643122 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad19df44-5b36-4c1c-a39f-a3779df1e10a-registry-tls\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.643418 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.643392 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad19df44-5b36-4c1c-a39f-a3779df1e10a-installation-pull-secrets\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.660759 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.660709 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdjt5\" (UniqueName: \"kubernetes.io/projected/ad19df44-5b36-4c1c-a39f-a3779df1e10a-kube-api-access-mdjt5\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.677065 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.677042 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad19df44-5b36-4c1c-a39f-a3779df1e10a-bound-sa-token\") pod \"image-registry-7584f55f55-p88k5\" (UID: \"ad19df44-5b36-4c1c-a39f-a3779df1e10a\") " pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.689900 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.689880 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-c6p8l"
Apr 24 21:28:37.728251 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.728220 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-m7fp2"]
Apr 24 21:28:37.732841 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:28:37.732818 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86168a3b_d0b9_42cf_8c50_cfdab95abfc9.slice/crio-a987d7f0c080777b8a67da2c45a4980602761a3f371f1eca3b41468169382aeb WatchSource:0}: Error finding container a987d7f0c080777b8a67da2c45a4980602761a3f371f1eca3b41468169382aeb: Status 404 returned error can't find the container with id a987d7f0c080777b8a67da2c45a4980602761a3f371f1eca3b41468169382aeb
Apr 24 21:28:37.783455 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.783430 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:37.811229 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.811199 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-c6p8l"]
Apr 24 21:28:37.814619 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:28:37.814576 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00358327_c969_49bb_a961_6bbb496280a7.slice/crio-4d0122b7075cfe14bc4402d8de7e8610e51fa5b75bf7d2e6deeccbb4e23ef0d2 WatchSource:0}: Error finding container 4d0122b7075cfe14bc4402d8de7e8610e51fa5b75bf7d2e6deeccbb4e23ef0d2: Status 404 returned error can't find the container with id 4d0122b7075cfe14bc4402d8de7e8610e51fa5b75bf7d2e6deeccbb4e23ef0d2
Apr 24 21:28:37.909025 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:37.908992 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7584f55f55-p88k5"]
Apr 24 21:28:37.913685 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:28:37.913657 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad19df44_5b36_4c1c_a39f_a3779df1e10a.slice/crio-24ad622eb356845b658fa29dac9df7f22dce5a58d135f139c6557a67d8bcc0b8 WatchSource:0}: Error finding container 24ad622eb356845b658fa29dac9df7f22dce5a58d135f139c6557a67d8bcc0b8: Status 404 returned error can't find the container with id 24ad622eb356845b658fa29dac9df7f22dce5a58d135f139c6557a67d8bcc0b8
Apr 24 21:28:38.667048 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:38.667008 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-c6p8l" event={"ID":"00358327-c969-49bb-a961-6bbb496280a7","Type":"ContainerStarted","Data":"4d0122b7075cfe14bc4402d8de7e8610e51fa5b75bf7d2e6deeccbb4e23ef0d2"}
Apr 24 21:28:38.668245 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:38.668214 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7fp2" event={"ID":"86168a3b-d0b9-42cf-8c50-cfdab95abfc9","Type":"ContainerStarted","Data":"a987d7f0c080777b8a67da2c45a4980602761a3f371f1eca3b41468169382aeb"}
Apr 24 21:28:38.669706 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:38.669668 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7584f55f55-p88k5" event={"ID":"ad19df44-5b36-4c1c-a39f-a3779df1e10a","Type":"ContainerStarted","Data":"b007e6c343475b45988c9748602b7243c03e668f1db7639dae25cbf08ec1fe4e"}
Apr 24 21:28:38.669706 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:38.669701 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7584f55f55-p88k5" event={"ID":"ad19df44-5b36-4c1c-a39f-a3779df1e10a","Type":"ContainerStarted","Data":"24ad622eb356845b658fa29dac9df7f22dce5a58d135f139c6557a67d8bcc0b8"}
Apr 24 21:28:38.669865 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:38.669808 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:38.720112 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:38.720043 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7584f55f55-p88k5" podStartSLOduration=1.720023425 podStartE2EDuration="1.720023425s" podCreationTimestamp="2026-04-24 21:28:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:38.719843505 +0000 UTC m=+56.818256019" watchObservedRunningTime="2026-04-24 21:28:38.720023425 +0000 UTC m=+56.818435939"
Apr 24 21:28:39.006102 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:39.006016 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-djlzh"]
Apr 24 21:28:39.022897 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:39.022847 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-djlzh"]
Apr 24 21:28:39.023049 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:39.022968 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-djlzh"
Apr 24 21:28:39.026156 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:39.025728 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 24 21:28:39.026156 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:39.025975 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-24w6n\""
Apr 24 21:28:39.153865 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:39.153839 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c90817f2-8aeb-4327-b2a3-1d7f4c796dbb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-djlzh\" (UID: \"c90817f2-8aeb-4327-b2a3-1d7f4c796dbb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-djlzh"
Apr 24 21:28:39.254785 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:39.254756 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c90817f2-8aeb-4327-b2a3-1d7f4c796dbb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-djlzh\" (UID: \"c90817f2-8aeb-4327-b2a3-1d7f4c796dbb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-djlzh"
Apr 24 21:28:39.257504 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:39.257451 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c90817f2-8aeb-4327-b2a3-1d7f4c796dbb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-djlzh\" (UID: \"c90817f2-8aeb-4327-b2a3-1d7f4c796dbb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-djlzh"
Apr 24 21:28:39.341182 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:39.341153 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-djlzh"
Apr 24 21:28:39.495346 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:39.495316 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-djlzh"]
Apr 24 21:28:39.498815 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:28:39.498787 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc90817f2_8aeb_4327_b2a3_1d7f4c796dbb.slice/crio-c2b711fb5f4de08e7228eb57e1d4ee51f8981a00fd541a32051bd668be333b45 WatchSource:0}: Error finding container c2b711fb5f4de08e7228eb57e1d4ee51f8981a00fd541a32051bd668be333b45: Status 404 returned error can't find the container with id c2b711fb5f4de08e7228eb57e1d4ee51f8981a00fd541a32051bd668be333b45
Apr 24 21:28:39.588438 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:39.588363 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mxnmf"
Apr 24 21:28:39.674073 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:39.674040 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7fp2" event={"ID":"86168a3b-d0b9-42cf-8c50-cfdab95abfc9","Type":"ContainerStarted","Data":"0c136a62a67a9255adcf47945d9a4d3fd03e42318167dbc114be9f290a8ea550"}
Apr 24 21:28:39.675401 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:39.675356 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-djlzh" event={"ID":"c90817f2-8aeb-4327-b2a3-1d7f4c796dbb","Type":"ContainerStarted","Data":"c2b711fb5f4de08e7228eb57e1d4ee51f8981a00fd541a32051bd668be333b45"}
Apr 24 21:28:39.695709 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:39.695366 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7fp2" podStartSLOduration=1.194784111 podStartE2EDuration="2.695354195s" podCreationTimestamp="2026-04-24 21:28:37 +0000 UTC" firstStartedPulling="2026-04-24 21:28:37.73564136 +0000 UTC m=+55.834053851" lastFinishedPulling="2026-04-24 21:28:39.23621143 +0000 UTC m=+57.334623935" observedRunningTime="2026-04-24 21:28:39.695201928 +0000 UTC m=+57.793614452" watchObservedRunningTime="2026-04-24 21:28:39.695354195 +0000 UTC m=+57.793766708"
Apr 24 21:28:41.681793 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:41.681752 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-djlzh" event={"ID":"c90817f2-8aeb-4327-b2a3-1d7f4c796dbb","Type":"ContainerStarted","Data":"3bef043551b0c13ddaaf9a9c34a7c567c5563756d7cbb5d0ef068f49396f4c3d"}
Apr 24 21:28:41.682242 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:41.681966 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-djlzh"
Apr 24 21:28:41.687593 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:41.687548 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-djlzh"
Apr 24 21:28:41.705195 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:41.705147 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-djlzh" podStartSLOduration=2.491583228 podStartE2EDuration="3.705132585s" podCreationTimestamp="2026-04-24 21:28:38 +0000 UTC" firstStartedPulling="2026-04-24 21:28:39.500977193 +0000 UTC m=+57.599389684" lastFinishedPulling="2026-04-24 21:28:40.714526536 +0000 UTC m=+58.812939041" observedRunningTime="2026-04-24 21:28:41.703775491 +0000 UTC m=+59.802188016" watchObservedRunningTime="2026-04-24 21:28:41.705132585 +0000 UTC m=+59.803545099"
Apr 24 21:28:42.092903 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:42.092781 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-569bb"]
Apr 24 21:28:42.096790 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:42.096759 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb"
Apr 24 21:28:42.100727 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:42.100690 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:ip-10-0-142-242.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'ip-10-0-142-242.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" type="*v1.ConfigMap"
Apr 24 21:28:42.100867 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:42.100793 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"metrics-client-ca\" is forbidden: User \"system:node:ip-10-0-142-242.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'ip-10-0-142-242.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" type="*v1.ConfigMap"
Apr 24 21:28:42.100867 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:42.100794 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"prometheus-operator-kube-rbac-proxy-config\" is forbidden: User \"system:node:ip-10-0-142-242.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'ip-10-0-142-242.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" type="*v1.Secret"
Apr 24 21:28:42.101231 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:42.101210 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 21:28:42.101878 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:42.101787 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-qnlbn\""
Apr 24 21:28:42.101878 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:42.101829 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"prometheus-operator-tls\" is forbidden: User \"system:node:ip-10-0-142-242.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'ip-10-0-142-242.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" type="*v1.Secret"
Apr 24 21:28:42.122547 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:42.122519 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-569bb"]
Apr 24 21:28:42.182051 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:42.182014 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-569bb\" (UID: \"c762af5f-7dc9-4a77-ad83-ac3d3233b5d5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb"
Apr 24 21:28:42.182203 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:42.182090 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j6c8\" (UniqueName: \"kubernetes.io/projected/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-kube-api-access-5j6c8\") pod \"prometheus-operator-5676c8c784-569bb\" (UID: \"c762af5f-7dc9-4a77-ad83-ac3d3233b5d5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb"
Apr 24 21:28:42.182203 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:42.182163 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-569bb\" (UID: \"c762af5f-7dc9-4a77-ad83-ac3d3233b5d5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb"
Apr 24 21:28:42.182313 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:42.182224 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-569bb\" (UID: \"c762af5f-7dc9-4a77-ad83-ac3d3233b5d5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb"
Apr 24 21:28:42.283668 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:42.283634 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-569bb\" (UID:
\"c762af5f-7dc9-4a77-ad83-ac3d3233b5d5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb" Apr 24 21:28:42.283825 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:42.283707 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-569bb\" (UID: \"c762af5f-7dc9-4a77-ad83-ac3d3233b5d5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb" Apr 24 21:28:42.283825 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:42.283745 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-569bb\" (UID: \"c762af5f-7dc9-4a77-ad83-ac3d3233b5d5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb" Apr 24 21:28:42.283825 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:42.283786 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5j6c8\" (UniqueName: \"kubernetes.io/projected/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-kube-api-access-5j6c8\") pod \"prometheus-operator-5676c8c784-569bb\" (UID: \"c762af5f-7dc9-4a77-ad83-ac3d3233b5d5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb" Apr 24 21:28:42.293966 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:42.293933 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:28:43.155660 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:43.155628 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 24 21:28:43.166655 ip-10-0-142-242 kubenswrapper[2567]: 
I0424 21:28:43.166627 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-569bb\" (UID: \"c762af5f-7dc9-4a77-ad83-ac3d3233b5d5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb" Apr 24 21:28:43.284084 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:43.284043 2567 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: failed to sync secret cache: timed out waiting for the condition Apr 24 21:28:43.284243 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:43.284133 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-prometheus-operator-tls podName:c762af5f-7dc9-4a77-ad83-ac3d3233b5d5 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:43.784113311 +0000 UTC m=+61.882525807 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-569bb" (UID: "c762af5f-7dc9-4a77-ad83-ac3d3233b5d5") : failed to sync secret cache: timed out waiting for the condition Apr 24 21:28:43.284243 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:43.284049 2567 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Apr 24 21:28:43.284243 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:43.284232 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-metrics-client-ca podName:c762af5f-7dc9-4a77-ad83-ac3d3233b5d5 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:28:43.784217671 +0000 UTC m=+61.882630168 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-metrics-client-ca") pod "prometheus-operator-5676c8c784-569bb" (UID: "c762af5f-7dc9-4a77-ad83-ac3d3233b5d5") : failed to sync configmap cache: timed out waiting for the condition Apr 24 21:28:43.302871 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:43.302837 2567 projected.go:289] Couldn't get configMap openshift-monitoring/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Apr 24 21:28:43.302871 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:43.302863 2567 projected.go:194] Error preparing data for projected volume kube-api-access-5j6c8 for pod openshift-monitoring/prometheus-operator-5676c8c784-569bb: failed to sync configmap cache: timed out waiting for the condition Apr 24 21:28:43.303083 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:28:43.302927 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-kube-api-access-5j6c8 podName:c762af5f-7dc9-4a77-ad83-ac3d3233b5d5 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:43.802901743 +0000 UTC m=+61.901314240 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5j6c8" (UniqueName: "kubernetes.io/projected/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-kube-api-access-5j6c8") pod "prometheus-operator-5676c8c784-569bb" (UID: "c762af5f-7dc9-4a77-ad83-ac3d3233b5d5") : failed to sync configmap cache: timed out waiting for the condition Apr 24 21:28:43.444427 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:43.444356 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:28:43.612596 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:43.612563 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 24 21:28:43.620799 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:43.620773 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:28:43.795640 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:43.795550 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-569bb\" (UID: \"c762af5f-7dc9-4a77-ad83-ac3d3233b5d5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb" Apr 24 21:28:43.795800 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:43.795649 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-569bb\" (UID: \"c762af5f-7dc9-4a77-ad83-ac3d3233b5d5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb" Apr 24 21:28:43.796272 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:43.796240 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-569bb\" (UID: \"c762af5f-7dc9-4a77-ad83-ac3d3233b5d5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb" Apr 24 21:28:43.798328 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:43.798303 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-569bb\" (UID: \"c762af5f-7dc9-4a77-ad83-ac3d3233b5d5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb" Apr 24 21:28:43.896694 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:43.896657 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5j6c8\" (UniqueName: \"kubernetes.io/projected/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-kube-api-access-5j6c8\") pod \"prometheus-operator-5676c8c784-569bb\" (UID: \"c762af5f-7dc9-4a77-ad83-ac3d3233b5d5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb" Apr 24 21:28:43.899322 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:43.899299 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j6c8\" (UniqueName: \"kubernetes.io/projected/c762af5f-7dc9-4a77-ad83-ac3d3233b5d5-kube-api-access-5j6c8\") pod \"prometheus-operator-5676c8c784-569bb\" (UID: \"c762af5f-7dc9-4a77-ad83-ac3d3233b5d5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb" Apr 24 21:28:43.911325 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:43.911295 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-qnlbn\"" Apr 24 21:28:43.919345 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:43.919320 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb" Apr 24 21:28:44.051203 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:44.051087 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-569bb"] Apr 24 21:28:44.053965 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:28:44.053934 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc762af5f_7dc9_4a77_ad83_ac3d3233b5d5.slice/crio-14e87bee15559b8aec30cf12c458647b148b3ccf44cf71eab4b0125f23e469de WatchSource:0}: Error finding container 14e87bee15559b8aec30cf12c458647b148b3ccf44cf71eab4b0125f23e469de: Status 404 returned error can't find the container with id 14e87bee15559b8aec30cf12c458647b148b3ccf44cf71eab4b0125f23e469de Apr 24 21:28:44.658536 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:44.658504 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jhpmf" Apr 24 21:28:44.692931 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:44.692660 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb" event={"ID":"c762af5f-7dc9-4a77-ad83-ac3d3233b5d5","Type":"ContainerStarted","Data":"14e87bee15559b8aec30cf12c458647b148b3ccf44cf71eab4b0125f23e469de"} Apr 24 21:28:45.697546 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:45.697483 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb" event={"ID":"c762af5f-7dc9-4a77-ad83-ac3d3233b5d5","Type":"ContainerStarted","Data":"b51f80f08b78e8d269460f69953d73ed0e256c458ea0bdce17545b40171b5dc0"} Apr 24 21:28:45.697546 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:45.697530 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb" 
event={"ID":"c762af5f-7dc9-4a77-ad83-ac3d3233b5d5","Type":"ContainerStarted","Data":"5351c45b7c3204ca96e2cd7b99cb462ed9883afd107744add6de5567a980a552"} Apr 24 21:28:45.723432 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:45.723372 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-569bb" podStartSLOduration=2.509981236 podStartE2EDuration="3.723352462s" podCreationTimestamp="2026-04-24 21:28:42 +0000 UTC" firstStartedPulling="2026-04-24 21:28:44.056033597 +0000 UTC m=+62.154446094" lastFinishedPulling="2026-04-24 21:28:45.269404826 +0000 UTC m=+63.367817320" observedRunningTime="2026-04-24 21:28:45.72197431 +0000 UTC m=+63.820386823" watchObservedRunningTime="2026-04-24 21:28:45.723352462 +0000 UTC m=+63.821765006" Apr 24 21:28:47.370734 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.370650 2567 patch_prober.go:28] interesting pod/image-registry-79d7dbdd8d-zcbl7 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 21:28:47.371210 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.370714 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" podUID="d52bab75-adb9-452b-a696-d309c8c9d8c3" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:28:47.570704 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.570673 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-vtd9b"] Apr 24 21:28:47.605075 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.605042 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.607477 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.607450 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:28:47.607941 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.607903 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:28:47.608099 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.608083 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:28:47.608456 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.608397 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-sgkz5\"" Apr 24 21:28:47.732600 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.732544 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/079b286b-1432-4fa6-94b3-3b5066076fdf-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.732780 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.732615 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/079b286b-1432-4fa6-94b3-3b5066076fdf-node-exporter-tls\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.732780 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.732651 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmzq6\" (UniqueName: \"kubernetes.io/projected/079b286b-1432-4fa6-94b3-3b5066076fdf-kube-api-access-pmzq6\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.732780 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.732701 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/079b286b-1432-4fa6-94b3-3b5066076fdf-node-exporter-accelerators-collector-config\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.732780 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.732734 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/079b286b-1432-4fa6-94b3-3b5066076fdf-metrics-client-ca\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.732993 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.732860 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/079b286b-1432-4fa6-94b3-3b5066076fdf-sys\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.732993 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.732913 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/079b286b-1432-4fa6-94b3-3b5066076fdf-node-exporter-textfile\") pod \"node-exporter-vtd9b\" (UID: 
\"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.732993 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.732940 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/079b286b-1432-4fa6-94b3-3b5066076fdf-node-exporter-wtmp\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.732993 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.732963 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/079b286b-1432-4fa6-94b3-3b5066076fdf-root\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.833697 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.833664 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/079b286b-1432-4fa6-94b3-3b5066076fdf-sys\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.833871 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.833721 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/079b286b-1432-4fa6-94b3-3b5066076fdf-node-exporter-textfile\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.833871 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.833751 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/079b286b-1432-4fa6-94b3-3b5066076fdf-node-exporter-wtmp\") 
pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.833871 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.833780 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/079b286b-1432-4fa6-94b3-3b5066076fdf-root\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.833871 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.833780 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/079b286b-1432-4fa6-94b3-3b5066076fdf-sys\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.833871 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.833814 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/079b286b-1432-4fa6-94b3-3b5066076fdf-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.833871 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.833842 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/079b286b-1432-4fa6-94b3-3b5066076fdf-node-exporter-tls\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.833871 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.833867 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmzq6\" (UniqueName: 
\"kubernetes.io/projected/079b286b-1432-4fa6-94b3-3b5066076fdf-kube-api-access-pmzq6\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.834153 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.833917 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/079b286b-1432-4fa6-94b3-3b5066076fdf-node-exporter-accelerators-collector-config\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.834153 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.833924 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/079b286b-1432-4fa6-94b3-3b5066076fdf-node-exporter-wtmp\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.834153 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.833963 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/079b286b-1432-4fa6-94b3-3b5066076fdf-metrics-client-ca\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.834153 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.833985 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/079b286b-1432-4fa6-94b3-3b5066076fdf-root\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.834601 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.834529 2567 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/079b286b-1432-4fa6-94b3-3b5066076fdf-node-exporter-accelerators-collector-config\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.834730 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.834710 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/079b286b-1432-4fa6-94b3-3b5066076fdf-metrics-client-ca\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.836867 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.836834 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/079b286b-1432-4fa6-94b3-3b5066076fdf-node-exporter-tls\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.843704 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.843657 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmzq6\" (UniqueName: \"kubernetes.io/projected/079b286b-1432-4fa6-94b3-3b5066076fdf-kube-api-access-pmzq6\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.846197 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.846143 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/079b286b-1432-4fa6-94b3-3b5066076fdf-node-exporter-textfile\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b" Apr 24 21:28:47.847961 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.847939 
2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/079b286b-1432-4fa6-94b3-3b5066076fdf-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vtd9b\" (UID: \"079b286b-1432-4fa6-94b3-3b5066076fdf\") " pod="openshift-monitoring/node-exporter-vtd9b"
Apr 24 21:28:47.916403 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:47.916373 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-vtd9b"
Apr 24 21:28:48.136543 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:48.136456 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs\") pod \"network-metrics-daemon-x6s8x\" (UID: \"52f8223b-f29e-4bac-bf1e-475d1a24a90c\") " pod="openshift-multus/network-metrics-daemon-x6s8x"
Apr 24 21:28:48.138816 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:48.138791 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 21:28:48.150288 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:48.150261 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52f8223b-f29e-4bac-bf1e-475d1a24a90c-metrics-certs\") pod \"network-metrics-daemon-x6s8x\" (UID: \"52f8223b-f29e-4bac-bf1e-475d1a24a90c\") " pod="openshift-multus/network-metrics-daemon-x6s8x"
Apr 24 21:28:48.235600 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:48.235550 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7528m\""
Apr 24 21:28:48.237239 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:48.237212 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdtz2\" (UniqueName: \"kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2\") pod \"network-check-target-jjnzv\" (UID: \"63ad21c9-3529-456b-b932-b0cb7555c6a5\") " pod="openshift-network-diagnostics/network-check-target-jjnzv"
Apr 24 21:28:48.239671 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:48.239652 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 21:28:48.243790 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:48.243774 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6s8x"
Apr 24 21:28:48.249432 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:48.249414 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 21:28:48.260837 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:48.260814 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdtz2\" (UniqueName: \"kubernetes.io/projected/63ad21c9-3529-456b-b932-b0cb7555c6a5-kube-api-access-mdtz2\") pod \"network-check-target-jjnzv\" (UID: \"63ad21c9-3529-456b-b932-b0cb7555c6a5\") " pod="openshift-network-diagnostics/network-check-target-jjnzv"
Apr 24 21:28:48.524988 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:48.524949 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-q7cg2\""
Apr 24 21:28:48.533305 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:48.533267 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jjnzv"
Apr 24 21:28:53.727562 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:28:53.727529 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod079b286b_1432_4fa6_94b3_3b5066076fdf.slice/crio-19b0bd2a4896c1fac28dc0ac4f8279133d5af944400e4b7c823c05efc541434b WatchSource:0}: Error finding container 19b0bd2a4896c1fac28dc0ac4f8279133d5af944400e4b7c823c05efc541434b: Status 404 returned error can't find the container with id 19b0bd2a4896c1fac28dc0ac4f8279133d5af944400e4b7c823c05efc541434b
Apr 24 21:28:53.756384 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.756057 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68967bf444-dcl5n"]
Apr 24 21:28:53.762160 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.761162 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:53.765740 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.764900 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-rq5kh\""
Apr 24 21:28:53.765740 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.765169 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 21:28:53.765740 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.765367 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 21:28:53.765740 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.765570 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 21:28:53.766244 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.766213 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 21:28:53.767896 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.767702 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 21:28:53.775900 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.774766 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 24 21:28:53.775900 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.774767 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68967bf444-dcl5n"]
Apr 24 21:28:53.886205 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.886176 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jjnzv"]
Apr 24 21:28:53.889052 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.889024 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-console-serving-cert\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:53.889170 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.889069 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-console-oauth-config\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:53.889170 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:28:53.889117 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63ad21c9_3529_456b_b932_b0cb7555c6a5.slice/crio-8345684885a8b182c85be149b44544ee16a3c567826decbe762e5c66c048acfd WatchSource:0}: Error finding container 8345684885a8b182c85be149b44544ee16a3c567826decbe762e5c66c048acfd: Status 404 returned error can't find the container with id 8345684885a8b182c85be149b44544ee16a3c567826decbe762e5c66c048acfd
Apr 24 21:28:53.889170 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.889160 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-service-ca\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:53.889323 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.889196 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtrkg\" (UniqueName: \"kubernetes.io/projected/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-kube-api-access-rtrkg\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:53.889323 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.889253 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-console-config\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:53.889323 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.889292 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-oauth-serving-cert\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:53.889460 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.889345 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-trusted-ca-bundle\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:53.900628 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.900593 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x6s8x"]
Apr 24 21:28:53.905505 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:28:53.905478 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52f8223b_f29e_4bac_bf1e_475d1a24a90c.slice/crio-d484dc2ef1fa29dfebc23723503f16e93c000a186cfe81b790ab0af57ac99066 WatchSource:0}: Error finding container d484dc2ef1fa29dfebc23723503f16e93c000a186cfe81b790ab0af57ac99066: Status 404 returned error can't find the container with id d484dc2ef1fa29dfebc23723503f16e93c000a186cfe81b790ab0af57ac99066
Apr 24 21:28:53.990398 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.990311 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-console-serving-cert\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:53.990398 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.990349 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-console-oauth-config\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:53.990398 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.990376 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-service-ca\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:53.990398 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.990393 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtrkg\" (UniqueName: \"kubernetes.io/projected/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-kube-api-access-rtrkg\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:53.990720 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.990411 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-console-config\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:53.990720 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.990435 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-oauth-serving-cert\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:53.990720 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.990494 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-trusted-ca-bundle\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:53.991389 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.991231 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-service-ca\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:53.991389 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.991284 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-console-config\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:53.991626 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.991382 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-oauth-serving-cert\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:53.991705 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.991683 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-trusted-ca-bundle\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:53.993632 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.993148 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-console-serving-cert\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:53.993632 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:53.993599 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-console-oauth-config\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:54.002317 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:54.002294 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtrkg\" (UniqueName: \"kubernetes.io/projected/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-kube-api-access-rtrkg\") pod \"console-68967bf444-dcl5n\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:54.078350 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:54.078310 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:28:54.247221 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:54.247135 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68967bf444-dcl5n"]
Apr 24 21:28:54.256467 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:28:54.255360 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f210ae2_4e30_4475_83b4_1abe14cf4b9b.slice/crio-502399fc85be018bd561217ff25eec73e17d994aaf390c627488ecefca374dde WatchSource:0}: Error finding container 502399fc85be018bd561217ff25eec73e17d994aaf390c627488ecefca374dde: Status 404 returned error can't find the container with id 502399fc85be018bd561217ff25eec73e17d994aaf390c627488ecefca374dde
Apr 24 21:28:54.723982 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:54.723919 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jjnzv" event={"ID":"63ad21c9-3529-456b-b932-b0cb7555c6a5","Type":"ContainerStarted","Data":"8345684885a8b182c85be149b44544ee16a3c567826decbe762e5c66c048acfd"}
Apr 24 21:28:54.726204 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:54.726153 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-c6p8l" event={"ID":"00358327-c969-49bb-a961-6bbb496280a7","Type":"ContainerStarted","Data":"1d6040ba91818abab565cada4e155a4241f46fe38cd9c2b5c2d57b245129abab"}
Apr 24 21:28:54.727226 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:54.727177 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-c6p8l"
Apr 24 21:28:54.729527 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:54.729502 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x6s8x" event={"ID":"52f8223b-f29e-4bac-bf1e-475d1a24a90c","Type":"ContainerStarted","Data":"d484dc2ef1fa29dfebc23723503f16e93c000a186cfe81b790ab0af57ac99066"}
Apr 24 21:28:54.731361 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:54.731337 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68967bf444-dcl5n" event={"ID":"4f210ae2-4e30-4475-83b4-1abe14cf4b9b","Type":"ContainerStarted","Data":"502399fc85be018bd561217ff25eec73e17d994aaf390c627488ecefca374dde"}
Apr 24 21:28:54.732646 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:54.732612 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vtd9b" event={"ID":"079b286b-1432-4fa6-94b3-3b5066076fdf","Type":"ContainerStarted","Data":"19b0bd2a4896c1fac28dc0ac4f8279133d5af944400e4b7c823c05efc541434b"}
Apr 24 21:28:54.741755 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:54.741732 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-c6p8l"
Apr 24 21:28:54.748158 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:54.748105 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-c6p8l" podStartSLOduration=1.7180380739999999 podStartE2EDuration="17.748089911s" podCreationTimestamp="2026-04-24 21:28:37 +0000 UTC" firstStartedPulling="2026-04-24 21:28:37.816440265 +0000 UTC m=+55.914852759" lastFinishedPulling="2026-04-24 21:28:53.846492101 +0000 UTC m=+71.944904596" observedRunningTime="2026-04-24 21:28:54.746771645 +0000 UTC m=+72.845184169" watchObservedRunningTime="2026-04-24 21:28:54.748089911 +0000 UTC m=+72.846502426"
Apr 24 21:28:56.745660 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:56.745566 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x6s8x" event={"ID":"52f8223b-f29e-4bac-bf1e-475d1a24a90c","Type":"ContainerStarted","Data":"e0fe2219274dd17cebee3d6cc1ab685630292ae5b574dcdc402364bd7602f9ac"}
Apr 24 21:28:56.745660 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:56.745635 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x6s8x" event={"ID":"52f8223b-f29e-4bac-bf1e-475d1a24a90c","Type":"ContainerStarted","Data":"60c39bec4bb33766889e8f396761ef047c5782eb048c18629d8e765ea5a16e0f"}
Apr 24 21:28:56.749084 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:56.748092 2567 generic.go:358] "Generic (PLEG): container finished" podID="079b286b-1432-4fa6-94b3-3b5066076fdf" containerID="ee01a8df837b4570ae66f1142c61688d4455c7082707971485ea6e6180ec8ebc" exitCode=0
Apr 24 21:28:56.749232 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:56.749068 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vtd9b" event={"ID":"079b286b-1432-4fa6-94b3-3b5066076fdf","Type":"ContainerDied","Data":"ee01a8df837b4570ae66f1142c61688d4455c7082707971485ea6e6180ec8ebc"}
Apr 24 21:28:56.764933 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:56.764881 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-x6s8x" podStartSLOduration=73.019796178 podStartE2EDuration="1m14.764863372s" podCreationTimestamp="2026-04-24 21:27:42 +0000 UTC" firstStartedPulling="2026-04-24 21:28:53.90744272 +0000 UTC m=+72.005855211" lastFinishedPulling="2026-04-24 21:28:55.652509914 +0000 UTC m=+73.750922405" observedRunningTime="2026-04-24 21:28:56.763141273 +0000 UTC m=+74.861553786" watchObservedRunningTime="2026-04-24 21:28:56.764863372 +0000 UTC m=+74.863275889"
Apr 24 21:28:57.369100 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:57.369070 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7"
Apr 24 21:28:59.680843 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:59.680818 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7584f55f55-p88k5"
Apr 24 21:28:59.759965 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:59.759900 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68967bf444-dcl5n" event={"ID":"4f210ae2-4e30-4475-83b4-1abe14cf4b9b","Type":"ContainerStarted","Data":"7e4f4d3a25b068df6134ed8e0769da32b55cf30103388b96d7456e898e8a172f"}
Apr 24 21:28:59.762836 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:59.762767 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vtd9b" event={"ID":"079b286b-1432-4fa6-94b3-3b5066076fdf","Type":"ContainerStarted","Data":"cd6cb2c69d8501dc79a601b8ba5ed04935ed63c950217f7016eb941dec42da15"}
Apr 24 21:28:59.765122 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:59.764801 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jjnzv" event={"ID":"63ad21c9-3529-456b-b932-b0cb7555c6a5","Type":"ContainerStarted","Data":"3e42726c7c5f8bf7db0f8339e3583e4dba02ab5fa4898e7513f03db532f590c9"}
Apr 24 21:28:59.765122 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:59.765044 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jjnzv"
Apr 24 21:28:59.791540 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:59.791481 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68967bf444-dcl5n" podStartSLOduration=1.480776194 podStartE2EDuration="6.791465371s" podCreationTimestamp="2026-04-24 21:28:53 +0000 UTC" firstStartedPulling="2026-04-24 21:28:54.260102746 +0000 UTC m=+72.358515237" lastFinishedPulling="2026-04-24 21:28:59.570791906 +0000 UTC m=+77.669204414" observedRunningTime="2026-04-24 21:28:59.791189876 +0000 UTC m=+77.889602385" watchObservedRunningTime="2026-04-24 21:28:59.791465371 +0000 UTC m=+77.889877884"
Apr 24 21:28:59.811157 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:28:59.810905 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jjnzv" podStartSLOduration=72.138358404 podStartE2EDuration="1m17.810886361s" podCreationTimestamp="2026-04-24 21:27:42 +0000 UTC" firstStartedPulling="2026-04-24 21:28:53.891177808 +0000 UTC m=+71.989590300" lastFinishedPulling="2026-04-24 21:28:59.563705737 +0000 UTC m=+77.662118257" observedRunningTime="2026-04-24 21:28:59.809679632 +0000 UTC m=+77.908092146" watchObservedRunningTime="2026-04-24 21:28:59.810886361 +0000 UTC m=+77.909298875"
Apr 24 21:29:00.771211 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:00.771156 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vtd9b" event={"ID":"079b286b-1432-4fa6-94b3-3b5066076fdf","Type":"ContainerStarted","Data":"61384c701f95081d5d76ae871bc1a928e4ecd01c35fb263b2c808870277ff4db"}
Apr 24 21:29:02.382313 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.382139 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" podUID="d52bab75-adb9-452b-a696-d309c8c9d8c3" containerName="registry" containerID="cri-o://4a80bd74057e0527b1a74c3e38551ea2a431faf1dba721cb60097f3c42e9d20a" gracePeriod=30
Apr 24 21:29:02.641334 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.641265 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7"
Apr 24 21:29:02.670673 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.670628 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-vtd9b" podStartSLOduration=13.754689462 podStartE2EDuration="15.670610516s" podCreationTimestamp="2026-04-24 21:28:47 +0000 UTC" firstStartedPulling="2026-04-24 21:28:53.732470209 +0000 UTC m=+71.830882715" lastFinishedPulling="2026-04-24 21:28:55.648391273 +0000 UTC m=+73.746803769" observedRunningTime="2026-04-24 21:29:00.792536371 +0000 UTC m=+78.890948886" watchObservedRunningTime="2026-04-24 21:29:02.670610516 +0000 UTC m=+80.769023030"
Apr 24 21:29:02.771013 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.770974 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-bound-sa-token\") pod \"d52bab75-adb9-452b-a696-d309c8c9d8c3\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") "
Apr 24 21:29:02.771013 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.771021 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d52bab75-adb9-452b-a696-d309c8c9d8c3-trusted-ca\") pod \"d52bab75-adb9-452b-a696-d309c8c9d8c3\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") "
Apr 24 21:29:02.771262 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.771075 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls\") pod \"d52bab75-adb9-452b-a696-d309c8c9d8c3\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") "
Apr 24 21:29:02.771262 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.771107 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-certificates\") pod \"d52bab75-adb9-452b-a696-d309c8c9d8c3\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") "
Apr 24 21:29:02.771506 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.771479 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d52bab75-adb9-452b-a696-d309c8c9d8c3" (UID: "d52bab75-adb9-452b-a696-d309c8c9d8c3"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:29:02.771605 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.771477 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d52bab75-adb9-452b-a696-d309c8c9d8c3-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d52bab75-adb9-452b-a696-d309c8c9d8c3" (UID: "d52bab75-adb9-452b-a696-d309c8c9d8c3"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:29:02.771605 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.771521 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d52bab75-adb9-452b-a696-d309c8c9d8c3-installation-pull-secrets\") pod \"d52bab75-adb9-452b-a696-d309c8c9d8c3\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") "
Apr 24 21:29:02.771605 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.771575 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg4nr\" (UniqueName: \"kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-kube-api-access-sg4nr\") pod \"d52bab75-adb9-452b-a696-d309c8c9d8c3\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") "
Apr 24 21:29:02.771780 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.771634 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d52bab75-adb9-452b-a696-d309c8c9d8c3-image-registry-private-configuration\") pod \"d52bab75-adb9-452b-a696-d309c8c9d8c3\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") "
Apr 24 21:29:02.771780 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.771658 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d52bab75-adb9-452b-a696-d309c8c9d8c3-ca-trust-extracted\") pod \"d52bab75-adb9-452b-a696-d309c8c9d8c3\" (UID: \"d52bab75-adb9-452b-a696-d309c8c9d8c3\") "
Apr 24 21:29:02.771887 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.771841 2567 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-certificates\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:29:02.771887 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.771860 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d52bab75-adb9-452b-a696-d309c8c9d8c3-trusted-ca\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:29:02.774229 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.774186 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52bab75-adb9-452b-a696-d309c8c9d8c3-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d52bab75-adb9-452b-a696-d309c8c9d8c3" (UID: "d52bab75-adb9-452b-a696-d309c8c9d8c3"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:29:02.774347 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.774266 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d52bab75-adb9-452b-a696-d309c8c9d8c3" (UID: "d52bab75-adb9-452b-a696-d309c8c9d8c3"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:29:02.774752 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.774572 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d52bab75-adb9-452b-a696-d309c8c9d8c3" (UID: "d52bab75-adb9-452b-a696-d309c8c9d8c3"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:29:02.774752 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.774633 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52bab75-adb9-452b-a696-d309c8c9d8c3-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "d52bab75-adb9-452b-a696-d309c8c9d8c3" (UID: "d52bab75-adb9-452b-a696-d309c8c9d8c3"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:29:02.774752 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.774703 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-kube-api-access-sg4nr" (OuterVolumeSpecName: "kube-api-access-sg4nr") pod "d52bab75-adb9-452b-a696-d309c8c9d8c3" (UID: "d52bab75-adb9-452b-a696-d309c8c9d8c3"). InnerVolumeSpecName "kube-api-access-sg4nr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:29:02.779246 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.779086 2567 generic.go:358] "Generic (PLEG): container finished" podID="d52bab75-adb9-452b-a696-d309c8c9d8c3" containerID="4a80bd74057e0527b1a74c3e38551ea2a431faf1dba721cb60097f3c42e9d20a" exitCode=0
Apr 24 21:29:02.779246 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.779158 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7"
Apr 24 21:29:02.779454 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.779161 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" event={"ID":"d52bab75-adb9-452b-a696-d309c8c9d8c3","Type":"ContainerDied","Data":"4a80bd74057e0527b1a74c3e38551ea2a431faf1dba721cb60097f3c42e9d20a"}
Apr 24 21:29:02.779454 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.779447 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-79d7dbdd8d-zcbl7" event={"ID":"d52bab75-adb9-452b-a696-d309c8c9d8c3","Type":"ContainerDied","Data":"b652e7ae5b7fef3c30809bea07053586ecb8247673961834273586bebe19b0c6"}
Apr 24 21:29:02.779515 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.779477 2567 scope.go:117] "RemoveContainer" containerID="4a80bd74057e0527b1a74c3e38551ea2a431faf1dba721cb60097f3c42e9d20a"
Apr 24 21:29:02.783091 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.783051 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d52bab75-adb9-452b-a696-d309c8c9d8c3-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d52bab75-adb9-452b-a696-d309c8c9d8c3" (UID: "d52bab75-adb9-452b-a696-d309c8c9d8c3"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:29:02.793089 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.793065 2567 scope.go:117] "RemoveContainer" containerID="4a80bd74057e0527b1a74c3e38551ea2a431faf1dba721cb60097f3c42e9d20a"
Apr 24 21:29:02.793403 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:29:02.793383 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a80bd74057e0527b1a74c3e38551ea2a431faf1dba721cb60097f3c42e9d20a\": container with ID starting with 4a80bd74057e0527b1a74c3e38551ea2a431faf1dba721cb60097f3c42e9d20a not found: ID does not exist" containerID="4a80bd74057e0527b1a74c3e38551ea2a431faf1dba721cb60097f3c42e9d20a"
Apr 24 21:29:02.793485 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.793410 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a80bd74057e0527b1a74c3e38551ea2a431faf1dba721cb60097f3c42e9d20a"} err="failed to get container status \"4a80bd74057e0527b1a74c3e38551ea2a431faf1dba721cb60097f3c42e9d20a\": rpc error: code = NotFound desc = could not find container \"4a80bd74057e0527b1a74c3e38551ea2a431faf1dba721cb60097f3c42e9d20a\": container with ID starting with 4a80bd74057e0527b1a74c3e38551ea2a431faf1dba721cb60097f3c42e9d20a not found: ID does not exist"
Apr 24 21:29:02.872895 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.872862 2567 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-registry-tls\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:29:02.872895 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.872895 2567 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d52bab75-adb9-452b-a696-d309c8c9d8c3-installation-pull-secrets\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:29:02.873080 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.872913 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sg4nr\" (UniqueName: \"kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-kube-api-access-sg4nr\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:29:02.873080 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.872922 2567 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d52bab75-adb9-452b-a696-d309c8c9d8c3-image-registry-private-configuration\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:29:02.873080 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.872932 2567 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d52bab75-adb9-452b-a696-d309c8c9d8c3-ca-trust-extracted\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:29:02.873080 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:02.872944 2567 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d52bab75-adb9-452b-a696-d309c8c9d8c3-bound-sa-token\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:29:03.105573 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:03.105540 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-79d7dbdd8d-zcbl7"]
Apr 24 21:29:03.110222 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:03.110197 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-79d7dbdd8d-zcbl7"]
Apr 24 21:29:04.078870 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:04.078838 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68967bf444-dcl5n"
Apr 24 21:29:04.079307 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:04.078882
2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68967bf444-dcl5n" Apr 24 21:29:04.083686 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:04.083665 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68967bf444-dcl5n" Apr 24 21:29:04.420040 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:04.420008 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d52bab75-adb9-452b-a696-d309c8c9d8c3" path="/var/lib/kubelet/pods/d52bab75-adb9-452b-a696-d309c8c9d8c3/volumes" Apr 24 21:29:04.791395 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:04.791319 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68967bf444-dcl5n" Apr 24 21:29:30.774670 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:29:30.774642 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jjnzv" Apr 24 21:30:06.137451 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.137401 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-85fb65bf9-8q75c"] Apr 24 21:30:06.137939 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.137803 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d52bab75-adb9-452b-a696-d309c8c9d8c3" containerName="registry" Apr 24 21:30:06.137939 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.137819 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="d52bab75-adb9-452b-a696-d309c8c9d8c3" containerName="registry" Apr 24 21:30:06.137939 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.137871 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="d52bab75-adb9-452b-a696-d309c8c9d8c3" containerName="registry" Apr 24 21:30:06.140672 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.140656 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.152762 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.152738 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85fb65bf9-8q75c"] Apr 24 21:30:06.218885 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.218843 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-oauth-serving-cert\") pod \"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.219090 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.218954 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-service-ca\") pod \"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.219090 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.218998 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv59s\" (UniqueName: \"kubernetes.io/projected/31e99f0d-fac5-4299-9221-6832b591acbf-kube-api-access-fv59s\") pod \"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.219090 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.219073 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-console-config\") pod \"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.219246 
ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.219146 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/31e99f0d-fac5-4299-9221-6832b591acbf-console-oauth-config\") pod \"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.219246 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.219168 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-trusted-ca-bundle\") pod \"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.219246 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.219222 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/31e99f0d-fac5-4299-9221-6832b591acbf-console-serving-cert\") pod \"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.320458 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.320428 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/31e99f0d-fac5-4299-9221-6832b591acbf-console-oauth-config\") pod \"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.320458 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.320460 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-trusted-ca-bundle\") pod 
\"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.320745 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.320478 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/31e99f0d-fac5-4299-9221-6832b591acbf-console-serving-cert\") pod \"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.320745 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.320512 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-oauth-serving-cert\") pod \"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.320745 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.320665 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-service-ca\") pod \"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.320745 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.320723 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fv59s\" (UniqueName: \"kubernetes.io/projected/31e99f0d-fac5-4299-9221-6832b591acbf-kube-api-access-fv59s\") pod \"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.320954 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.320766 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-console-config\") pod \"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.321226 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.321205 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-oauth-serving-cert\") pod \"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.321395 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.321369 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-trusted-ca-bundle\") pod \"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.321496 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.321426 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-console-config\") pod \"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.321496 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.321427 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-service-ca\") pod \"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.323026 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.323004 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/31e99f0d-fac5-4299-9221-6832b591acbf-console-serving-cert\") pod \"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.323129 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.323109 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/31e99f0d-fac5-4299-9221-6832b591acbf-console-oauth-config\") pod \"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.332224 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.332204 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv59s\" (UniqueName: \"kubernetes.io/projected/31e99f0d-fac5-4299-9221-6832b591acbf-kube-api-access-fv59s\") pod \"console-85fb65bf9-8q75c\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") " pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.450846 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.450817 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:06.581089 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.581065 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85fb65bf9-8q75c"] Apr 24 21:30:06.582926 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:30:06.582903 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31e99f0d_fac5_4299_9221_6832b591acbf.slice/crio-c86e3fa9dd188309917cc5e58ba28e62cd39d2b325871c204b6cc72e2fdf7757 WatchSource:0}: Error finding container c86e3fa9dd188309917cc5e58ba28e62cd39d2b325871c204b6cc72e2fdf7757: Status 404 returned error can't find the container with id c86e3fa9dd188309917cc5e58ba28e62cd39d2b325871c204b6cc72e2fdf7757 Apr 24 21:30:06.958725 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.958689 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85fb65bf9-8q75c" event={"ID":"31e99f0d-fac5-4299-9221-6832b591acbf","Type":"ContainerStarted","Data":"f498e57a538c85f30b4b5862e52df58876a2c66266d7c638d587f6c151aba26f"} Apr 24 21:30:06.958903 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:06.958728 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85fb65bf9-8q75c" event={"ID":"31e99f0d-fac5-4299-9221-6832b591acbf","Type":"ContainerStarted","Data":"c86e3fa9dd188309917cc5e58ba28e62cd39d2b325871c204b6cc72e2fdf7757"} Apr 24 21:30:16.451976 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:16.451943 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:16.452455 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:16.452035 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:16.456672 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:16.456651 2567 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:16.478708 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:16.478660 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85fb65bf9-8q75c" podStartSLOduration=10.478649568 podStartE2EDuration="10.478649568s" podCreationTimestamp="2026-04-24 21:30:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:30:06.99318582 +0000 UTC m=+145.091598329" watchObservedRunningTime="2026-04-24 21:30:16.478649568 +0000 UTC m=+154.577062085" Apr 24 21:30:16.989037 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:16.989008 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-85fb65bf9-8q75c" Apr 24 21:30:17.076145 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:17.076118 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68967bf444-dcl5n"] Apr 24 21:30:42.094882 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.094771 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-68967bf444-dcl5n" podUID="4f210ae2-4e30-4475-83b4-1abe14cf4b9b" containerName="console" containerID="cri-o://7e4f4d3a25b068df6134ed8e0769da32b55cf30103388b96d7456e898e8a172f" gracePeriod=15 Apr 24 21:30:42.329106 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.329084 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68967bf444-dcl5n_4f210ae2-4e30-4475-83b4-1abe14cf4b9b/console/0.log" Apr 24 21:30:42.329214 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.329141 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68967bf444-dcl5n" Apr 24 21:30:42.501492 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.501460 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtrkg\" (UniqueName: \"kubernetes.io/projected/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-kube-api-access-rtrkg\") pod \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " Apr 24 21:30:42.501682 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.501521 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-console-oauth-config\") pod \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " Apr 24 21:30:42.501682 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.501649 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-console-config\") pod \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " Apr 24 21:30:42.501682 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.501679 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-oauth-serving-cert\") pod \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " Apr 24 21:30:42.501844 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.501721 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-service-ca\") pod \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " Apr 24 21:30:42.501844 
ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.501756 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-console-serving-cert\") pod \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " Apr 24 21:30:42.501844 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.501815 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-trusted-ca-bundle\") pod \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\" (UID: \"4f210ae2-4e30-4475-83b4-1abe14cf4b9b\") " Apr 24 21:30:42.502435 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.502406 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4f210ae2-4e30-4475-83b4-1abe14cf4b9b" (UID: "4f210ae2-4e30-4475-83b4-1abe14cf4b9b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:42.505020 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.504991 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-kube-api-access-rtrkg" (OuterVolumeSpecName: "kube-api-access-rtrkg") pod "4f210ae2-4e30-4475-83b4-1abe14cf4b9b" (UID: "4f210ae2-4e30-4475-83b4-1abe14cf4b9b"). InnerVolumeSpecName "kube-api-access-rtrkg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:42.505730 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.505703 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-console-config" (OuterVolumeSpecName: "console-config") pod "4f210ae2-4e30-4475-83b4-1abe14cf4b9b" (UID: "4f210ae2-4e30-4475-83b4-1abe14cf4b9b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:42.505730 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.505716 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-service-ca" (OuterVolumeSpecName: "service-ca") pod "4f210ae2-4e30-4475-83b4-1abe14cf4b9b" (UID: "4f210ae2-4e30-4475-83b4-1abe14cf4b9b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:42.505881 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.505701 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4f210ae2-4e30-4475-83b4-1abe14cf4b9b" (UID: "4f210ae2-4e30-4475-83b4-1abe14cf4b9b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:42.507130 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.507108 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4f210ae2-4e30-4475-83b4-1abe14cf4b9b" (UID: "4f210ae2-4e30-4475-83b4-1abe14cf4b9b"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:42.507434 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.507413 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4f210ae2-4e30-4475-83b4-1abe14cf4b9b" (UID: "4f210ae2-4e30-4475-83b4-1abe14cf4b9b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:42.602421 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.602381 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-console-config\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:30:42.602421 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.602415 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-oauth-serving-cert\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:30:42.602421 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.602426 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-service-ca\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:30:42.602421 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.602435 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-console-serving-cert\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:30:42.602738 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.602444 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-trusted-ca-bundle\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:30:42.602738 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.602453 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rtrkg\" (UniqueName: \"kubernetes.io/projected/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-kube-api-access-rtrkg\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:30:42.602738 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:42.602461 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f210ae2-4e30-4475-83b4-1abe14cf4b9b-console-oauth-config\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:30:43.055066 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:43.055035 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68967bf444-dcl5n_4f210ae2-4e30-4475-83b4-1abe14cf4b9b/console/0.log" Apr 24 21:30:43.055251 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:43.055080 2567 generic.go:358] "Generic (PLEG): container finished" podID="4f210ae2-4e30-4475-83b4-1abe14cf4b9b" containerID="7e4f4d3a25b068df6134ed8e0769da32b55cf30103388b96d7456e898e8a172f" exitCode=2 Apr 24 21:30:43.055251 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:43.055114 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68967bf444-dcl5n" event={"ID":"4f210ae2-4e30-4475-83b4-1abe14cf4b9b","Type":"ContainerDied","Data":"7e4f4d3a25b068df6134ed8e0769da32b55cf30103388b96d7456e898e8a172f"} Apr 24 21:30:43.055251 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:43.055161 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68967bf444-dcl5n" event={"ID":"4f210ae2-4e30-4475-83b4-1abe14cf4b9b","Type":"ContainerDied","Data":"502399fc85be018bd561217ff25eec73e17d994aaf390c627488ecefca374dde"} Apr 24 21:30:43.055251 
ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:43.055169 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68967bf444-dcl5n" Apr 24 21:30:43.055251 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:43.055181 2567 scope.go:117] "RemoveContainer" containerID="7e4f4d3a25b068df6134ed8e0769da32b55cf30103388b96d7456e898e8a172f" Apr 24 21:30:43.062978 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:43.062952 2567 scope.go:117] "RemoveContainer" containerID="7e4f4d3a25b068df6134ed8e0769da32b55cf30103388b96d7456e898e8a172f" Apr 24 21:30:43.063226 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:30:43.063205 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e4f4d3a25b068df6134ed8e0769da32b55cf30103388b96d7456e898e8a172f\": container with ID starting with 7e4f4d3a25b068df6134ed8e0769da32b55cf30103388b96d7456e898e8a172f not found: ID does not exist" containerID="7e4f4d3a25b068df6134ed8e0769da32b55cf30103388b96d7456e898e8a172f" Apr 24 21:30:43.063315 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:43.063231 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e4f4d3a25b068df6134ed8e0769da32b55cf30103388b96d7456e898e8a172f"} err="failed to get container status \"7e4f4d3a25b068df6134ed8e0769da32b55cf30103388b96d7456e898e8a172f\": rpc error: code = NotFound desc = could not find container \"7e4f4d3a25b068df6134ed8e0769da32b55cf30103388b96d7456e898e8a172f\": container with ID starting with 7e4f4d3a25b068df6134ed8e0769da32b55cf30103388b96d7456e898e8a172f not found: ID does not exist" Apr 24 21:30:43.082942 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:43.082913 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68967bf444-dcl5n"] Apr 24 21:30:43.085122 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:43.085100 2567 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-console/console-68967bf444-dcl5n"]
Apr 24 21:30:44.416390 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:30:44.416354 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f210ae2-4e30-4475-83b4-1abe14cf4b9b" path="/var/lib/kubelet/pods/4f210ae2-4e30-4475-83b4-1abe14cf4b9b/volumes"
Apr 24 21:31:33.598848 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.598814 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-854bc64795-4jdkf"]
Apr 24 21:31:33.599277 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.599090 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f210ae2-4e30-4475-83b4-1abe14cf4b9b" containerName="console"
Apr 24 21:31:33.599277 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.599101 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f210ae2-4e30-4475-83b4-1abe14cf4b9b" containerName="console"
Apr 24 21:31:33.599277 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.599140 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f210ae2-4e30-4475-83b4-1abe14cf4b9b" containerName="console"
Apr 24 21:31:33.602001 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.601985 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.619725 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.619697 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-854bc64795-4jdkf"]
Apr 24 21:31:33.663596 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.663560 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0d3d358-1cc0-4830-baca-df4d99001501-console-serving-cert\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.663765 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.663613 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0d3d358-1cc0-4830-baca-df4d99001501-console-oauth-config\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.663765 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.663683 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-oauth-serving-cert\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.663765 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.663729 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-service-ca\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.663765 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.663757 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdszh\" (UniqueName: \"kubernetes.io/projected/b0d3d358-1cc0-4830-baca-df4d99001501-kube-api-access-cdszh\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.663950 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.663803 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-console-config\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.663950 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.663832 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-trusted-ca-bundle\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.764597 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.764554 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-service-ca\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.764757 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.764626 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdszh\" (UniqueName: \"kubernetes.io/projected/b0d3d358-1cc0-4830-baca-df4d99001501-kube-api-access-cdszh\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.764757 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.764659 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-console-config\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.764757 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.764683 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-trusted-ca-bundle\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.764757 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.764744 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0d3d358-1cc0-4830-baca-df4d99001501-console-serving-cert\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.764948 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.764864 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0d3d358-1cc0-4830-baca-df4d99001501-console-oauth-config\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.764948 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.764916 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-oauth-serving-cert\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.765426 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.765398 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-service-ca\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.765574 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.765436 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-console-config\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.765673 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.765652 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-oauth-serving-cert\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.766021 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.766001 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-trusted-ca-bundle\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.767272 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.767252 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0d3d358-1cc0-4830-baca-df4d99001501-console-oauth-config\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.767272 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.767261 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0d3d358-1cc0-4830-baca-df4d99001501-console-serving-cert\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.774655 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.774630 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdszh\" (UniqueName: \"kubernetes.io/projected/b0d3d358-1cc0-4830-baca-df4d99001501-kube-api-access-cdszh\") pod \"console-854bc64795-4jdkf\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:33.910643 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:33.910613 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:34.036715 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:34.036688 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-854bc64795-4jdkf"]
Apr 24 21:31:34.039281 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:31:34.039248 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0d3d358_1cc0_4830_baca_df4d99001501.slice/crio-57ccfd22a49c7a95b4eabefca8a2da7152b76f704a5409b76a4389d2a8e33148 WatchSource:0}: Error finding container 57ccfd22a49c7a95b4eabefca8a2da7152b76f704a5409b76a4389d2a8e33148: Status 404 returned error can't find the container with id 57ccfd22a49c7a95b4eabefca8a2da7152b76f704a5409b76a4389d2a8e33148
Apr 24 21:31:34.196761 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:34.196687 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-854bc64795-4jdkf" event={"ID":"b0d3d358-1cc0-4830-baca-df4d99001501","Type":"ContainerStarted","Data":"f6d3adeea6b5159506cdf6846fe1b3ffcb50dc57724bbcc8d945207a1f7baa87"}
Apr 24 21:31:34.196761 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:34.196725 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-854bc64795-4jdkf" event={"ID":"b0d3d358-1cc0-4830-baca-df4d99001501","Type":"ContainerStarted","Data":"57ccfd22a49c7a95b4eabefca8a2da7152b76f704a5409b76a4389d2a8e33148"}
Apr 24 21:31:34.217957 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:34.217919 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-854bc64795-4jdkf" podStartSLOduration=1.21790514 podStartE2EDuration="1.21790514s" podCreationTimestamp="2026-04-24 21:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:31:34.217534464 +0000 UTC m=+232.315946974" watchObservedRunningTime="2026-04-24 21:31:34.21790514 +0000 UTC m=+232.316317652"
Apr 24 21:31:43.911007 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:43.910960 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:43.911007 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:43.911013 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:43.915811 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:43.915791 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:44.227638 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:44.227549 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-854bc64795-4jdkf"
Apr 24 21:31:44.282872 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:31:44.282843 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85fb65bf9-8q75c"]
Apr 24 21:32:09.302152 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.302045 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-85fb65bf9-8q75c" podUID="31e99f0d-fac5-4299-9221-6832b591acbf" containerName="console" containerID="cri-o://f498e57a538c85f30b4b5862e52df58876a2c66266d7c638d587f6c151aba26f" gracePeriod=15
Apr 24 21:32:09.534205 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.534184 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85fb65bf9-8q75c_31e99f0d-fac5-4299-9221-6832b591acbf/console/0.log"
Apr 24 21:32:09.534325 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.534248 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85fb65bf9-8q75c"
Apr 24 21:32:09.542078 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.542060 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/31e99f0d-fac5-4299-9221-6832b591acbf-console-oauth-config\") pod \"31e99f0d-fac5-4299-9221-6832b591acbf\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") "
Apr 24 21:32:09.542196 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.542177 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-oauth-serving-cert\") pod \"31e99f0d-fac5-4299-9221-6832b591acbf\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") "
Apr 24 21:32:09.542233 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.542208 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-service-ca\") pod \"31e99f0d-fac5-4299-9221-6832b591acbf\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") "
Apr 24 21:32:09.542233 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.542226 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv59s\" (UniqueName: \"kubernetes.io/projected/31e99f0d-fac5-4299-9221-6832b591acbf-kube-api-access-fv59s\") pod \"31e99f0d-fac5-4299-9221-6832b591acbf\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") "
Apr 24 21:32:09.542299 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.542257 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/31e99f0d-fac5-4299-9221-6832b591acbf-console-serving-cert\") pod \"31e99f0d-fac5-4299-9221-6832b591acbf\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") "
Apr 24 21:32:09.542299 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.542291 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-console-config\") pod \"31e99f0d-fac5-4299-9221-6832b591acbf\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") "
Apr 24 21:32:09.542683 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.542647 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-service-ca" (OuterVolumeSpecName: "service-ca") pod "31e99f0d-fac5-4299-9221-6832b591acbf" (UID: "31e99f0d-fac5-4299-9221-6832b591acbf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:32:09.542683 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.542670 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "31e99f0d-fac5-4299-9221-6832b591acbf" (UID: "31e99f0d-fac5-4299-9221-6832b591acbf"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:32:09.542847 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.542705 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-console-config" (OuterVolumeSpecName: "console-config") pod "31e99f0d-fac5-4299-9221-6832b591acbf" (UID: "31e99f0d-fac5-4299-9221-6832b591acbf"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:32:09.544281 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.544259 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31e99f0d-fac5-4299-9221-6832b591acbf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "31e99f0d-fac5-4299-9221-6832b591acbf" (UID: "31e99f0d-fac5-4299-9221-6832b591acbf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:32:09.544368 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.544275 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31e99f0d-fac5-4299-9221-6832b591acbf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "31e99f0d-fac5-4299-9221-6832b591acbf" (UID: "31e99f0d-fac5-4299-9221-6832b591acbf"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:32:09.544368 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.544326 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e99f0d-fac5-4299-9221-6832b591acbf-kube-api-access-fv59s" (OuterVolumeSpecName: "kube-api-access-fv59s") pod "31e99f0d-fac5-4299-9221-6832b591acbf" (UID: "31e99f0d-fac5-4299-9221-6832b591acbf"). InnerVolumeSpecName "kube-api-access-fv59s". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:32:09.643223 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.643132 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-trusted-ca-bundle\") pod \"31e99f0d-fac5-4299-9221-6832b591acbf\" (UID: \"31e99f0d-fac5-4299-9221-6832b591acbf\") "
Apr 24 21:32:09.643371 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.643268 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-oauth-serving-cert\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:32:09.643371 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.643279 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-service-ca\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:32:09.643371 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.643288 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fv59s\" (UniqueName: \"kubernetes.io/projected/31e99f0d-fac5-4299-9221-6832b591acbf-kube-api-access-fv59s\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:32:09.643371 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.643298 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/31e99f0d-fac5-4299-9221-6832b591acbf-console-serving-cert\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:32:09.643371 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.643307 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-console-config\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:32:09.643371 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.643315 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/31e99f0d-fac5-4299-9221-6832b591acbf-console-oauth-config\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:32:09.643556 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.643505 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "31e99f0d-fac5-4299-9221-6832b591acbf" (UID: "31e99f0d-fac5-4299-9221-6832b591acbf"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:32:09.744250 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:09.744215 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31e99f0d-fac5-4299-9221-6832b591acbf-trusted-ca-bundle\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:32:10.290663 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:10.290636 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85fb65bf9-8q75c_31e99f0d-fac5-4299-9221-6832b591acbf/console/0.log"
Apr 24 21:32:10.290828 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:10.290676 2567 generic.go:358] "Generic (PLEG): container finished" podID="31e99f0d-fac5-4299-9221-6832b591acbf" containerID="f498e57a538c85f30b4b5862e52df58876a2c66266d7c638d587f6c151aba26f" exitCode=2
Apr 24 21:32:10.290828 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:10.290711 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85fb65bf9-8q75c" event={"ID":"31e99f0d-fac5-4299-9221-6832b591acbf","Type":"ContainerDied","Data":"f498e57a538c85f30b4b5862e52df58876a2c66266d7c638d587f6c151aba26f"}
Apr 24 21:32:10.290828 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:10.290741 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85fb65bf9-8q75c"
Apr 24 21:32:10.290828 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:10.290753 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85fb65bf9-8q75c" event={"ID":"31e99f0d-fac5-4299-9221-6832b591acbf","Type":"ContainerDied","Data":"c86e3fa9dd188309917cc5e58ba28e62cd39d2b325871c204b6cc72e2fdf7757"}
Apr 24 21:32:10.290828 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:10.290769 2567 scope.go:117] "RemoveContainer" containerID="f498e57a538c85f30b4b5862e52df58876a2c66266d7c638d587f6c151aba26f"
Apr 24 21:32:10.299139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:10.299121 2567 scope.go:117] "RemoveContainer" containerID="f498e57a538c85f30b4b5862e52df58876a2c66266d7c638d587f6c151aba26f"
Apr 24 21:32:10.299358 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:10.299339 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f498e57a538c85f30b4b5862e52df58876a2c66266d7c638d587f6c151aba26f\": container with ID starting with f498e57a538c85f30b4b5862e52df58876a2c66266d7c638d587f6c151aba26f not found: ID does not exist" containerID="f498e57a538c85f30b4b5862e52df58876a2c66266d7c638d587f6c151aba26f"
Apr 24 21:32:10.299406 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:10.299364 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f498e57a538c85f30b4b5862e52df58876a2c66266d7c638d587f6c151aba26f"} err="failed to get container status \"f498e57a538c85f30b4b5862e52df58876a2c66266d7c638d587f6c151aba26f\": rpc error: code = NotFound desc = could not find container \"f498e57a538c85f30b4b5862e52df58876a2c66266d7c638d587f6c151aba26f\": container with ID starting with f498e57a538c85f30b4b5862e52df58876a2c66266d7c638d587f6c151aba26f not found: ID does not exist"
Apr 24 21:32:10.317981 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:10.317955 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85fb65bf9-8q75c"]
Apr 24 21:32:10.323815 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:10.323796 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-85fb65bf9-8q75c"]
Apr 24 21:32:10.416009 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:10.415976 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31e99f0d-fac5-4299-9221-6832b591acbf" path="/var/lib/kubelet/pods/31e99f0d-fac5-4299-9221-6832b591acbf/volumes"
Apr 24 21:32:27.481179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:27.481146 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58"]
Apr 24 21:32:27.481667 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:27.481402 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31e99f0d-fac5-4299-9221-6832b591acbf" containerName="console"
Apr 24 21:32:27.481667 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:27.481411 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e99f0d-fac5-4299-9221-6832b591acbf" containerName="console"
Apr 24 21:32:27.481667 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:27.481454 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="31e99f0d-fac5-4299-9221-6832b591acbf" containerName="console"
Apr 24 21:32:27.485662 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:27.485644 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58"
Apr 24 21:32:27.492261 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:27.492241 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 21:32:27.492367 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:27.492249 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 21:32:27.493101 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:27.493083 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-58qm5\""
Apr 24 21:32:27.504371 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:27.504345 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58"]
Apr 24 21:32:27.564990 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:27.564962 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0367a507-415d-4e6a-9076-bcfef3b49e00-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58\" (UID: \"0367a507-415d-4e6a-9076-bcfef3b49e00\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58"
Apr 24 21:32:27.564990 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:27.564998 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzkmz\" (UniqueName: \"kubernetes.io/projected/0367a507-415d-4e6a-9076-bcfef3b49e00-kube-api-access-hzkmz\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58\" (UID: \"0367a507-415d-4e6a-9076-bcfef3b49e00\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58"
Apr 24 21:32:27.565216 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:27.565099 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0367a507-415d-4e6a-9076-bcfef3b49e00-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58\" (UID: \"0367a507-415d-4e6a-9076-bcfef3b49e00\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58"
Apr 24 21:32:27.666025 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:27.665980 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0367a507-415d-4e6a-9076-bcfef3b49e00-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58\" (UID: \"0367a507-415d-4e6a-9076-bcfef3b49e00\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58"
Apr 24 21:32:27.666210 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:27.666036 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzkmz\" (UniqueName: \"kubernetes.io/projected/0367a507-415d-4e6a-9076-bcfef3b49e00-kube-api-access-hzkmz\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58\" (UID: \"0367a507-415d-4e6a-9076-bcfef3b49e00\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58"
Apr 24 21:32:27.666210 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:27.666077 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0367a507-415d-4e6a-9076-bcfef3b49e00-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58\" (UID: \"0367a507-415d-4e6a-9076-bcfef3b49e00\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58"
Apr 24 21:32:27.666408 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:27.666384 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0367a507-415d-4e6a-9076-bcfef3b49e00-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58\" (UID: \"0367a507-415d-4e6a-9076-bcfef3b49e00\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58"
Apr 24 21:32:27.666473 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:27.666419 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0367a507-415d-4e6a-9076-bcfef3b49e00-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58\" (UID: \"0367a507-415d-4e6a-9076-bcfef3b49e00\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58"
Apr 24 21:32:27.685925 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:27.685900 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzkmz\" (UniqueName: \"kubernetes.io/projected/0367a507-415d-4e6a-9076-bcfef3b49e00-kube-api-access-hzkmz\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58\" (UID: \"0367a507-415d-4e6a-9076-bcfef3b49e00\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58"
Apr 24 21:32:27.794064 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:27.793985 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58"
Apr 24 21:32:27.924611 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:27.924560 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58"]
Apr 24 21:32:27.928017 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:32:27.927989 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0367a507_415d_4e6a_9076_bcfef3b49e00.slice/crio-1db3783ddd49126d34e1b7285432d50a74d3ad0f54efd5057f73f6d433f12f40 WatchSource:0}: Error finding container 1db3783ddd49126d34e1b7285432d50a74d3ad0f54efd5057f73f6d433f12f40: Status 404 returned error can't find the container with id 1db3783ddd49126d34e1b7285432d50a74d3ad0f54efd5057f73f6d433f12f40
Apr 24 21:32:28.339294 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:28.339262 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58" event={"ID":"0367a507-415d-4e6a-9076-bcfef3b49e00","Type":"ContainerStarted","Data":"1db3783ddd49126d34e1b7285432d50a74d3ad0f54efd5057f73f6d433f12f40"}
Apr 24 21:32:33.353740 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:33.353699 2567 generic.go:358] "Generic (PLEG): container finished" podID="0367a507-415d-4e6a-9076-bcfef3b49e00" containerID="1b1ef8c90ef794948b8175dcf31c79d35203dc666e240b95671f4e2180db2f98" exitCode=0
Apr 24 21:32:33.354137 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:33.353748 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58" event={"ID":"0367a507-415d-4e6a-9076-bcfef3b49e00","Type":"ContainerDied","Data":"1b1ef8c90ef794948b8175dcf31c79d35203dc666e240b95671f4e2180db2f98"}
Apr 24 21:32:35.360995 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:35.360960 2567 generic.go:358] "Generic (PLEG): container finished" podID="0367a507-415d-4e6a-9076-bcfef3b49e00" containerID="1d6bbdfb0fc577cde6c3865e11a1438aa81c9282b75cf2162df0c7385ed1ea57" exitCode=0
Apr 24 21:32:35.361488 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:35.361056 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58" event={"ID":"0367a507-415d-4e6a-9076-bcfef3b49e00","Type":"ContainerDied","Data":"1d6bbdfb0fc577cde6c3865e11a1438aa81c9282b75cf2162df0c7385ed1ea57"}
Apr 24 21:32:41.380805 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:41.380769 2567 generic.go:358] "Generic (PLEG): container finished" podID="0367a507-415d-4e6a-9076-bcfef3b49e00" containerID="396cb5d580d93032e94a1b99fabdd0b7a79ba7bfe70ed61e23721ac06b58d6e1" exitCode=0
Apr 24 21:32:41.381202 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:41.380851 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58" event={"ID":"0367a507-415d-4e6a-9076-bcfef3b49e00","Type":"ContainerDied","Data":"396cb5d580d93032e94a1b99fabdd0b7a79ba7bfe70ed61e23721ac06b58d6e1"}
Apr 24 21:32:42.309030 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:42.309003 2567 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 21:32:42.506104 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:42.506081 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58"
Apr 24 21:32:42.588577 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:42.588489 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzkmz\" (UniqueName: \"kubernetes.io/projected/0367a507-415d-4e6a-9076-bcfef3b49e00-kube-api-access-hzkmz\") pod \"0367a507-415d-4e6a-9076-bcfef3b49e00\" (UID: \"0367a507-415d-4e6a-9076-bcfef3b49e00\") "
Apr 24 21:32:42.588577 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:42.588538 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0367a507-415d-4e6a-9076-bcfef3b49e00-util\") pod \"0367a507-415d-4e6a-9076-bcfef3b49e00\" (UID: \"0367a507-415d-4e6a-9076-bcfef3b49e00\") "
Apr 24 21:32:42.588825 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:42.588679 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0367a507-415d-4e6a-9076-bcfef3b49e00-bundle\") pod \"0367a507-415d-4e6a-9076-bcfef3b49e00\" (UID: \"0367a507-415d-4e6a-9076-bcfef3b49e00\") "
Apr 24 21:32:42.589176 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:42.589153 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0367a507-415d-4e6a-9076-bcfef3b49e00-bundle" (OuterVolumeSpecName: "bundle") pod "0367a507-415d-4e6a-9076-bcfef3b49e00" (UID: "0367a507-415d-4e6a-9076-bcfef3b49e00"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:32:42.590777 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:42.590754 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0367a507-415d-4e6a-9076-bcfef3b49e00-kube-api-access-hzkmz" (OuterVolumeSpecName: "kube-api-access-hzkmz") pod "0367a507-415d-4e6a-9076-bcfef3b49e00" (UID: "0367a507-415d-4e6a-9076-bcfef3b49e00"). InnerVolumeSpecName "kube-api-access-hzkmz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:32:42.592870 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:42.592848 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0367a507-415d-4e6a-9076-bcfef3b49e00-util" (OuterVolumeSpecName: "util") pod "0367a507-415d-4e6a-9076-bcfef3b49e00" (UID: "0367a507-415d-4e6a-9076-bcfef3b49e00"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:32:42.689209 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:42.689163 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0367a507-415d-4e6a-9076-bcfef3b49e00-bundle\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:32:42.689209 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:42.689205 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hzkmz\" (UniqueName: \"kubernetes.io/projected/0367a507-415d-4e6a-9076-bcfef3b49e00-kube-api-access-hzkmz\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:32:42.689209 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:42.689216 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0367a507-415d-4e6a-9076-bcfef3b49e00-util\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:32:43.387086 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:43.387046 2567 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58" event={"ID":"0367a507-415d-4e6a-9076-bcfef3b49e00","Type":"ContainerDied","Data":"1db3783ddd49126d34e1b7285432d50a74d3ad0f54efd5057f73f6d433f12f40"} Apr 24 21:32:43.387086 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:43.387091 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1db3783ddd49126d34e1b7285432d50a74d3ad0f54efd5057f73f6d433f12f40" Apr 24 21:32:43.387292 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:43.387064 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvcb58" Apr 24 21:32:49.604011 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.603980 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b"] Apr 24 21:32:49.604380 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.604232 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0367a507-415d-4e6a-9076-bcfef3b49e00" containerName="util" Apr 24 21:32:49.604380 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.604243 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0367a507-415d-4e6a-9076-bcfef3b49e00" containerName="util" Apr 24 21:32:49.604380 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.604262 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0367a507-415d-4e6a-9076-bcfef3b49e00" containerName="pull" Apr 24 21:32:49.604380 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.604267 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0367a507-415d-4e6a-9076-bcfef3b49e00" containerName="pull" Apr 24 21:32:49.604380 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.604274 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="0367a507-415d-4e6a-9076-bcfef3b49e00" containerName="extract" Apr 24 21:32:49.604380 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.604279 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0367a507-415d-4e6a-9076-bcfef3b49e00" containerName="extract" Apr 24 21:32:49.604380 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.604322 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="0367a507-415d-4e6a-9076-bcfef3b49e00" containerName="extract" Apr 24 21:32:49.656828 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.656785 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b"] Apr 24 21:32:49.656986 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.656911 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b" Apr 24 21:32:49.659272 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.659244 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 21:32:49.659423 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.659320 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 21:32:49.659423 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.659349 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-qdr55\"" Apr 24 21:32:49.659423 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.659362 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 21:32:49.742030 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.741988 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/8bcf0dd6-bb30-4b58-b765-9771a4489570-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b\" (UID: \"8bcf0dd6-bb30-4b58-b765-9771a4489570\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b" Apr 24 21:32:49.742217 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.742057 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67v8q\" (UniqueName: \"kubernetes.io/projected/8bcf0dd6-bb30-4b58-b765-9771a4489570-kube-api-access-67v8q\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b\" (UID: \"8bcf0dd6-bb30-4b58-b765-9771a4489570\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b" Apr 24 21:32:49.843432 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.843397 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/8bcf0dd6-bb30-4b58-b765-9771a4489570-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b\" (UID: \"8bcf0dd6-bb30-4b58-b765-9771a4489570\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b" Apr 24 21:32:49.843618 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.843567 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67v8q\" (UniqueName: \"kubernetes.io/projected/8bcf0dd6-bb30-4b58-b765-9771a4489570-kube-api-access-67v8q\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b\" (UID: \"8bcf0dd6-bb30-4b58-b765-9771a4489570\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b" Apr 24 21:32:49.845736 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.845717 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/8bcf0dd6-bb30-4b58-b765-9771a4489570-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b\" (UID: 
\"8bcf0dd6-bb30-4b58-b765-9771a4489570\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b" Apr 24 21:32:49.852452 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.852432 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67v8q\" (UniqueName: \"kubernetes.io/projected/8bcf0dd6-bb30-4b58-b765-9771a4489570-kube-api-access-67v8q\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b\" (UID: \"8bcf0dd6-bb30-4b58-b765-9771a4489570\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b" Apr 24 21:32:49.971591 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:49.971556 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b" Apr 24 21:32:50.090950 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:50.090894 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b"] Apr 24 21:32:50.097120 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:32:50.097090 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bcf0dd6_bb30_4b58_b765_9771a4489570.slice/crio-60b4003c7a53dc95ebfdee477186efaea2d515e6082851944d1a050c0147ac1c WatchSource:0}: Error finding container 60b4003c7a53dc95ebfdee477186efaea2d515e6082851944d1a050c0147ac1c: Status 404 returned error can't find the container with id 60b4003c7a53dc95ebfdee477186efaea2d515e6082851944d1a050c0147ac1c Apr 24 21:32:50.098883 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:50.098862 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:32:50.406030 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:50.405996 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b" 
event={"ID":"8bcf0dd6-bb30-4b58-b765-9771a4489570","Type":"ContainerStarted","Data":"60b4003c7a53dc95ebfdee477186efaea2d515e6082851944d1a050c0147ac1c"} Apr 24 21:32:53.416275 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:53.416237 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b" event={"ID":"8bcf0dd6-bb30-4b58-b765-9771a4489570","Type":"ContainerStarted","Data":"e190835ff26b0869970dfe2d6100b6a1d37b22b767522a5f64eac5a7a522a369"} Apr 24 21:32:53.416725 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:53.416370 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b" Apr 24 21:32:53.439054 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:53.439002 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b" podStartSLOduration=1.2801705829999999 podStartE2EDuration="4.438988927s" podCreationTimestamp="2026-04-24 21:32:49 +0000 UTC" firstStartedPulling="2026-04-24 21:32:50.099012251 +0000 UTC m=+308.197424745" lastFinishedPulling="2026-04-24 21:32:53.257830582 +0000 UTC m=+311.356243089" observedRunningTime="2026-04-24 21:32:53.437067931 +0000 UTC m=+311.535480454" watchObservedRunningTime="2026-04-24 21:32:53.438988927 +0000 UTC m=+311.537401439" Apr 24 21:32:53.858203 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:53.858162 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-p8cdh"] Apr 24 21:32:53.861775 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:53.861753 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-p8cdh" Apr 24 21:32:53.864537 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:53.864514 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 21:32:53.864537 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:53.864520 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-zht5c\"" Apr 24 21:32:53.864716 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:53.864516 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 24 21:32:53.871237 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:53.871216 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-p8cdh"] Apr 24 21:32:53.973048 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:53.973019 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-certificates\") pod \"keda-operator-ffbb595cb-p8cdh\" (UID: \"30213c3b-8f1b-4f0a-98d9-6cf15887e70d\") " pod="openshift-keda/keda-operator-ffbb595cb-p8cdh" Apr 24 21:32:53.973207 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:53.973062 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-cabundle0\") pod \"keda-operator-ffbb595cb-p8cdh\" (UID: \"30213c3b-8f1b-4f0a-98d9-6cf15887e70d\") " pod="openshift-keda/keda-operator-ffbb595cb-p8cdh" Apr 24 21:32:53.973207 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:53.973132 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kws4g\" (UniqueName: 
\"kubernetes.io/projected/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-kube-api-access-kws4g\") pod \"keda-operator-ffbb595cb-p8cdh\" (UID: \"30213c3b-8f1b-4f0a-98d9-6cf15887e70d\") " pod="openshift-keda/keda-operator-ffbb595cb-p8cdh" Apr 24 21:32:54.074417 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.074383 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kws4g\" (UniqueName: \"kubernetes.io/projected/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-kube-api-access-kws4g\") pod \"keda-operator-ffbb595cb-p8cdh\" (UID: \"30213c3b-8f1b-4f0a-98d9-6cf15887e70d\") " pod="openshift-keda/keda-operator-ffbb595cb-p8cdh" Apr 24 21:32:54.074574 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.074436 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-certificates\") pod \"keda-operator-ffbb595cb-p8cdh\" (UID: \"30213c3b-8f1b-4f0a-98d9-6cf15887e70d\") " pod="openshift-keda/keda-operator-ffbb595cb-p8cdh" Apr 24 21:32:54.074574 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.074480 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-cabundle0\") pod \"keda-operator-ffbb595cb-p8cdh\" (UID: \"30213c3b-8f1b-4f0a-98d9-6cf15887e70d\") " pod="openshift-keda/keda-operator-ffbb595cb-p8cdh" Apr 24 21:32:54.074574 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:54.074555 2567 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:32:54.074700 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:54.074575 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:32:54.074700 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:54.074605 2567 projected.go:194] Error preparing data for projected volume certificates for pod 
openshift-keda/keda-operator-ffbb595cb-p8cdh: references non-existent secret key: ca.crt Apr 24 21:32:54.074700 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:54.074665 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-certificates podName:30213c3b-8f1b-4f0a-98d9-6cf15887e70d nodeName:}" failed. No retries permitted until 2026-04-24 21:32:54.574647524 +0000 UTC m=+312.673060017 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-certificates") pod "keda-operator-ffbb595cb-p8cdh" (UID: "30213c3b-8f1b-4f0a-98d9-6cf15887e70d") : references non-existent secret key: ca.crt Apr 24 21:32:54.075246 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.075227 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-cabundle0\") pod \"keda-operator-ffbb595cb-p8cdh\" (UID: \"30213c3b-8f1b-4f0a-98d9-6cf15887e70d\") " pod="openshift-keda/keda-operator-ffbb595cb-p8cdh" Apr 24 21:32:54.081811 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.081788 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl"] Apr 24 21:32:54.084890 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.084873 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl" Apr 24 21:32:54.086316 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.086295 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kws4g\" (UniqueName: \"kubernetes.io/projected/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-kube-api-access-kws4g\") pod \"keda-operator-ffbb595cb-p8cdh\" (UID: \"30213c3b-8f1b-4f0a-98d9-6cf15887e70d\") " pod="openshift-keda/keda-operator-ffbb595cb-p8cdh" Apr 24 21:32:54.087182 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.087159 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 24 21:32:54.093155 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.093127 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl"] Apr 24 21:32:54.175241 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.175206 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/57fac6da-40a1-403b-85dc-2a59c73b86a9-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-lnrsl\" (UID: \"57fac6da-40a1-403b-85dc-2a59c73b86a9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl" Apr 24 21:32:54.175432 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.175271 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvtmq\" (UniqueName: \"kubernetes.io/projected/57fac6da-40a1-403b-85dc-2a59c73b86a9-kube-api-access-gvtmq\") pod \"keda-metrics-apiserver-7c9f485588-lnrsl\" (UID: \"57fac6da-40a1-403b-85dc-2a59c73b86a9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl" Apr 24 21:32:54.175432 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.175343 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/57fac6da-40a1-403b-85dc-2a59c73b86a9-certificates\") pod \"keda-metrics-apiserver-7c9f485588-lnrsl\" (UID: \"57fac6da-40a1-403b-85dc-2a59c73b86a9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl" Apr 24 21:32:54.275952 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.275919 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/57fac6da-40a1-403b-85dc-2a59c73b86a9-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-lnrsl\" (UID: \"57fac6da-40a1-403b-85dc-2a59c73b86a9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl" Apr 24 21:32:54.276127 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.275979 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvtmq\" (UniqueName: \"kubernetes.io/projected/57fac6da-40a1-403b-85dc-2a59c73b86a9-kube-api-access-gvtmq\") pod \"keda-metrics-apiserver-7c9f485588-lnrsl\" (UID: \"57fac6da-40a1-403b-85dc-2a59c73b86a9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl" Apr 24 21:32:54.276127 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.276002 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/57fac6da-40a1-403b-85dc-2a59c73b86a9-certificates\") pod \"keda-metrics-apiserver-7c9f485588-lnrsl\" (UID: \"57fac6da-40a1-403b-85dc-2a59c73b86a9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl" Apr 24 21:32:54.276127 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:54.276090 2567 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:32:54.276127 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:54.276101 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:32:54.276127 ip-10-0-142-242 kubenswrapper[2567]: 
E0424 21:32:54.276117 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl: references non-existent secret key: tls.crt Apr 24 21:32:54.276309 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:54.276163 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/57fac6da-40a1-403b-85dc-2a59c73b86a9-certificates podName:57fac6da-40a1-403b-85dc-2a59c73b86a9 nodeName:}" failed. No retries permitted until 2026-04-24 21:32:54.776149831 +0000 UTC m=+312.874562322 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/57fac6da-40a1-403b-85dc-2a59c73b86a9-certificates") pod "keda-metrics-apiserver-7c9f485588-lnrsl" (UID: "57fac6da-40a1-403b-85dc-2a59c73b86a9") : references non-existent secret key: tls.crt Apr 24 21:32:54.276309 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.276291 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/57fac6da-40a1-403b-85dc-2a59c73b86a9-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-lnrsl\" (UID: \"57fac6da-40a1-403b-85dc-2a59c73b86a9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl" Apr 24 21:32:54.286118 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.286092 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvtmq\" (UniqueName: \"kubernetes.io/projected/57fac6da-40a1-403b-85dc-2a59c73b86a9-kube-api-access-gvtmq\") pod \"keda-metrics-apiserver-7c9f485588-lnrsl\" (UID: \"57fac6da-40a1-403b-85dc-2a59c73b86a9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl" Apr 24 21:32:54.395679 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.395590 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-t7n7v"] Apr 24 21:32:54.398902 ip-10-0-142-242 kubenswrapper[2567]: 
I0424 21:32:54.398879 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-t7n7v" Apr 24 21:32:54.402728 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.402703 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 24 21:32:54.412269 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.412244 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-t7n7v"] Apr 24 21:32:54.477456 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.477421 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6ca28e08-42c1-44b3-8fe8-030c8238b887-certificates\") pod \"keda-admission-cf49989db-t7n7v\" (UID: \"6ca28e08-42c1-44b3-8fe8-030c8238b887\") " pod="openshift-keda/keda-admission-cf49989db-t7n7v" Apr 24 21:32:54.477828 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.477489 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkbzv\" (UniqueName: \"kubernetes.io/projected/6ca28e08-42c1-44b3-8fe8-030c8238b887-kube-api-access-vkbzv\") pod \"keda-admission-cf49989db-t7n7v\" (UID: \"6ca28e08-42c1-44b3-8fe8-030c8238b887\") " pod="openshift-keda/keda-admission-cf49989db-t7n7v" Apr 24 21:32:54.578124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.578085 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkbzv\" (UniqueName: \"kubernetes.io/projected/6ca28e08-42c1-44b3-8fe8-030c8238b887-kube-api-access-vkbzv\") pod \"keda-admission-cf49989db-t7n7v\" (UID: \"6ca28e08-42c1-44b3-8fe8-030c8238b887\") " pod="openshift-keda/keda-admission-cf49989db-t7n7v" Apr 24 21:32:54.578304 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.578145 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-certificates\") pod \"keda-operator-ffbb595cb-p8cdh\" (UID: \"30213c3b-8f1b-4f0a-98d9-6cf15887e70d\") " pod="openshift-keda/keda-operator-ffbb595cb-p8cdh" Apr 24 21:32:54.578304 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.578169 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6ca28e08-42c1-44b3-8fe8-030c8238b887-certificates\") pod \"keda-admission-cf49989db-t7n7v\" (UID: \"6ca28e08-42c1-44b3-8fe8-030c8238b887\") " pod="openshift-keda/keda-admission-cf49989db-t7n7v" Apr 24 21:32:54.578304 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:54.578287 2567 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:32:54.578304 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:54.578306 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:32:54.578457 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:54.578318 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-p8cdh: references non-existent secret key: ca.crt Apr 24 21:32:54.578457 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:54.578374 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-certificates podName:30213c3b-8f1b-4f0a-98d9-6cf15887e70d nodeName:}" failed. No retries permitted until 2026-04-24 21:32:55.578357229 +0000 UTC m=+313.676769735 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-certificates") pod "keda-operator-ffbb595cb-p8cdh" (UID: "30213c3b-8f1b-4f0a-98d9-6cf15887e70d") : references non-existent secret key: ca.crt Apr 24 21:32:54.580602 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.580567 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6ca28e08-42c1-44b3-8fe8-030c8238b887-certificates\") pod \"keda-admission-cf49989db-t7n7v\" (UID: \"6ca28e08-42c1-44b3-8fe8-030c8238b887\") " pod="openshift-keda/keda-admission-cf49989db-t7n7v" Apr 24 21:32:54.591321 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.591268 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkbzv\" (UniqueName: \"kubernetes.io/projected/6ca28e08-42c1-44b3-8fe8-030c8238b887-kube-api-access-vkbzv\") pod \"keda-admission-cf49989db-t7n7v\" (UID: \"6ca28e08-42c1-44b3-8fe8-030c8238b887\") " pod="openshift-keda/keda-admission-cf49989db-t7n7v" Apr 24 21:32:54.710952 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.710912 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-t7n7v" Apr 24 21:32:54.781446 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.779813 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/57fac6da-40a1-403b-85dc-2a59c73b86a9-certificates\") pod \"keda-metrics-apiserver-7c9f485588-lnrsl\" (UID: \"57fac6da-40a1-403b-85dc-2a59c73b86a9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl" Apr 24 21:32:54.781446 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:54.779981 2567 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:32:54.781446 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:54.779998 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:32:54.781446 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:54.780019 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl: references non-existent secret key: tls.crt Apr 24 21:32:54.781446 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:54.780084 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/57fac6da-40a1-403b-85dc-2a59c73b86a9-certificates podName:57fac6da-40a1-403b-85dc-2a59c73b86a9 nodeName:}" failed. No retries permitted until 2026-04-24 21:32:55.780066781 +0000 UTC m=+313.878479290 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/57fac6da-40a1-403b-85dc-2a59c73b86a9-certificates") pod "keda-metrics-apiserver-7c9f485588-lnrsl" (UID: "57fac6da-40a1-403b-85dc-2a59c73b86a9") : references non-existent secret key: tls.crt Apr 24 21:32:54.867979 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:54.867950 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-t7n7v"] Apr 24 21:32:54.870791 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:32:54.870758 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ca28e08_42c1_44b3_8fe8_030c8238b887.slice/crio-ff344ecdb876edb88c118ae7ec634a30a34b15bb9d8018a5ad101ba3b6c4186e WatchSource:0}: Error finding container ff344ecdb876edb88c118ae7ec634a30a34b15bb9d8018a5ad101ba3b6c4186e: Status 404 returned error can't find the container with id ff344ecdb876edb88c118ae7ec634a30a34b15bb9d8018a5ad101ba3b6c4186e Apr 24 21:32:55.424638 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:55.424606 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-t7n7v" event={"ID":"6ca28e08-42c1-44b3-8fe8-030c8238b887","Type":"ContainerStarted","Data":"ff344ecdb876edb88c118ae7ec634a30a34b15bb9d8018a5ad101ba3b6c4186e"} Apr 24 21:32:55.586893 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:55.586851 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-certificates\") pod \"keda-operator-ffbb595cb-p8cdh\" (UID: \"30213c3b-8f1b-4f0a-98d9-6cf15887e70d\") " pod="openshift-keda/keda-operator-ffbb595cb-p8cdh" Apr 24 21:32:55.587325 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:55.586975 2567 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:32:55.587325 ip-10-0-142-242 kubenswrapper[2567]: E0424 
21:32:55.586996 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:32:55.587325 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:55.587010 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-p8cdh: references non-existent secret key: ca.crt Apr 24 21:32:55.587325 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:55.587072 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-certificates podName:30213c3b-8f1b-4f0a-98d9-6cf15887e70d nodeName:}" failed. No retries permitted until 2026-04-24 21:32:57.587053678 +0000 UTC m=+315.685466186 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-certificates") pod "keda-operator-ffbb595cb-p8cdh" (UID: "30213c3b-8f1b-4f0a-98d9-6cf15887e70d") : references non-existent secret key: ca.crt Apr 24 21:32:55.788566 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:55.788483 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/57fac6da-40a1-403b-85dc-2a59c73b86a9-certificates\") pod \"keda-metrics-apiserver-7c9f485588-lnrsl\" (UID: \"57fac6da-40a1-403b-85dc-2a59c73b86a9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl" Apr 24 21:32:55.788735 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:55.788652 2567 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:32:55.788735 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:55.788673 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:32:55.788735 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:55.788698 2567 projected.go:194] Error preparing data for projected volume 
certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl: references non-existent secret key: tls.crt Apr 24 21:32:55.788877 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:55.788759 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/57fac6da-40a1-403b-85dc-2a59c73b86a9-certificates podName:57fac6da-40a1-403b-85dc-2a59c73b86a9 nodeName:}" failed. No retries permitted until 2026-04-24 21:32:57.788743425 +0000 UTC m=+315.887155916 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/57fac6da-40a1-403b-85dc-2a59c73b86a9-certificates") pod "keda-metrics-apiserver-7c9f485588-lnrsl" (UID: "57fac6da-40a1-403b-85dc-2a59c73b86a9") : references non-existent secret key: tls.crt Apr 24 21:32:56.428543 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:56.428509 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-t7n7v" event={"ID":"6ca28e08-42c1-44b3-8fe8-030c8238b887","Type":"ContainerStarted","Data":"e6c40b7ff4f240fe48b6fc2671086977122383b9ef5f6f0c575dc26cc52d2584"} Apr 24 21:32:56.428754 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:56.428632 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-t7n7v" Apr 24 21:32:57.604997 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:57.604966 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-certificates\") pod \"keda-operator-ffbb595cb-p8cdh\" (UID: \"30213c3b-8f1b-4f0a-98d9-6cf15887e70d\") " pod="openshift-keda/keda-operator-ffbb595cb-p8cdh" Apr 24 21:32:57.605423 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:57.605112 2567 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:32:57.605423 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:57.605129 2567 
projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:32:57.605423 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:57.605138 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-p8cdh: references non-existent secret key: ca.crt Apr 24 21:32:57.605423 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:57.605191 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-certificates podName:30213c3b-8f1b-4f0a-98d9-6cf15887e70d nodeName:}" failed. No retries permitted until 2026-04-24 21:33:01.605177419 +0000 UTC m=+319.703589910 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-certificates") pod "keda-operator-ffbb595cb-p8cdh" (UID: "30213c3b-8f1b-4f0a-98d9-6cf15887e70d") : references non-existent secret key: ca.crt Apr 24 21:32:57.806426 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:32:57.806389 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/57fac6da-40a1-403b-85dc-2a59c73b86a9-certificates\") pod \"keda-metrics-apiserver-7c9f485588-lnrsl\" (UID: \"57fac6da-40a1-403b-85dc-2a59c73b86a9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl" Apr 24 21:32:57.806623 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:57.806520 2567 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:32:57.806623 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:57.806537 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:32:57.806623 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:57.806554 2567 projected.go:194] Error preparing data for projected volume certificates for pod 
openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl: references non-existent secret key: tls.crt Apr 24 21:32:57.806784 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:32:57.806625 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/57fac6da-40a1-403b-85dc-2a59c73b86a9-certificates podName:57fac6da-40a1-403b-85dc-2a59c73b86a9 nodeName:}" failed. No retries permitted until 2026-04-24 21:33:01.806603383 +0000 UTC m=+319.905015874 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/57fac6da-40a1-403b-85dc-2a59c73b86a9-certificates") pod "keda-metrics-apiserver-7c9f485588-lnrsl" (UID: "57fac6da-40a1-403b-85dc-2a59c73b86a9") : references non-existent secret key: tls.crt Apr 24 21:33:01.639455 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:01.639416 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-certificates\") pod \"keda-operator-ffbb595cb-p8cdh\" (UID: \"30213c3b-8f1b-4f0a-98d9-6cf15887e70d\") " pod="openshift-keda/keda-operator-ffbb595cb-p8cdh" Apr 24 21:33:01.642000 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:01.641971 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30213c3b-8f1b-4f0a-98d9-6cf15887e70d-certificates\") pod \"keda-operator-ffbb595cb-p8cdh\" (UID: \"30213c3b-8f1b-4f0a-98d9-6cf15887e70d\") " pod="openshift-keda/keda-operator-ffbb595cb-p8cdh" Apr 24 21:33:01.672298 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:01.672268 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-p8cdh" Apr 24 21:33:01.791358 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:01.791322 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-t7n7v" podStartSLOduration=6.416722731 podStartE2EDuration="7.791307225s" podCreationTimestamp="2026-04-24 21:32:54 +0000 UTC" firstStartedPulling="2026-04-24 21:32:54.874060044 +0000 UTC m=+312.972472540" lastFinishedPulling="2026-04-24 21:32:56.248644538 +0000 UTC m=+314.347057034" observedRunningTime="2026-04-24 21:32:56.450462531 +0000 UTC m=+314.548875043" watchObservedRunningTime="2026-04-24 21:33:01.791307225 +0000 UTC m=+319.889719738" Apr 24 21:33:01.791641 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:01.791628 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-p8cdh"] Apr 24 21:33:01.793494 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:33:01.793463 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30213c3b_8f1b_4f0a_98d9_6cf15887e70d.slice/crio-26ea36ece9ecc3db525e47643ee1a92cab89978008ec0d90312e9355463d7461 WatchSource:0}: Error finding container 26ea36ece9ecc3db525e47643ee1a92cab89978008ec0d90312e9355463d7461: Status 404 returned error can't find the container with id 26ea36ece9ecc3db525e47643ee1a92cab89978008ec0d90312e9355463d7461 Apr 24 21:33:01.841661 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:01.841622 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/57fac6da-40a1-403b-85dc-2a59c73b86a9-certificates\") pod \"keda-metrics-apiserver-7c9f485588-lnrsl\" (UID: \"57fac6da-40a1-403b-85dc-2a59c73b86a9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl" Apr 24 21:33:01.843998 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:01.843971 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/57fac6da-40a1-403b-85dc-2a59c73b86a9-certificates\") pod \"keda-metrics-apiserver-7c9f485588-lnrsl\" (UID: \"57fac6da-40a1-403b-85dc-2a59c73b86a9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl" Apr 24 21:33:01.913751 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:01.913730 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl" Apr 24 21:33:02.028554 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:02.028452 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl"] Apr 24 21:33:02.031421 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:33:02.031393 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57fac6da_40a1_403b_85dc_2a59c73b86a9.slice/crio-f8637fbf4630e6a628ce710309f0bee53b03307c3abc663f2e39b80eb58281d2 WatchSource:0}: Error finding container f8637fbf4630e6a628ce710309f0bee53b03307c3abc663f2e39b80eb58281d2: Status 404 returned error can't find the container with id f8637fbf4630e6a628ce710309f0bee53b03307c3abc663f2e39b80eb58281d2 Apr 24 21:33:02.449295 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:02.449260 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-p8cdh" event={"ID":"30213c3b-8f1b-4f0a-98d9-6cf15887e70d","Type":"ContainerStarted","Data":"26ea36ece9ecc3db525e47643ee1a92cab89978008ec0d90312e9355463d7461"} Apr 24 21:33:02.450529 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:02.450428 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl" 
event={"ID":"57fac6da-40a1-403b-85dc-2a59c73b86a9","Type":"ContainerStarted","Data":"f8637fbf4630e6a628ce710309f0bee53b03307c3abc663f2e39b80eb58281d2"} Apr 24 21:33:06.464537 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:06.464502 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-p8cdh" event={"ID":"30213c3b-8f1b-4f0a-98d9-6cf15887e70d","Type":"ContainerStarted","Data":"2c73f9df76e5ad653ed1e027890e9ee17c2d32e00f6919c9c5371f5ea1b9e132"} Apr 24 21:33:06.465053 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:06.464633 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-p8cdh" Apr 24 21:33:06.465915 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:06.465893 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl" event={"ID":"57fac6da-40a1-403b-85dc-2a59c73b86a9","Type":"ContainerStarted","Data":"cab25f72c02c6ef0eb35a92b8cb5182e5a2dbc891a14328c8579c572399b9368"} Apr 24 21:33:06.466037 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:06.466010 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl" Apr 24 21:33:06.485145 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:06.485098 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-p8cdh" podStartSLOduration=9.863793608 podStartE2EDuration="13.485086729s" podCreationTimestamp="2026-04-24 21:32:53 +0000 UTC" firstStartedPulling="2026-04-24 21:33:01.794725272 +0000 UTC m=+319.893137763" lastFinishedPulling="2026-04-24 21:33:05.416018379 +0000 UTC m=+323.514430884" observedRunningTime="2026-04-24 21:33:06.483645392 +0000 UTC m=+324.582057927" watchObservedRunningTime="2026-04-24 21:33:06.485086729 +0000 UTC m=+324.583499241" Apr 24 21:33:06.504839 ip-10-0-142-242 kubenswrapper[2567]: I0424 
21:33:06.504793 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl" podStartSLOduration=9.125588806 podStartE2EDuration="12.504781249s" podCreationTimestamp="2026-04-24 21:32:54 +0000 UTC" firstStartedPulling="2026-04-24 21:33:02.032717429 +0000 UTC m=+320.131129920" lastFinishedPulling="2026-04-24 21:33:05.411909867 +0000 UTC m=+323.510322363" observedRunningTime="2026-04-24 21:33:06.503520434 +0000 UTC m=+324.601932944" watchObservedRunningTime="2026-04-24 21:33:06.504781249 +0000 UTC m=+324.603193761" Apr 24 21:33:14.422319 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:14.422292 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jbf4b" Apr 24 21:33:17.437302 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:17.437272 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-t7n7v" Apr 24 21:33:17.473387 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:17.473365 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lnrsl" Apr 24 21:33:27.471265 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:27.471235 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-p8cdh" Apr 24 21:33:46.224946 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:46.224907 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5"] Apr 24 21:33:46.235254 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:46.235228 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5" Apr 24 21:33:46.236007 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:46.235979 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5"] Apr 24 21:33:46.237731 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:46.237713 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-58qm5\"" Apr 24 21:33:46.237840 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:46.237788 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 21:33:46.238807 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:46.238783 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 21:33:46.270736 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:46.270712 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/223ab951-1e05-44a2-bfb2-1686dd0b5f06-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5\" (UID: \"223ab951-1e05-44a2-bfb2-1686dd0b5f06\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5" Apr 24 21:33:46.270831 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:46.270749 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w66xk\" (UniqueName: \"kubernetes.io/projected/223ab951-1e05-44a2-bfb2-1686dd0b5f06-kube-api-access-w66xk\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5\" (UID: \"223ab951-1e05-44a2-bfb2-1686dd0b5f06\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5" Apr 24 21:33:46.270831 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:46.270774 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/223ab951-1e05-44a2-bfb2-1686dd0b5f06-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5\" (UID: \"223ab951-1e05-44a2-bfb2-1686dd0b5f06\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5" Apr 24 21:33:46.371176 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:46.371146 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/223ab951-1e05-44a2-bfb2-1686dd0b5f06-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5\" (UID: \"223ab951-1e05-44a2-bfb2-1686dd0b5f06\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5" Apr 24 21:33:46.371298 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:46.371188 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w66xk\" (UniqueName: \"kubernetes.io/projected/223ab951-1e05-44a2-bfb2-1686dd0b5f06-kube-api-access-w66xk\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5\" (UID: \"223ab951-1e05-44a2-bfb2-1686dd0b5f06\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5" Apr 24 21:33:46.371298 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:46.371214 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/223ab951-1e05-44a2-bfb2-1686dd0b5f06-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5\" (UID: \"223ab951-1e05-44a2-bfb2-1686dd0b5f06\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5" Apr 24 21:33:46.371553 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:46.371538 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/223ab951-1e05-44a2-bfb2-1686dd0b5f06-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5\" (UID: \"223ab951-1e05-44a2-bfb2-1686dd0b5f06\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5" Apr 24 21:33:46.371605 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:46.371536 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/223ab951-1e05-44a2-bfb2-1686dd0b5f06-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5\" (UID: \"223ab951-1e05-44a2-bfb2-1686dd0b5f06\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5" Apr 24 21:33:46.381393 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:46.381361 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w66xk\" (UniqueName: \"kubernetes.io/projected/223ab951-1e05-44a2-bfb2-1686dd0b5f06-kube-api-access-w66xk\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5\" (UID: \"223ab951-1e05-44a2-bfb2-1686dd0b5f06\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5" Apr 24 21:33:46.544540 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:46.544459 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5" Apr 24 21:33:46.663648 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:46.663620 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5"] Apr 24 21:33:46.666898 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:33:46.666873 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod223ab951_1e05_44a2_bfb2_1686dd0b5f06.slice/crio-bc3a2d9e979e6e8b002be726d9dfb0f2da8965100c6ffa9b35ab95310cd0087a WatchSource:0}: Error finding container bc3a2d9e979e6e8b002be726d9dfb0f2da8965100c6ffa9b35ab95310cd0087a: Status 404 returned error can't find the container with id bc3a2d9e979e6e8b002be726d9dfb0f2da8965100c6ffa9b35ab95310cd0087a Apr 24 21:33:47.586387 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:47.586353 2567 generic.go:358] "Generic (PLEG): container finished" podID="223ab951-1e05-44a2-bfb2-1686dd0b5f06" containerID="bd275095de3f5bf3f08b28c8c24189a20f4e609bfd365394069edbc6e395f6c5" exitCode=0 Apr 24 21:33:47.586766 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:47.586427 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5" event={"ID":"223ab951-1e05-44a2-bfb2-1686dd0b5f06","Type":"ContainerDied","Data":"bd275095de3f5bf3f08b28c8c24189a20f4e609bfd365394069edbc6e395f6c5"} Apr 24 21:33:47.586766 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:47.586465 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5" event={"ID":"223ab951-1e05-44a2-bfb2-1686dd0b5f06","Type":"ContainerStarted","Data":"bc3a2d9e979e6e8b002be726d9dfb0f2da8965100c6ffa9b35ab95310cd0087a"} Apr 24 21:33:49.594685 ip-10-0-142-242 kubenswrapper[2567]: 
I0424 21:33:49.594658 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5" event={"ID":"223ab951-1e05-44a2-bfb2-1686dd0b5f06","Type":"ContainerStarted","Data":"b1442e8473b16930f861f087a683a4ee202196c418cd74dcf5ccb6d2ec187e0f"} Apr 24 21:33:50.598517 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:50.598480 2567 generic.go:358] "Generic (PLEG): container finished" podID="223ab951-1e05-44a2-bfb2-1686dd0b5f06" containerID="b1442e8473b16930f861f087a683a4ee202196c418cd74dcf5ccb6d2ec187e0f" exitCode=0 Apr 24 21:33:50.598919 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:50.598562 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5" event={"ID":"223ab951-1e05-44a2-bfb2-1686dd0b5f06","Type":"ContainerDied","Data":"b1442e8473b16930f861f087a683a4ee202196c418cd74dcf5ccb6d2ec187e0f"} Apr 24 21:33:51.603398 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:51.603364 2567 generic.go:358] "Generic (PLEG): container finished" podID="223ab951-1e05-44a2-bfb2-1686dd0b5f06" containerID="482cd8414f83b6e5cd28ae51648c2f5e0c911495a3dc24e8cc9d9a032047c9ef" exitCode=0 Apr 24 21:33:51.603799 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:51.603430 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5" event={"ID":"223ab951-1e05-44a2-bfb2-1686dd0b5f06","Type":"ContainerDied","Data":"482cd8414f83b6e5cd28ae51648c2f5e0c911495a3dc24e8cc9d9a032047c9ef"} Apr 24 21:33:52.719353 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:52.719330 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5" Apr 24 21:33:52.820835 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:52.820809 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w66xk\" (UniqueName: \"kubernetes.io/projected/223ab951-1e05-44a2-bfb2-1686dd0b5f06-kube-api-access-w66xk\") pod \"223ab951-1e05-44a2-bfb2-1686dd0b5f06\" (UID: \"223ab951-1e05-44a2-bfb2-1686dd0b5f06\") " Apr 24 21:33:52.821005 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:52.820868 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/223ab951-1e05-44a2-bfb2-1686dd0b5f06-bundle\") pod \"223ab951-1e05-44a2-bfb2-1686dd0b5f06\" (UID: \"223ab951-1e05-44a2-bfb2-1686dd0b5f06\") " Apr 24 21:33:52.821005 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:52.820930 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/223ab951-1e05-44a2-bfb2-1686dd0b5f06-util\") pod \"223ab951-1e05-44a2-bfb2-1686dd0b5f06\" (UID: \"223ab951-1e05-44a2-bfb2-1686dd0b5f06\") " Apr 24 21:33:52.821596 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:52.821555 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/223ab951-1e05-44a2-bfb2-1686dd0b5f06-bundle" (OuterVolumeSpecName: "bundle") pod "223ab951-1e05-44a2-bfb2-1686dd0b5f06" (UID: "223ab951-1e05-44a2-bfb2-1686dd0b5f06"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:52.822933 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:52.822904 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/223ab951-1e05-44a2-bfb2-1686dd0b5f06-kube-api-access-w66xk" (OuterVolumeSpecName: "kube-api-access-w66xk") pod "223ab951-1e05-44a2-bfb2-1686dd0b5f06" (UID: "223ab951-1e05-44a2-bfb2-1686dd0b5f06"). InnerVolumeSpecName "kube-api-access-w66xk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:33:52.828201 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:52.828158 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/223ab951-1e05-44a2-bfb2-1686dd0b5f06-util" (OuterVolumeSpecName: "util") pod "223ab951-1e05-44a2-bfb2-1686dd0b5f06" (UID: "223ab951-1e05-44a2-bfb2-1686dd0b5f06"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:52.922243 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:52.922217 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/223ab951-1e05-44a2-bfb2-1686dd0b5f06-util\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:33:52.922344 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:52.922244 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w66xk\" (UniqueName: \"kubernetes.io/projected/223ab951-1e05-44a2-bfb2-1686dd0b5f06-kube-api-access-w66xk\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:33:52.922344 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:52.922255 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/223ab951-1e05-44a2-bfb2-1686dd0b5f06-bundle\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:33:53.611366 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:53.611341 2567 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5"
Apr 24 21:33:53.611533 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:53.611338 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dgm4l5" event={"ID":"223ab951-1e05-44a2-bfb2-1686dd0b5f06","Type":"ContainerDied","Data":"bc3a2d9e979e6e8b002be726d9dfb0f2da8965100c6ffa9b35ab95310cd0087a"}
Apr 24 21:33:53.611533 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:53.611448 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc3a2d9e979e6e8b002be726d9dfb0f2da8965100c6ffa9b35ab95310cd0087a"
Apr 24 21:33:59.231710 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.231676 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-bz5mg"]
Apr 24 21:33:59.232072 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.231973 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="223ab951-1e05-44a2-bfb2-1686dd0b5f06" containerName="pull"
Apr 24 21:33:59.232072 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.231984 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="223ab951-1e05-44a2-bfb2-1686dd0b5f06" containerName="pull"
Apr 24 21:33:59.232072 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.231997 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="223ab951-1e05-44a2-bfb2-1686dd0b5f06" containerName="util"
Apr 24 21:33:59.232072 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.232002 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="223ab951-1e05-44a2-bfb2-1686dd0b5f06" containerName="util"
Apr 24 21:33:59.232072 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.232017 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="223ab951-1e05-44a2-bfb2-1686dd0b5f06" containerName="extract"
Apr 24 21:33:59.232072 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.232022 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="223ab951-1e05-44a2-bfb2-1686dd0b5f06" containerName="extract"
Apr 24 21:33:59.232072 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.232067 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="223ab951-1e05-44a2-bfb2-1686dd0b5f06" containerName="extract"
Apr 24 21:33:59.237035 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.237020 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-bz5mg"
Apr 24 21:33:59.239668 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.239641 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:33:59.239668 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.239657 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 24 21:33:59.239833 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.239660 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-j822f\""
Apr 24 21:33:59.247874 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.247853 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-bz5mg"]
Apr 24 21:33:59.267476 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.267452 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cfcf5ce4-bdab-4989-94e2-5135b1aa2462-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-bz5mg\" (UID: \"cfcf5ce4-bdab-4989-94e2-5135b1aa2462\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-bz5mg"
Apr 24 21:33:59.267599 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.267490 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpz49\" (UniqueName: \"kubernetes.io/projected/cfcf5ce4-bdab-4989-94e2-5135b1aa2462-kube-api-access-hpz49\") pod \"cert-manager-operator-controller-manager-54b9655956-bz5mg\" (UID: \"cfcf5ce4-bdab-4989-94e2-5135b1aa2462\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-bz5mg"
Apr 24 21:33:59.368770 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.368736 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cfcf5ce4-bdab-4989-94e2-5135b1aa2462-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-bz5mg\" (UID: \"cfcf5ce4-bdab-4989-94e2-5135b1aa2462\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-bz5mg"
Apr 24 21:33:59.368934 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.368781 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpz49\" (UniqueName: \"kubernetes.io/projected/cfcf5ce4-bdab-4989-94e2-5135b1aa2462-kube-api-access-hpz49\") pod \"cert-manager-operator-controller-manager-54b9655956-bz5mg\" (UID: \"cfcf5ce4-bdab-4989-94e2-5135b1aa2462\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-bz5mg"
Apr 24 21:33:59.369093 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.369075 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cfcf5ce4-bdab-4989-94e2-5135b1aa2462-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-bz5mg\" (UID: \"cfcf5ce4-bdab-4989-94e2-5135b1aa2462\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-bz5mg"
Apr 24 21:33:59.385042 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.385017 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpz49\" (UniqueName: \"kubernetes.io/projected/cfcf5ce4-bdab-4989-94e2-5135b1aa2462-kube-api-access-hpz49\") pod \"cert-manager-operator-controller-manager-54b9655956-bz5mg\" (UID: \"cfcf5ce4-bdab-4989-94e2-5135b1aa2462\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-bz5mg"
Apr 24 21:33:59.546165 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.546077 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-bz5mg"
Apr 24 21:33:59.670633 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:33:59.670607 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-bz5mg"]
Apr 24 21:33:59.673926 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:33:59.673899 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfcf5ce4_bdab_4989_94e2_5135b1aa2462.slice/crio-6fee8ccd3782bab887b59b090d90bdf70dfdd0b580d30b9335e59168cfb70908 WatchSource:0}: Error finding container 6fee8ccd3782bab887b59b090d90bdf70dfdd0b580d30b9335e59168cfb70908: Status 404 returned error can't find the container with id 6fee8ccd3782bab887b59b090d90bdf70dfdd0b580d30b9335e59168cfb70908
Apr 24 21:34:00.635541 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:00.635501 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-bz5mg" event={"ID":"cfcf5ce4-bdab-4989-94e2-5135b1aa2462","Type":"ContainerStarted","Data":"6fee8ccd3782bab887b59b090d90bdf70dfdd0b580d30b9335e59168cfb70908"}
Apr 24 21:34:01.640173 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:01.640137 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-bz5mg" event={"ID":"cfcf5ce4-bdab-4989-94e2-5135b1aa2462","Type":"ContainerStarted","Data":"a4b28da14a896388512ed903f8d494b1f18c5d67489689d15ac6ad503214816b"}
Apr 24 21:34:01.675854 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:01.675800 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-bz5mg" podStartSLOduration=1.441540221 podStartE2EDuration="2.675782616s" podCreationTimestamp="2026-04-24 21:33:59 +0000 UTC" firstStartedPulling="2026-04-24 21:33:59.676353182 +0000 UTC m=+377.774765673" lastFinishedPulling="2026-04-24 21:34:00.910595552 +0000 UTC m=+379.009008068" observedRunningTime="2026-04-24 21:34:01.672428986 +0000 UTC m=+379.770841524" watchObservedRunningTime="2026-04-24 21:34:01.675782616 +0000 UTC m=+379.774195131"
Apr 24 21:34:08.163692 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:08.160766 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4"]
Apr 24 21:34:08.166068 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:08.166039 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4"
Apr 24 21:34:08.171663 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:08.171641 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 21:34:08.171785 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:08.171642 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 21:34:08.172659 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:08.172537 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-58qm5\""
Apr 24 21:34:08.201425 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:08.201403 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4"]
Apr 24 21:34:08.235018 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:08.234989 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/391c0d82-f737-4687-9708-ee5c75fa5e5a-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4\" (UID: \"391c0d82-f737-4687-9708-ee5c75fa5e5a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4"
Apr 24 21:34:08.235137 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:08.235029 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/391c0d82-f737-4687-9708-ee5c75fa5e5a-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4\" (UID: \"391c0d82-f737-4687-9708-ee5c75fa5e5a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4"
Apr 24 21:34:08.235137 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:08.235049 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7xk5\" (UniqueName: \"kubernetes.io/projected/391c0d82-f737-4687-9708-ee5c75fa5e5a-kube-api-access-g7xk5\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4\" (UID: \"391c0d82-f737-4687-9708-ee5c75fa5e5a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4"
Apr 24 21:34:08.336035 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:08.336005 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/391c0d82-f737-4687-9708-ee5c75fa5e5a-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4\" (UID: \"391c0d82-f737-4687-9708-ee5c75fa5e5a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4"
Apr 24 21:34:08.336167 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:08.336047 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/391c0d82-f737-4687-9708-ee5c75fa5e5a-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4\" (UID: \"391c0d82-f737-4687-9708-ee5c75fa5e5a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4"
Apr 24 21:34:08.336167 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:08.336067 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7xk5\" (UniqueName: \"kubernetes.io/projected/391c0d82-f737-4687-9708-ee5c75fa5e5a-kube-api-access-g7xk5\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4\" (UID: \"391c0d82-f737-4687-9708-ee5c75fa5e5a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4"
Apr 24 21:34:08.336404 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:08.336386 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/391c0d82-f737-4687-9708-ee5c75fa5e5a-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4\" (UID: \"391c0d82-f737-4687-9708-ee5c75fa5e5a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4"
Apr 24 21:34:08.336479 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:08.336411 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/391c0d82-f737-4687-9708-ee5c75fa5e5a-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4\" (UID: \"391c0d82-f737-4687-9708-ee5c75fa5e5a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4"
Apr 24 21:34:08.348209 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:08.348181 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7xk5\" (UniqueName: \"kubernetes.io/projected/391c0d82-f737-4687-9708-ee5c75fa5e5a-kube-api-access-g7xk5\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4\" (UID: \"391c0d82-f737-4687-9708-ee5c75fa5e5a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4"
Apr 24 21:34:08.475712 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:08.475636 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4"
Apr 24 21:34:08.603965 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:08.603937 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4"]
Apr 24 21:34:08.607330 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:34:08.607295 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod391c0d82_f737_4687_9708_ee5c75fa5e5a.slice/crio-effe6163ea08419f9347e53b578a0f77b3fe50d50c98d73fe168495bb7c2ce2a WatchSource:0}: Error finding container effe6163ea08419f9347e53b578a0f77b3fe50d50c98d73fe168495bb7c2ce2a: Status 404 returned error can't find the container with id effe6163ea08419f9347e53b578a0f77b3fe50d50c98d73fe168495bb7c2ce2a
Apr 24 21:34:08.663570 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:08.663545 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4" event={"ID":"391c0d82-f737-4687-9708-ee5c75fa5e5a","Type":"ContainerStarted","Data":"effe6163ea08419f9347e53b578a0f77b3fe50d50c98d73fe168495bb7c2ce2a"}
Apr 24 21:34:09.668592 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:09.668546 2567 generic.go:358] "Generic (PLEG): container finished" podID="391c0d82-f737-4687-9708-ee5c75fa5e5a" containerID="63e1a3e986e281279f0e6a0d83f4651e257fc75db2fa2d3f292ba9e2fa3eb16c" exitCode=0
Apr 24 21:34:09.668981 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:09.668622 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4" event={"ID":"391c0d82-f737-4687-9708-ee5c75fa5e5a","Type":"ContainerDied","Data":"63e1a3e986e281279f0e6a0d83f4651e257fc75db2fa2d3f292ba9e2fa3eb16c"}
Apr 24 21:34:11.677920 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:11.677887 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4" event={"ID":"391c0d82-f737-4687-9708-ee5c75fa5e5a","Type":"ContainerStarted","Data":"b211e87484df570ab9c36e53b7c836289324c1ff31b8d3a68971cab211f425fe"}
Apr 24 21:34:12.683011 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:12.682974 2567 generic.go:358] "Generic (PLEG): container finished" podID="391c0d82-f737-4687-9708-ee5c75fa5e5a" containerID="b211e87484df570ab9c36e53b7c836289324c1ff31b8d3a68971cab211f425fe" exitCode=0
Apr 24 21:34:12.683404 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:12.683062 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4" event={"ID":"391c0d82-f737-4687-9708-ee5c75fa5e5a","Type":"ContainerDied","Data":"b211e87484df570ab9c36e53b7c836289324c1ff31b8d3a68971cab211f425fe"}
Apr 24 21:34:13.687786 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:13.687749 2567 generic.go:358] "Generic (PLEG): container finished" podID="391c0d82-f737-4687-9708-ee5c75fa5e5a" containerID="6fbcc3be3ee9881b285bf5373e91d438fe4567016951dabc998d70e73fb137ed" exitCode=0
Apr 24 21:34:13.688153 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:13.687833 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4" event={"ID":"391c0d82-f737-4687-9708-ee5c75fa5e5a","Type":"ContainerDied","Data":"6fbcc3be3ee9881b285bf5373e91d438fe4567016951dabc998d70e73fb137ed"}
Apr 24 21:34:14.812234 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:14.812213 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4"
Apr 24 21:34:14.886873 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:14.886841 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/391c0d82-f737-4687-9708-ee5c75fa5e5a-util\") pod \"391c0d82-f737-4687-9708-ee5c75fa5e5a\" (UID: \"391c0d82-f737-4687-9708-ee5c75fa5e5a\") "
Apr 24 21:34:14.887021 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:14.886901 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/391c0d82-f737-4687-9708-ee5c75fa5e5a-bundle\") pod \"391c0d82-f737-4687-9708-ee5c75fa5e5a\" (UID: \"391c0d82-f737-4687-9708-ee5c75fa5e5a\") "
Apr 24 21:34:14.887021 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:14.886933 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7xk5\" (UniqueName: \"kubernetes.io/projected/391c0d82-f737-4687-9708-ee5c75fa5e5a-kube-api-access-g7xk5\") pod \"391c0d82-f737-4687-9708-ee5c75fa5e5a\" (UID: \"391c0d82-f737-4687-9708-ee5c75fa5e5a\") "
Apr 24 21:34:14.887262 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:14.887239 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/391c0d82-f737-4687-9708-ee5c75fa5e5a-bundle" (OuterVolumeSpecName: "bundle") pod "391c0d82-f737-4687-9708-ee5c75fa5e5a" (UID: "391c0d82-f737-4687-9708-ee5c75fa5e5a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:34:14.889083 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:14.889056 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/391c0d82-f737-4687-9708-ee5c75fa5e5a-kube-api-access-g7xk5" (OuterVolumeSpecName: "kube-api-access-g7xk5") pod "391c0d82-f737-4687-9708-ee5c75fa5e5a" (UID: "391c0d82-f737-4687-9708-ee5c75fa5e5a"). InnerVolumeSpecName "kube-api-access-g7xk5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:34:14.890866 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:14.890852 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/391c0d82-f737-4687-9708-ee5c75fa5e5a-util" (OuterVolumeSpecName: "util") pod "391c0d82-f737-4687-9708-ee5c75fa5e5a" (UID: "391c0d82-f737-4687-9708-ee5c75fa5e5a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:34:14.987695 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:14.987626 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/391c0d82-f737-4687-9708-ee5c75fa5e5a-bundle\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:34:14.987695 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:14.987651 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g7xk5\" (UniqueName: \"kubernetes.io/projected/391c0d82-f737-4687-9708-ee5c75fa5e5a-kube-api-access-g7xk5\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:34:14.987695 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:14.987661 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/391c0d82-f737-4687-9708-ee5c75fa5e5a-util\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:34:15.695488 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:15.695455 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4"
Apr 24 21:34:15.695647 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:15.695461 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frktn4" event={"ID":"391c0d82-f737-4687-9708-ee5c75fa5e5a","Type":"ContainerDied","Data":"effe6163ea08419f9347e53b578a0f77b3fe50d50c98d73fe168495bb7c2ce2a"}
Apr 24 21:34:15.695647 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:15.695562 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="effe6163ea08419f9347e53b578a0f77b3fe50d50c98d73fe168495bb7c2ce2a"
Apr 24 21:34:20.777565 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:20.777530 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-z7n4k"]
Apr 24 21:34:20.778002 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:20.777824 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="391c0d82-f737-4687-9708-ee5c75fa5e5a" containerName="util"
Apr 24 21:34:20.778002 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:20.777835 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="391c0d82-f737-4687-9708-ee5c75fa5e5a" containerName="util"
Apr 24 21:34:20.778002 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:20.777850 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="391c0d82-f737-4687-9708-ee5c75fa5e5a" containerName="pull"
Apr 24 21:34:20.778002 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:20.777855 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="391c0d82-f737-4687-9708-ee5c75fa5e5a" containerName="pull"
Apr 24 21:34:20.778002 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:20.777864 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="391c0d82-f737-4687-9708-ee5c75fa5e5a" containerName="extract"
Apr 24 21:34:20.778002 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:20.777870 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="391c0d82-f737-4687-9708-ee5c75fa5e5a" containerName="extract"
Apr 24 21:34:20.778002 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:20.777917 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="391c0d82-f737-4687-9708-ee5c75fa5e5a" containerName="extract"
Apr 24 21:34:20.782653 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:20.782637 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-z7n4k"
Apr 24 21:34:20.785187 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:20.785156 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:34:20.785313 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:20.785199 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-6qnqw\""
Apr 24 21:34:20.786124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:20.786110 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 24 21:34:20.795185 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:20.795158 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-z7n4k"]
Apr 24 21:34:20.835123 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:20.835089 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4eec04f0-c060-4069-9a58-3942ba1a27cc-tmp\") pod \"openshift-lws-operator-bfc7f696d-z7n4k\" (UID: \"4eec04f0-c060-4069-9a58-3942ba1a27cc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-z7n4k"
Apr 24 21:34:20.835268 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:20.835132 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg7z9\" (UniqueName: \"kubernetes.io/projected/4eec04f0-c060-4069-9a58-3942ba1a27cc-kube-api-access-sg7z9\") pod \"openshift-lws-operator-bfc7f696d-z7n4k\" (UID: \"4eec04f0-c060-4069-9a58-3942ba1a27cc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-z7n4k"
Apr 24 21:34:20.936046 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:20.935999 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4eec04f0-c060-4069-9a58-3942ba1a27cc-tmp\") pod \"openshift-lws-operator-bfc7f696d-z7n4k\" (UID: \"4eec04f0-c060-4069-9a58-3942ba1a27cc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-z7n4k"
Apr 24 21:34:20.936046 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:20.936047 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sg7z9\" (UniqueName: \"kubernetes.io/projected/4eec04f0-c060-4069-9a58-3942ba1a27cc-kube-api-access-sg7z9\") pod \"openshift-lws-operator-bfc7f696d-z7n4k\" (UID: \"4eec04f0-c060-4069-9a58-3942ba1a27cc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-z7n4k"
Apr 24 21:34:20.936422 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:20.936403 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4eec04f0-c060-4069-9a58-3942ba1a27cc-tmp\") pod \"openshift-lws-operator-bfc7f696d-z7n4k\" (UID: \"4eec04f0-c060-4069-9a58-3942ba1a27cc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-z7n4k"
Apr 24 21:34:20.947153 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:20.947130 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg7z9\" (UniqueName: \"kubernetes.io/projected/4eec04f0-c060-4069-9a58-3942ba1a27cc-kube-api-access-sg7z9\") pod \"openshift-lws-operator-bfc7f696d-z7n4k\" (UID: \"4eec04f0-c060-4069-9a58-3942ba1a27cc\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-z7n4k"
Apr 24 21:34:21.100604 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:21.100496 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-z7n4k"
Apr 24 21:34:21.224079 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:21.224055 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-z7n4k"]
Apr 24 21:34:21.226159 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:34:21.226131 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4eec04f0_c060_4069_9a58_3942ba1a27cc.slice/crio-101f759d5c08eafdbfb88ecae4cc4fafce2a526d27e4915360274366bf0c1920 WatchSource:0}: Error finding container 101f759d5c08eafdbfb88ecae4cc4fafce2a526d27e4915360274366bf0c1920: Status 404 returned error can't find the container with id 101f759d5c08eafdbfb88ecae4cc4fafce2a526d27e4915360274366bf0c1920
Apr 24 21:34:21.716398 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:21.716358 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-z7n4k" event={"ID":"4eec04f0-c060-4069-9a58-3942ba1a27cc","Type":"ContainerStarted","Data":"101f759d5c08eafdbfb88ecae4cc4fafce2a526d27e4915360274366bf0c1920"}
Apr 24 21:34:23.724749 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:23.724652 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-z7n4k" event={"ID":"4eec04f0-c060-4069-9a58-3942ba1a27cc","Type":"ContainerStarted","Data":"d3aecda24e28f98326913238032f34c53f79d1a303ca06172c545c311c451458"}
Apr 24 21:34:23.744480 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:23.744433 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-z7n4k" podStartSLOduration=1.510715003 podStartE2EDuration="3.744420028s" podCreationTimestamp="2026-04-24 21:34:20 +0000 UTC" firstStartedPulling="2026-04-24 21:34:21.22759208 +0000 UTC m=+399.326004585" lastFinishedPulling="2026-04-24 21:34:23.461297115 +0000 UTC m=+401.559709610" observedRunningTime="2026-04-24 21:34:23.742118992 +0000 UTC m=+401.840531505" watchObservedRunningTime="2026-04-24 21:34:23.744420028 +0000 UTC m=+401.842832540"
Apr 24 21:34:34.412535 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:34.412489 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm"]
Apr 24 21:34:34.416161 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:34.416134 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm"
Apr 24 21:34:34.418779 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:34.418744 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 21:34:34.419549 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:34.419527 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 21:34:34.419636 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:34.419551 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-58qm5\""
Apr 24 21:34:34.426785 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:34.426760 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm"]
Apr 24 21:34:34.553868 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:34.553827 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e095c361-1c49-404e-a7a9-8f81dc7d2b70-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm\" (UID: \"e095c361-1c49-404e-a7a9-8f81dc7d2b70\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm"
Apr 24 21:34:34.554039 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:34.553881 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82kzj\" (UniqueName: \"kubernetes.io/projected/e095c361-1c49-404e-a7a9-8f81dc7d2b70-kube-api-access-82kzj\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm\" (UID: \"e095c361-1c49-404e-a7a9-8f81dc7d2b70\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm"
Apr 24 21:34:34.554039 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:34.553927 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e095c361-1c49-404e-a7a9-8f81dc7d2b70-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm\" (UID: \"e095c361-1c49-404e-a7a9-8f81dc7d2b70\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm"
Apr 24 21:34:34.655002 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:34.654960 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e095c361-1c49-404e-a7a9-8f81dc7d2b70-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm\" (UID: \"e095c361-1c49-404e-a7a9-8f81dc7d2b70\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm"
Apr 24 21:34:34.655183 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:34.655012 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82kzj\" (UniqueName: \"kubernetes.io/projected/e095c361-1c49-404e-a7a9-8f81dc7d2b70-kube-api-access-82kzj\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm\" (UID: \"e095c361-1c49-404e-a7a9-8f81dc7d2b70\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm"
Apr 24 21:34:34.655183 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:34.655039 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e095c361-1c49-404e-a7a9-8f81dc7d2b70-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm\" (UID: \"e095c361-1c49-404e-a7a9-8f81dc7d2b70\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm"
Apr 24 21:34:34.655358 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:34.655336 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e095c361-1c49-404e-a7a9-8f81dc7d2b70-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm\" (UID: \"e095c361-1c49-404e-a7a9-8f81dc7d2b70\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm"
Apr 24 21:34:34.655394 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:34.655365 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e095c361-1c49-404e-a7a9-8f81dc7d2b70-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm\" (UID: \"e095c361-1c49-404e-a7a9-8f81dc7d2b70\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm"
Apr 24 21:34:34.666500 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:34.666446 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82kzj\" (UniqueName: \"kubernetes.io/projected/e095c361-1c49-404e-a7a9-8f81dc7d2b70-kube-api-access-82kzj\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm\" (UID: \"e095c361-1c49-404e-a7a9-8f81dc7d2b70\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm"
Apr 24 21:34:34.726199 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:34.726170 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm"
Apr 24 21:34:34.846942 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:34.846912 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm"]
Apr 24 21:34:34.850135 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:34:34.850107 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode095c361_1c49_404e_a7a9_8f81dc7d2b70.slice/crio-54028659e200658babc33eb04b9c936b67b30867fc055ed9d98d0f413b4dccc0 WatchSource:0}: Error finding container 54028659e200658babc33eb04b9c936b67b30867fc055ed9d98d0f413b4dccc0: Status 404 returned error can't find the container with id 54028659e200658babc33eb04b9c936b67b30867fc055ed9d98d0f413b4dccc0
Apr 24 21:34:35.762672 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:35.762634 2567 generic.go:358] "Generic (PLEG): container finished" podID="e095c361-1c49-404e-a7a9-8f81dc7d2b70" containerID="11742be72c92899d92265c8defb164cbcea7b5c7be50a7d84ebdd46fe1591869" exitCode=0
Apr 24 21:34:35.763036 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:35.762719 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm" event={"ID":"e095c361-1c49-404e-a7a9-8f81dc7d2b70","Type":"ContainerDied","Data":"11742be72c92899d92265c8defb164cbcea7b5c7be50a7d84ebdd46fe1591869"}
Apr 24 21:34:35.763036 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:35.762751 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm" event={"ID":"e095c361-1c49-404e-a7a9-8f81dc7d2b70","Type":"ContainerStarted","Data":"54028659e200658babc33eb04b9c936b67b30867fc055ed9d98d0f413b4dccc0"}
Apr 24 21:34:36.767189 ip-10-0-142-242 kubenswrapper[2567]:
I0424 21:34:36.767149 2567 generic.go:358] "Generic (PLEG): container finished" podID="e095c361-1c49-404e-a7a9-8f81dc7d2b70" containerID="96861765e25c956e155ee0fea8a01a1b685a255f8039b71521ac861b44b08cd3" exitCode=0 Apr 24 21:34:36.767573 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:36.767236 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm" event={"ID":"e095c361-1c49-404e-a7a9-8f81dc7d2b70","Type":"ContainerDied","Data":"96861765e25c956e155ee0fea8a01a1b685a255f8039b71521ac861b44b08cd3"} Apr 24 21:34:37.771927 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:37.771895 2567 generic.go:358] "Generic (PLEG): container finished" podID="e095c361-1c49-404e-a7a9-8f81dc7d2b70" containerID="8d49bc54e90bcdf8e7d8d06ba5bcb5b757a7ef667bf15bf04d54f2ddbcd2e50b" exitCode=0 Apr 24 21:34:37.772286 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:37.771966 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm" event={"ID":"e095c361-1c49-404e-a7a9-8f81dc7d2b70","Type":"ContainerDied","Data":"8d49bc54e90bcdf8e7d8d06ba5bcb5b757a7ef667bf15bf04d54f2ddbcd2e50b"} Apr 24 21:34:38.898531 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:38.898509 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm" Apr 24 21:34:38.987172 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:38.987142 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e095c361-1c49-404e-a7a9-8f81dc7d2b70-util\") pod \"e095c361-1c49-404e-a7a9-8f81dc7d2b70\" (UID: \"e095c361-1c49-404e-a7a9-8f81dc7d2b70\") " Apr 24 21:34:38.987326 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:38.987209 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e095c361-1c49-404e-a7a9-8f81dc7d2b70-bundle\") pod \"e095c361-1c49-404e-a7a9-8f81dc7d2b70\" (UID: \"e095c361-1c49-404e-a7a9-8f81dc7d2b70\") " Apr 24 21:34:38.987326 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:38.987243 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82kzj\" (UniqueName: \"kubernetes.io/projected/e095c361-1c49-404e-a7a9-8f81dc7d2b70-kube-api-access-82kzj\") pod \"e095c361-1c49-404e-a7a9-8f81dc7d2b70\" (UID: \"e095c361-1c49-404e-a7a9-8f81dc7d2b70\") " Apr 24 21:34:38.988054 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:38.988024 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e095c361-1c49-404e-a7a9-8f81dc7d2b70-bundle" (OuterVolumeSpecName: "bundle") pod "e095c361-1c49-404e-a7a9-8f81dc7d2b70" (UID: "e095c361-1c49-404e-a7a9-8f81dc7d2b70"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:34:38.989260 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:38.989231 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e095c361-1c49-404e-a7a9-8f81dc7d2b70-kube-api-access-82kzj" (OuterVolumeSpecName: "kube-api-access-82kzj") pod "e095c361-1c49-404e-a7a9-8f81dc7d2b70" (UID: "e095c361-1c49-404e-a7a9-8f81dc7d2b70"). InnerVolumeSpecName "kube-api-access-82kzj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:34:38.992603 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:38.992557 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e095c361-1c49-404e-a7a9-8f81dc7d2b70-util" (OuterVolumeSpecName: "util") pod "e095c361-1c49-404e-a7a9-8f81dc7d2b70" (UID: "e095c361-1c49-404e-a7a9-8f81dc7d2b70"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:34:39.087796 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:39.087702 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e095c361-1c49-404e-a7a9-8f81dc7d2b70-util\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:34:39.087796 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:39.087744 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e095c361-1c49-404e-a7a9-8f81dc7d2b70-bundle\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:34:39.087796 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:39.087755 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-82kzj\" (UniqueName: \"kubernetes.io/projected/e095c361-1c49-404e-a7a9-8f81dc7d2b70-kube-api-access-82kzj\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:34:39.780593 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:39.780562 2567 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm" Apr 24 21:34:39.780798 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:39.780564 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48357snbm" event={"ID":"e095c361-1c49-404e-a7a9-8f81dc7d2b70","Type":"ContainerDied","Data":"54028659e200658babc33eb04b9c936b67b30867fc055ed9d98d0f413b4dccc0"} Apr 24 21:34:39.780798 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:39.780682 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54028659e200658babc33eb04b9c936b67b30867fc055ed9d98d0f413b4dccc0" Apr 24 21:34:48.072519 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.072486 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh"] Apr 24 21:34:48.073137 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.073113 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e095c361-1c49-404e-a7a9-8f81dc7d2b70" containerName="util" Apr 24 21:34:48.073137 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.073136 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e095c361-1c49-404e-a7a9-8f81dc7d2b70" containerName="util" Apr 24 21:34:48.073347 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.073177 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e095c361-1c49-404e-a7a9-8f81dc7d2b70" containerName="pull" Apr 24 21:34:48.073347 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.073186 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e095c361-1c49-404e-a7a9-8f81dc7d2b70" containerName="pull" Apr 24 21:34:48.073347 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.073198 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="e095c361-1c49-404e-a7a9-8f81dc7d2b70" containerName="extract" Apr 24 21:34:48.073347 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.073255 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e095c361-1c49-404e-a7a9-8f81dc7d2b70" containerName="extract" Apr 24 21:34:48.073550 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.073532 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="e095c361-1c49-404e-a7a9-8f81dc7d2b70" containerName="extract" Apr 24 21:34:48.080403 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.080382 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh" Apr 24 21:34:48.083419 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.083401 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 21:34:48.083701 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.083685 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 21:34:48.084510 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.084492 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-58qm5\"" Apr 24 21:34:48.089166 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.089146 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh"] Apr 24 21:34:48.255884 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.255847 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh\" (UID: 
\"88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh" Apr 24 21:34:48.256035 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.255897 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wj54\" (UniqueName: \"kubernetes.io/projected/88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe-kube-api-access-4wj54\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh\" (UID: \"88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh" Apr 24 21:34:48.256074 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.256039 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh\" (UID: \"88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh" Apr 24 21:34:48.357048 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.356962 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wj54\" (UniqueName: \"kubernetes.io/projected/88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe-kube-api-access-4wj54\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh\" (UID: \"88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh" Apr 24 21:34:48.357048 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.357038 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh\" (UID: 
\"88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh" Apr 24 21:34:48.357262 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.357058 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh\" (UID: \"88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh" Apr 24 21:34:48.357376 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.357358 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh\" (UID: \"88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh" Apr 24 21:34:48.357445 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.357429 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh\" (UID: \"88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh" Apr 24 21:34:48.418534 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.418505 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wj54\" (UniqueName: \"kubernetes.io/projected/88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe-kube-api-access-4wj54\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh\" (UID: \"88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe\") " 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh" Apr 24 21:34:48.689536 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.689498 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh" Apr 24 21:34:48.830949 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:48.830923 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh"] Apr 24 21:34:48.833358 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:34:48.833323 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88f060a1_1d44_4ab4_8f6e_60d1ac8cffbe.slice/crio-3016e11b29c2fa14f8ebfd083474ae8fce347baeca23fae494bc43c562c6c80f WatchSource:0}: Error finding container 3016e11b29c2fa14f8ebfd083474ae8fce347baeca23fae494bc43c562c6c80f: Status 404 returned error can't find the container with id 3016e11b29c2fa14f8ebfd083474ae8fce347baeca23fae494bc43c562c6c80f Apr 24 21:34:49.732615 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:49.732504 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-wpxcm"] Apr 24 21:34:49.735331 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:49.735315 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wpxcm" Apr 24 21:34:49.738723 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:49.738702 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 24 21:34:49.738844 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:49.738703 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-zzcv6\"" Apr 24 21:34:49.739085 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:49.739068 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 24 21:34:49.767407 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:49.767383 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-wpxcm"] Apr 24 21:34:49.768961 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:49.768943 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/69f4d808-e43d-4f0f-b323-be152f4ef853-operator-config\") pod \"servicemesh-operator3-55f49c5f94-wpxcm\" (UID: \"69f4d808-e43d-4f0f-b323-be152f4ef853\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-wpxcm" Apr 24 21:34:49.769033 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:49.769004 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7d47\" (UniqueName: \"kubernetes.io/projected/69f4d808-e43d-4f0f-b323-be152f4ef853-kube-api-access-v7d47\") pod \"servicemesh-operator3-55f49c5f94-wpxcm\" (UID: \"69f4d808-e43d-4f0f-b323-be152f4ef853\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-wpxcm" Apr 24 21:34:49.815005 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:49.814966 2567 generic.go:358] "Generic (PLEG): 
container finished" podID="88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe" containerID="0d83fb26446029882e6d068b6e32e5a9355063e38bc69efd3b3f56aaad4685d1" exitCode=0 Apr 24 21:34:49.815203 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:49.815018 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh" event={"ID":"88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe","Type":"ContainerDied","Data":"0d83fb26446029882e6d068b6e32e5a9355063e38bc69efd3b3f56aaad4685d1"} Apr 24 21:34:49.815203 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:49.815042 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh" event={"ID":"88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe","Type":"ContainerStarted","Data":"3016e11b29c2fa14f8ebfd083474ae8fce347baeca23fae494bc43c562c6c80f"} Apr 24 21:34:49.870232 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:49.870198 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7d47\" (UniqueName: \"kubernetes.io/projected/69f4d808-e43d-4f0f-b323-be152f4ef853-kube-api-access-v7d47\") pod \"servicemesh-operator3-55f49c5f94-wpxcm\" (UID: \"69f4d808-e43d-4f0f-b323-be152f4ef853\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-wpxcm" Apr 24 21:34:49.870404 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:49.870245 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/69f4d808-e43d-4f0f-b323-be152f4ef853-operator-config\") pod \"servicemesh-operator3-55f49c5f94-wpxcm\" (UID: \"69f4d808-e43d-4f0f-b323-be152f4ef853\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-wpxcm" Apr 24 21:34:49.872676 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:49.872643 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: 
\"kubernetes.io/downward-api/69f4d808-e43d-4f0f-b323-be152f4ef853-operator-config\") pod \"servicemesh-operator3-55f49c5f94-wpxcm\" (UID: \"69f4d808-e43d-4f0f-b323-be152f4ef853\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-wpxcm" Apr 24 21:34:49.886701 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:49.886674 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7d47\" (UniqueName: \"kubernetes.io/projected/69f4d808-e43d-4f0f-b323-be152f4ef853-kube-api-access-v7d47\") pod \"servicemesh-operator3-55f49c5f94-wpxcm\" (UID: \"69f4d808-e43d-4f0f-b323-be152f4ef853\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-wpxcm" Apr 24 21:34:50.044088 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:50.044001 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wpxcm" Apr 24 21:34:50.172344 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:50.172317 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-wpxcm"] Apr 24 21:34:50.174808 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:34:50.174774 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69f4d808_e43d_4f0f_b323_be152f4ef853.slice/crio-c0590b33ca446070e12ab6f576cc965b65726143a7723f6060463bccc154b3e1 WatchSource:0}: Error finding container c0590b33ca446070e12ab6f576cc965b65726143a7723f6060463bccc154b3e1: Status 404 returned error can't find the container with id c0590b33ca446070e12ab6f576cc965b65726143a7723f6060463bccc154b3e1 Apr 24 21:34:50.819138 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:50.819048 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wpxcm" 
event={"ID":"69f4d808-e43d-4f0f-b323-be152f4ef853","Type":"ContainerStarted","Data":"c0590b33ca446070e12ab6f576cc965b65726143a7723f6060463bccc154b3e1"} Apr 24 21:34:50.820530 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:50.820501 2567 generic.go:358] "Generic (PLEG): container finished" podID="88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe" containerID="5f63b1527fadfd5a7878a10e70ab98b50ab94d123510f2c75671fcc9a3d18728" exitCode=0 Apr 24 21:34:50.820660 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:50.820559 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh" event={"ID":"88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe","Type":"ContainerDied","Data":"5f63b1527fadfd5a7878a10e70ab98b50ab94d123510f2c75671fcc9a3d18728"} Apr 24 21:34:51.827874 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:51.827836 2567 generic.go:358] "Generic (PLEG): container finished" podID="88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe" containerID="dd7095f923f89128cba2130b73520291beb1952619cea352ed9569eee9cbd50c" exitCode=0 Apr 24 21:34:51.828317 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:51.827919 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh" event={"ID":"88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe","Type":"ContainerDied","Data":"dd7095f923f89128cba2130b73520291beb1952619cea352ed9569eee9cbd50c"} Apr 24 21:34:52.996698 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:52.996602 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh" Apr 24 21:34:53.090620 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:53.090506 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wj54\" (UniqueName: \"kubernetes.io/projected/88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe-kube-api-access-4wj54\") pod \"88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe\" (UID: \"88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe\") " Apr 24 21:34:53.090620 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:53.090568 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe-bundle\") pod \"88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe\" (UID: \"88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe\") " Apr 24 21:34:53.091503 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:53.091465 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe-bundle" (OuterVolumeSpecName: "bundle") pod "88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe" (UID: "88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:34:53.092649 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:53.092625 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe-kube-api-access-4wj54" (OuterVolumeSpecName: "kube-api-access-4wj54") pod "88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe" (UID: "88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe"). InnerVolumeSpecName "kube-api-access-4wj54". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:34:53.191667 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:53.191628 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe-util\") pod \"88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe\" (UID: \"88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe\") " Apr 24 21:34:53.191866 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:53.191829 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4wj54\" (UniqueName: \"kubernetes.io/projected/88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe-kube-api-access-4wj54\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:34:53.191866 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:53.191847 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe-bundle\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:34:53.199718 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:53.199674 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe-util" (OuterVolumeSpecName: "util") pod "88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe" (UID: "88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:34:53.292277 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:53.292238 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe-util\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:34:53.836707 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:53.836671 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wpxcm" event={"ID":"69f4d808-e43d-4f0f-b323-be152f4ef853","Type":"ContainerStarted","Data":"60060941cc1a452cb9d2d3a10a05bcc8df5e1e94f1e97227fb8aa11e8f7bae51"} Apr 24 21:34:53.836913 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:53.836740 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wpxcm" Apr 24 21:34:53.838271 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:53.838246 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh" Apr 24 21:34:53.838271 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:53.838254 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebv8rgh" event={"ID":"88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe","Type":"ContainerDied","Data":"3016e11b29c2fa14f8ebfd083474ae8fce347baeca23fae494bc43c562c6c80f"} Apr 24 21:34:53.838434 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:53.838275 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3016e11b29c2fa14f8ebfd083474ae8fce347baeca23fae494bc43c562c6c80f" Apr 24 21:34:53.858810 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:34:53.858748 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wpxcm" podStartSLOduration=2.045445509 podStartE2EDuration="4.858730911s" podCreationTimestamp="2026-04-24 21:34:49 +0000 UTC" firstStartedPulling="2026-04-24 21:34:50.17721355 +0000 UTC m=+428.275626041" lastFinishedPulling="2026-04-24 21:34:52.990498948 +0000 UTC m=+431.088911443" observedRunningTime="2026-04-24 21:34:53.857126853 +0000 UTC m=+431.955539365" watchObservedRunningTime="2026-04-24 21:34:53.858730911 +0000 UTC m=+431.957143425" Apr 24 21:35:04.845172 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:04.845092 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wpxcm" Apr 24 21:35:07.894328 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:07.894299 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-57dc596fcd-ql9q5"] Apr 24 21:35:07.894781 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:07.894603 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe" containerName="pull" Apr 24 21:35:07.894781 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:07.894618 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe" containerName="pull" Apr 24 21:35:07.894781 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:07.894630 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe" containerName="extract" Apr 24 21:35:07.894781 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:07.894639 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe" containerName="extract" Apr 24 21:35:07.894781 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:07.894649 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe" containerName="util" Apr 24 21:35:07.894781 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:07.894657 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe" containerName="util" Apr 24 21:35:07.894781 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:07.894735 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="88f060a1-1d44-4ab4-8f6e-60d1ac8cffbe" containerName="extract" Apr 24 21:35:07.897792 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:07.897775 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:07.932049 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:07.932022 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57dc596fcd-ql9q5"] Apr 24 21:35:08.004600 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.004542 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vppww\" (UniqueName: \"kubernetes.io/projected/74d79a05-452e-4127-b595-ba3ced488668-kube-api-access-vppww\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.004772 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.004607 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74d79a05-452e-4127-b595-ba3ced488668-console-serving-cert\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.004772 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.004634 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74d79a05-452e-4127-b595-ba3ced488668-oauth-serving-cert\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.004772 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.004654 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74d79a05-452e-4127-b595-ba3ced488668-console-oauth-config\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " 
pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.004772 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.004679 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74d79a05-452e-4127-b595-ba3ced488668-trusted-ca-bundle\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.004772 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.004757 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74d79a05-452e-4127-b595-ba3ced488668-service-ca\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.004950 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.004810 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74d79a05-452e-4127-b595-ba3ced488668-console-config\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.105690 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.105651 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74d79a05-452e-4127-b595-ba3ced488668-console-config\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.105861 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.105704 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vppww\" (UniqueName: 
\"kubernetes.io/projected/74d79a05-452e-4127-b595-ba3ced488668-kube-api-access-vppww\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.105861 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.105737 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74d79a05-452e-4127-b595-ba3ced488668-console-serving-cert\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.105861 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.105763 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74d79a05-452e-4127-b595-ba3ced488668-oauth-serving-cert\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.105861 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.105786 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74d79a05-452e-4127-b595-ba3ced488668-console-oauth-config\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.106063 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.106012 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74d79a05-452e-4127-b595-ba3ced488668-trusted-ca-bundle\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.106110 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.106068 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74d79a05-452e-4127-b595-ba3ced488668-service-ca\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.106563 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.106537 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74d79a05-452e-4127-b595-ba3ced488668-console-config\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.106717 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.106657 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74d79a05-452e-4127-b595-ba3ced488668-service-ca\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.106717 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.106704 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74d79a05-452e-4127-b595-ba3ced488668-oauth-serving-cert\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.107034 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.107015 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74d79a05-452e-4127-b595-ba3ced488668-trusted-ca-bundle\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.108196 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.108170 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74d79a05-452e-4127-b595-ba3ced488668-console-oauth-config\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.108270 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.108257 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74d79a05-452e-4127-b595-ba3ced488668-console-serving-cert\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.116962 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.116944 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vppww\" (UniqueName: \"kubernetes.io/projected/74d79a05-452e-4127-b595-ba3ced488668-kube-api-access-vppww\") pod \"console-57dc596fcd-ql9q5\" (UID: \"74d79a05-452e-4127-b595-ba3ced488668\") " pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.207029 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.206999 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:08.336867 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.336725 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57dc596fcd-ql9q5"] Apr 24 21:35:08.339408 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:35:08.339382 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74d79a05_452e_4127_b595_ba3ced488668.slice/crio-231ce8954aaac51c18d37cb8ae0de378d138e51947681b80b43556b1b4f70330 WatchSource:0}: Error finding container 231ce8954aaac51c18d37cb8ae0de378d138e51947681b80b43556b1b4f70330: Status 404 returned error can't find the container with id 231ce8954aaac51c18d37cb8ae0de378d138e51947681b80b43556b1b4f70330 Apr 24 21:35:08.888769 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.888733 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57dc596fcd-ql9q5" event={"ID":"74d79a05-452e-4127-b595-ba3ced488668","Type":"ContainerStarted","Data":"cd4520ee5e5543c3eef3d1ee8f2c5760c46a7c726ee685f6b6f12397c4f85211"} Apr 24 21:35:08.888769 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.888772 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57dc596fcd-ql9q5" event={"ID":"74d79a05-452e-4127-b595-ba3ced488668","Type":"ContainerStarted","Data":"231ce8954aaac51c18d37cb8ae0de378d138e51947681b80b43556b1b4f70330"} Apr 24 21:35:08.922506 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:08.922462 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-57dc596fcd-ql9q5" podStartSLOduration=1.922449316 podStartE2EDuration="1.922449316s" podCreationTimestamp="2026-04-24 21:35:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:35:08.921250343 +0000 UTC 
m=+447.019662855" watchObservedRunningTime="2026-04-24 21:35:08.922449316 +0000 UTC m=+447.020861828" Apr 24 21:35:18.207964 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:18.207925 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:18.207964 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:18.207974 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:18.212801 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:18.212778 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:18.925704 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:18.925673 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-57dc596fcd-ql9q5" Apr 24 21:35:18.987009 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:18.986970 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-854bc64795-4jdkf"] Apr 24 21:35:36.604726 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.604697 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg"] Apr 24 21:35:36.610657 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.608779 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.611859 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.611832 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 24 21:35:36.611984 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.611833 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 24 21:35:36.611984 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.611962 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-4cnnm\"" Apr 24 21:35:36.612200 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.612163 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 24 21:35:36.612200 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.612171 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 21:35:36.612419 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.612396 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 24 21:35:36.612493 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.612398 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 24 21:35:36.618825 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.618806 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg"] Apr 24 21:35:36.730728 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.730694 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.730906 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.730739 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.730906 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.730793 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.730906 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.730837 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.730906 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.730870 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: 
\"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.730906 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.730887 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2497\" (UniqueName: \"kubernetes.io/projected/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-kube-api-access-x2497\") pod \"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.731088 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.730946 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.831749 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.831710 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.831749 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.831754 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.831976 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.831783 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.831976 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.831810 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2497\" (UniqueName: \"kubernetes.io/projected/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-kube-api-access-x2497\") pod \"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.831976 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.831848 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.831976 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.831889 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.832204 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.831971 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-csr-ca-configmap\") pod 
\"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.832775 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.832744 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.834304 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.834273 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.834407 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.834336 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.834684 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.834664 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.834729 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.834694 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.841890 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.841846 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2497\" (UniqueName: \"kubernetes.io/projected/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-kube-api-access-x2497\") pod \"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.842454 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.842416 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-qkhkg\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:36.920575 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:36.920534 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:37.257199 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:37.257172 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg"] Apr 24 21:35:37.260105 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:35:37.260074 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89c3bdd4_d367_45d6_bbd4_9dd1af4ed827.slice/crio-f4c54daa7afeb081c5ee6c9df8ae52f1ce87f6f9fa3ab404234a134d452fbe1c WatchSource:0}: Error finding container f4c54daa7afeb081c5ee6c9df8ae52f1ce87f6f9fa3ab404234a134d452fbe1c: Status 404 returned error can't find the container with id f4c54daa7afeb081c5ee6c9df8ae52f1ce87f6f9fa3ab404234a134d452fbe1c Apr 24 21:35:37.987207 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:37.987163 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" event={"ID":"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827","Type":"ContainerStarted","Data":"f4c54daa7afeb081c5ee6c9df8ae52f1ce87f6f9fa3ab404234a134d452fbe1c"} Apr 24 21:35:39.801166 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:39.801130 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 24 21:35:39.801502 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:39.801195 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 24 21:35:39.996759 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:39.996723 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" 
event={"ID":"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827","Type":"ContainerStarted","Data":"e645665460b10f27a185e9a1f534b613a3057e4e335fe1121e2441b71daaca96"} Apr 24 21:35:39.996932 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:39.996782 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:40.021164 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:40.021108 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" podStartSLOduration=1.48261782 podStartE2EDuration="4.021091489s" podCreationTimestamp="2026-04-24 21:35:36 +0000 UTC" firstStartedPulling="2026-04-24 21:35:37.262433302 +0000 UTC m=+475.360845793" lastFinishedPulling="2026-04-24 21:35:39.800906968 +0000 UTC m=+477.899319462" observedRunningTime="2026-04-24 21:35:40.018987198 +0000 UTC m=+478.117399711" watchObservedRunningTime="2026-04-24 21:35:40.021091489 +0000 UTC m=+478.119504001" Apr 24 21:35:41.002694 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:41.002647 2567 patch_prober.go:28] interesting pod/istiod-openshift-gateway-7cd77c7ffd-qkhkg container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 24 21:35:41.003069 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:41.002707 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" podUID="89c3bdd4-d367-45d6-bbd4-9dd1af4ed827" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:35:44.001950 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.001922 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:35:44.006274 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.006245 
2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-854bc64795-4jdkf" podUID="b0d3d358-1cc0-4830-baca-df4d99001501" containerName="console" containerID="cri-o://f6d3adeea6b5159506cdf6846fe1b3ffcb50dc57724bbcc8d945207a1f7baa87" gracePeriod=15 Apr 24 21:35:44.243501 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.243477 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-854bc64795-4jdkf_b0d3d358-1cc0-4830-baca-df4d99001501/console/0.log" Apr 24 21:35:44.243652 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.243539 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-854bc64795-4jdkf" Apr 24 21:35:44.381906 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.379077 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d"] Apr 24 21:35:44.381906 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.379898 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0d3d358-1cc0-4830-baca-df4d99001501" containerName="console" Apr 24 21:35:44.381906 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.379919 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d3d358-1cc0-4830-baca-df4d99001501" containerName="console" Apr 24 21:35:44.381906 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.380119 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0d3d358-1cc0-4830-baca-df4d99001501" containerName="console" Apr 24 21:35:44.385555 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.385533 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.390208 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.390185 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-2kt25\"" Apr 24 21:35:44.402165 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.402147 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-trusted-ca-bundle\") pod \"b0d3d358-1cc0-4830-baca-df4d99001501\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " Apr 24 21:35:44.402250 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.402176 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdszh\" (UniqueName: \"kubernetes.io/projected/b0d3d358-1cc0-4830-baca-df4d99001501-kube-api-access-cdszh\") pod \"b0d3d358-1cc0-4830-baca-df4d99001501\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " Apr 24 21:35:44.402250 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.402194 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-console-config\") pod \"b0d3d358-1cc0-4830-baca-df4d99001501\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " Apr 24 21:35:44.402250 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.402227 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0d3d358-1cc0-4830-baca-df4d99001501-console-oauth-config\") pod \"b0d3d358-1cc0-4830-baca-df4d99001501\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " Apr 24 21:35:44.402385 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.402281 2567 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-oauth-serving-cert\") pod \"b0d3d358-1cc0-4830-baca-df4d99001501\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " Apr 24 21:35:44.402385 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.402334 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0d3d358-1cc0-4830-baca-df4d99001501-console-serving-cert\") pod \"b0d3d358-1cc0-4830-baca-df4d99001501\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " Apr 24 21:35:44.402385 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.402376 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-service-ca\") pod \"b0d3d358-1cc0-4830-baca-df4d99001501\" (UID: \"b0d3d358-1cc0-4830-baca-df4d99001501\") " Apr 24 21:35:44.402671 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.402647 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-console-config" (OuterVolumeSpecName: "console-config") pod "b0d3d358-1cc0-4830-baca-df4d99001501" (UID: "b0d3d358-1cc0-4830-baca-df4d99001501"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:35:44.402864 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.402673 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b0d3d358-1cc0-4830-baca-df4d99001501" (UID: "b0d3d358-1cc0-4830-baca-df4d99001501"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:35:44.402864 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.402813 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b0d3d358-1cc0-4830-baca-df4d99001501" (UID: "b0d3d358-1cc0-4830-baca-df4d99001501"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:35:44.402978 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.402931 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-service-ca" (OuterVolumeSpecName: "service-ca") pod "b0d3d358-1cc0-4830-baca-df4d99001501" (UID: "b0d3d358-1cc0-4830-baca-df4d99001501"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:35:44.404663 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.404639 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d3d358-1cc0-4830-baca-df4d99001501-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b0d3d358-1cc0-4830-baca-df4d99001501" (UID: "b0d3d358-1cc0-4830-baca-df4d99001501"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:35:44.404807 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.404757 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d3d358-1cc0-4830-baca-df4d99001501-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b0d3d358-1cc0-4830-baca-df4d99001501" (UID: "b0d3d358-1cc0-4830-baca-df4d99001501"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:35:44.404855 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.404823 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0d3d358-1cc0-4830-baca-df4d99001501-kube-api-access-cdszh" (OuterVolumeSpecName: "kube-api-access-cdszh") pod "b0d3d358-1cc0-4830-baca-df4d99001501" (UID: "b0d3d358-1cc0-4830-baca-df4d99001501"). InnerVolumeSpecName "kube-api-access-cdszh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:35:44.418150 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.418127 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d"] Apr 24 21:35:44.503306 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.503269 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-workload-certs\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.503306 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.503306 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-istio-data\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.503538 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.503334 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-istio-token\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.503538 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.503375 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-workload-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.503538 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.503397 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-credential-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.503538 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.503419 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb8lc\" (UniqueName: \"kubernetes.io/projected/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-kube-api-access-hb8lc\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.503538 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.503473 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: 
\"kubernetes.io/downward-api/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.503825 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.503570 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-istio-envoy\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.503825 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.503638 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.503825 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.503716 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0d3d358-1cc0-4830-baca-df4d99001501-console-oauth-config\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:35:44.503825 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.503734 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-oauth-serving-cert\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:35:44.503825 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.503751 2567 
reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0d3d358-1cc0-4830-baca-df4d99001501-console-serving-cert\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:35:44.503825 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.503768 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-service-ca\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:35:44.503825 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.503782 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-trusted-ca-bundle\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:35:44.503825 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.503797 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cdszh\" (UniqueName: \"kubernetes.io/projected/b0d3d358-1cc0-4830-baca-df4d99001501-kube-api-access-cdszh\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:35:44.503825 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.503811 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0d3d358-1cc0-4830-baca-df4d99001501-console-config\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:35:44.604771 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.604736 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-istio-envoy\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.604771 
ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.604771 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.605016 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.604803 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-workload-certs\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.605016 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.604922 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-istio-data\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.605016 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.604966 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-istio-token\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.605016 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.605010 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-workload-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.605215 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.605034 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-credential-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.605215 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.605061 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hb8lc\" (UniqueName: \"kubernetes.io/projected/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-kube-api-access-hb8lc\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.605215 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.605134 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.605361 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.605210 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-workload-certs\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.605361 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.605276 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-istio-data\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.605509 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.605485 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-workload-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.605601 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.605560 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.605755 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.605732 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-credential-socket\") pod 
\"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.607183 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.607164 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-istio-envoy\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.607452 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.607434 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.613714 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.613689 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb8lc\" (UniqueName: \"kubernetes.io/projected/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-kube-api-access-hb8lc\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.613867 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.613849 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2dd33b44-6aab-42e9-bdcf-ee18107e7f04-istio-token\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d\" (UID: \"2dd33b44-6aab-42e9-bdcf-ee18107e7f04\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.697109 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.697069 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:44.832522 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:44.832494 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d"] Apr 24 21:35:44.835053 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:35:44.835025 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dd33b44_6aab_42e9_bdcf_ee18107e7f04.slice/crio-e08b56b80b0c76f86bf2a19dca756ddb656dba279918160880a31203c0eebd57 WatchSource:0}: Error finding container e08b56b80b0c76f86bf2a19dca756ddb656dba279918160880a31203c0eebd57: Status 404 returned error can't find the container with id e08b56b80b0c76f86bf2a19dca756ddb656dba279918160880a31203c0eebd57 Apr 24 21:35:45.017742 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:45.017645 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" event={"ID":"2dd33b44-6aab-42e9-bdcf-ee18107e7f04","Type":"ContainerStarted","Data":"e08b56b80b0c76f86bf2a19dca756ddb656dba279918160880a31203c0eebd57"} Apr 24 21:35:45.019014 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:45.018989 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-854bc64795-4jdkf_b0d3d358-1cc0-4830-baca-df4d99001501/console/0.log" Apr 24 21:35:45.019144 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:45.019037 2567 generic.go:358] "Generic (PLEG): container finished" podID="b0d3d358-1cc0-4830-baca-df4d99001501" containerID="f6d3adeea6b5159506cdf6846fe1b3ffcb50dc57724bbcc8d945207a1f7baa87" exitCode=2 Apr 24 
21:35:45.019144 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:45.019094 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-854bc64795-4jdkf" event={"ID":"b0d3d358-1cc0-4830-baca-df4d99001501","Type":"ContainerDied","Data":"f6d3adeea6b5159506cdf6846fe1b3ffcb50dc57724bbcc8d945207a1f7baa87"} Apr 24 21:35:45.019144 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:45.019117 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-854bc64795-4jdkf" event={"ID":"b0d3d358-1cc0-4830-baca-df4d99001501","Type":"ContainerDied","Data":"57ccfd22a49c7a95b4eabefca8a2da7152b76f704a5409b76a4389d2a8e33148"} Apr 24 21:35:45.019144 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:45.019136 2567 scope.go:117] "RemoveContainer" containerID="f6d3adeea6b5159506cdf6846fe1b3ffcb50dc57724bbcc8d945207a1f7baa87" Apr 24 21:35:45.019367 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:45.019138 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-854bc64795-4jdkf" Apr 24 21:35:45.028684 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:45.028660 2567 scope.go:117] "RemoveContainer" containerID="f6d3adeea6b5159506cdf6846fe1b3ffcb50dc57724bbcc8d945207a1f7baa87" Apr 24 21:35:45.028921 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:35:45.028903 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6d3adeea6b5159506cdf6846fe1b3ffcb50dc57724bbcc8d945207a1f7baa87\": container with ID starting with f6d3adeea6b5159506cdf6846fe1b3ffcb50dc57724bbcc8d945207a1f7baa87 not found: ID does not exist" containerID="f6d3adeea6b5159506cdf6846fe1b3ffcb50dc57724bbcc8d945207a1f7baa87" Apr 24 21:35:45.028981 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:45.028928 2567 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f6d3adeea6b5159506cdf6846fe1b3ffcb50dc57724bbcc8d945207a1f7baa87"} err="failed to get container status \"f6d3adeea6b5159506cdf6846fe1b3ffcb50dc57724bbcc8d945207a1f7baa87\": rpc error: code = NotFound desc = could not find container \"f6d3adeea6b5159506cdf6846fe1b3ffcb50dc57724bbcc8d945207a1f7baa87\": container with ID starting with f6d3adeea6b5159506cdf6846fe1b3ffcb50dc57724bbcc8d945207a1f7baa87 not found: ID does not exist" Apr 24 21:35:45.043175 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:45.043150 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-854bc64795-4jdkf"] Apr 24 21:35:45.051050 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:45.051030 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-854bc64795-4jdkf"] Apr 24 21:35:45.224563 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:45.224531 2567 patch_prober.go:28] interesting pod/console-854bc64795-4jdkf container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.133.0.19:8443/health\": context deadline exceeded" start-of-body= Apr 24 21:35:45.224744 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:45.224601 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-854bc64795-4jdkf" podUID="b0d3d358-1cc0-4830-baca-df4d99001501" containerName="console" probeResult="failure" output="Get \"https://10.133.0.19:8443/health\": context deadline exceeded" Apr 24 21:35:46.420172 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:46.420133 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0d3d358-1cc0-4830-baca-df4d99001501" path="/var/lib/kubelet/pods/b0d3d358-1cc0-4830-baca-df4d99001501/volumes" Apr 24 21:35:47.541759 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:47.541721 2567 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 24 21:35:47.542005 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:47.541816 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 24 21:35:47.542005 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:47.541853 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 24 21:35:48.034034 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:48.033998 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" event={"ID":"2dd33b44-6aab-42e9-bdcf-ee18107e7f04","Type":"ContainerStarted","Data":"2fcd9a06bd8f0682efe392e894faa5869691419384439b7276561347175b1054"} Apr 24 21:35:48.064303 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:48.064247 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" podStartSLOduration=1.359815701 podStartE2EDuration="4.064230632s" podCreationTimestamp="2026-04-24 21:35:44 +0000 UTC" firstStartedPulling="2026-04-24 21:35:44.837053782 +0000 UTC m=+482.935466274" lastFinishedPulling="2026-04-24 21:35:47.541468694 +0000 UTC m=+485.639881205" observedRunningTime="2026-04-24 21:35:48.060523456 +0000 UTC m=+486.158935969" watchObservedRunningTime="2026-04-24 21:35:48.064230632 +0000 UTC m=+486.162643146" Apr 24 21:35:48.698186 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:48.698156 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:48.699613 ip-10-0-142-242 
kubenswrapper[2567]: I0424 21:35:48.699571 2567 patch_prober.go:28] interesting pod/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.34:15021/healthz/ready\": dial tcp 10.133.0.34:15021: connect: connection refused" start-of-body= Apr 24 21:35:48.699734 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:48.699634 2567 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" podUID="2dd33b44-6aab-42e9-bdcf-ee18107e7f04" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.34:15021/healthz/ready\": dial tcp 10.133.0.34:15021: connect: connection refused" Apr 24 21:35:49.700965 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:49.700935 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:50.040538 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:50.040458 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:50.041426 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:50.041411 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d" Apr 24 21:35:52.294334 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.294297 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff"] Apr 24 21:35:52.298099 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.298076 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff" Apr 24 21:35:52.300279 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.300256 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-58qm5\"" Apr 24 21:35:52.300396 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.300298 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 21:35:52.301058 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.301040 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 21:35:52.305601 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.305567 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff"] Apr 24 21:35:52.375129 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.375096 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3db84943-4ae7-431d-8fce-14be8a082e9e-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff\" (UID: \"3db84943-4ae7-431d-8fce-14be8a082e9e\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff" Apr 24 21:35:52.375288 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.375165 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7jxs\" (UniqueName: \"kubernetes.io/projected/3db84943-4ae7-431d-8fce-14be8a082e9e-kube-api-access-g7jxs\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff\" (UID: \"3db84943-4ae7-431d-8fce-14be8a082e9e\") " 
pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff" Apr 24 21:35:52.375288 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.375249 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3db84943-4ae7-431d-8fce-14be8a082e9e-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff\" (UID: \"3db84943-4ae7-431d-8fce-14be8a082e9e\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff" Apr 24 21:35:52.393911 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.393878 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5"] Apr 24 21:35:52.397291 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.397273 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5" Apr 24 21:35:52.404982 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.404962 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5"] Apr 24 21:35:52.475787 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.475753 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3db84943-4ae7-431d-8fce-14be8a082e9e-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff\" (UID: \"3db84943-4ae7-431d-8fce-14be8a082e9e\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff" Apr 24 21:35:52.475787 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.475791 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/3db84943-4ae7-431d-8fce-14be8a082e9e-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff\" (UID: \"3db84943-4ae7-431d-8fce-14be8a082e9e\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff"
Apr 24 21:35:52.476018 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.475811 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22efcd6c-cc94-448c-9b22-7617f762b5bf-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5\" (UID: \"22efcd6c-cc94-448c-9b22-7617f762b5bf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5"
Apr 24 21:35:52.476018 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.475834 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wrps\" (UniqueName: \"kubernetes.io/projected/22efcd6c-cc94-448c-9b22-7617f762b5bf-kube-api-access-8wrps\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5\" (UID: \"22efcd6c-cc94-448c-9b22-7617f762b5bf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5"
Apr 24 21:35:52.476018 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.475853 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7jxs\" (UniqueName: \"kubernetes.io/projected/3db84943-4ae7-431d-8fce-14be8a082e9e-kube-api-access-g7jxs\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff\" (UID: \"3db84943-4ae7-431d-8fce-14be8a082e9e\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff"
Apr 24 21:35:52.476018 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.475892 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22efcd6c-cc94-448c-9b22-7617f762b5bf-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5\" (UID: \"22efcd6c-cc94-448c-9b22-7617f762b5bf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5"
Apr 24 21:35:52.476385 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.476225 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3db84943-4ae7-431d-8fce-14be8a082e9e-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff\" (UID: \"3db84943-4ae7-431d-8fce-14be8a082e9e\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff"
Apr 24 21:35:52.476385 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.476361 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3db84943-4ae7-431d-8fce-14be8a082e9e-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff\" (UID: \"3db84943-4ae7-431d-8fce-14be8a082e9e\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff"
Apr 24 21:35:52.488925 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.488900 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7jxs\" (UniqueName: \"kubernetes.io/projected/3db84943-4ae7-431d-8fce-14be8a082e9e-kube-api-access-g7jxs\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff\" (UID: \"3db84943-4ae7-431d-8fce-14be8a082e9e\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff"
Apr 24 21:35:52.495134 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.495111 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z"]
Apr 24 21:35:52.499099 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.499079 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z"
Apr 24 21:35:52.506680 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.506650 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z"]
Apr 24 21:35:52.576985 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.576905 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22efcd6c-cc94-448c-9b22-7617f762b5bf-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5\" (UID: \"22efcd6c-cc94-448c-9b22-7617f762b5bf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5"
Apr 24 21:35:52.576985 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.576946 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wrps\" (UniqueName: \"kubernetes.io/projected/22efcd6c-cc94-448c-9b22-7617f762b5bf-kube-api-access-8wrps\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5\" (UID: \"22efcd6c-cc94-448c-9b22-7617f762b5bf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5"
Apr 24 21:35:52.576985 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.576977 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/208b118a-a5a4-47fc-bfae-6f2c53aaf505-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z\" (UID: \"208b118a-a5a4-47fc-bfae-6f2c53aaf505\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z"
Apr 24 21:35:52.577218 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.577009 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22efcd6c-cc94-448c-9b22-7617f762b5bf-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5\" (UID: \"22efcd6c-cc94-448c-9b22-7617f762b5bf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5"
Apr 24 21:35:52.577218 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.577053 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/208b118a-a5a4-47fc-bfae-6f2c53aaf505-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z\" (UID: \"208b118a-a5a4-47fc-bfae-6f2c53aaf505\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z"
Apr 24 21:35:52.577218 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.577128 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8wd7\" (UniqueName: \"kubernetes.io/projected/208b118a-a5a4-47fc-bfae-6f2c53aaf505-kube-api-access-h8wd7\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z\" (UID: \"208b118a-a5a4-47fc-bfae-6f2c53aaf505\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z"
Apr 24 21:35:52.577357 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.577306 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22efcd6c-cc94-448c-9b22-7617f762b5bf-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5\" (UID: \"22efcd6c-cc94-448c-9b22-7617f762b5bf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5"
Apr 24 21:35:52.577404 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.577386 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22efcd6c-cc94-448c-9b22-7617f762b5bf-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5\" (UID: \"22efcd6c-cc94-448c-9b22-7617f762b5bf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5"
Apr 24 21:35:52.589358 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.589333 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wrps\" (UniqueName: \"kubernetes.io/projected/22efcd6c-cc94-448c-9b22-7617f762b5bf-kube-api-access-8wrps\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5\" (UID: \"22efcd6c-cc94-448c-9b22-7617f762b5bf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5"
Apr 24 21:35:52.594708 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.594682 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w"]
Apr 24 21:35:52.598356 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.598340 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w"
Apr 24 21:35:52.606468 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.606444 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w"]
Apr 24 21:35:52.607464 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.607448 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff"
Apr 24 21:35:52.677860 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.677816 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h94g\" (UniqueName: \"kubernetes.io/projected/a4f9d047-270a-45eb-91b0-aa724c135e67-kube-api-access-7h94g\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w\" (UID: \"a4f9d047-270a-45eb-91b0-aa724c135e67\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w"
Apr 24 21:35:52.678005 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.677878 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4f9d047-270a-45eb-91b0-aa724c135e67-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w\" (UID: \"a4f9d047-270a-45eb-91b0-aa724c135e67\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w"
Apr 24 21:35:52.678005 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.677914 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4f9d047-270a-45eb-91b0-aa724c135e67-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w\" (UID: \"a4f9d047-270a-45eb-91b0-aa724c135e67\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w"
Apr 24 21:35:52.678005 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.677980 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/208b118a-a5a4-47fc-bfae-6f2c53aaf505-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z\" (UID: \"208b118a-a5a4-47fc-bfae-6f2c53aaf505\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z"
Apr 24 21:35:52.678135 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.678034 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/208b118a-a5a4-47fc-bfae-6f2c53aaf505-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z\" (UID: \"208b118a-a5a4-47fc-bfae-6f2c53aaf505\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z"
Apr 24 21:35:52.678135 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.678090 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8wd7\" (UniqueName: \"kubernetes.io/projected/208b118a-a5a4-47fc-bfae-6f2c53aaf505-kube-api-access-h8wd7\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z\" (UID: \"208b118a-a5a4-47fc-bfae-6f2c53aaf505\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z"
Apr 24 21:35:52.678368 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.678341 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/208b118a-a5a4-47fc-bfae-6f2c53aaf505-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z\" (UID: \"208b118a-a5a4-47fc-bfae-6f2c53aaf505\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z"
Apr 24 21:35:52.678368 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.678358 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/208b118a-a5a4-47fc-bfae-6f2c53aaf505-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z\" (UID: \"208b118a-a5a4-47fc-bfae-6f2c53aaf505\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z"
Apr 24 21:35:52.686871 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.686827 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8wd7\" (UniqueName: \"kubernetes.io/projected/208b118a-a5a4-47fc-bfae-6f2c53aaf505-kube-api-access-h8wd7\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z\" (UID: \"208b118a-a5a4-47fc-bfae-6f2c53aaf505\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z"
Apr 24 21:35:52.707378 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.707348 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5"
Apr 24 21:35:52.779013 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.778983 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4f9d047-270a-45eb-91b0-aa724c135e67-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w\" (UID: \"a4f9d047-270a-45eb-91b0-aa724c135e67\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w"
Apr 24 21:35:52.779148 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.779102 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7h94g\" (UniqueName: \"kubernetes.io/projected/a4f9d047-270a-45eb-91b0-aa724c135e67-kube-api-access-7h94g\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w\" (UID: \"a4f9d047-270a-45eb-91b0-aa724c135e67\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w"
Apr 24 21:35:52.779199 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.779144 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4f9d047-270a-45eb-91b0-aa724c135e67-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w\" (UID: \"a4f9d047-270a-45eb-91b0-aa724c135e67\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w"
Apr 24 21:35:52.779550 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.779449 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4f9d047-270a-45eb-91b0-aa724c135e67-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w\" (UID: \"a4f9d047-270a-45eb-91b0-aa724c135e67\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w"
Apr 24 21:35:52.779550 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.779490 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4f9d047-270a-45eb-91b0-aa724c135e67-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w\" (UID: \"a4f9d047-270a-45eb-91b0-aa724c135e67\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w"
Apr 24 21:35:52.788375 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.788347 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h94g\" (UniqueName: \"kubernetes.io/projected/a4f9d047-270a-45eb-91b0-aa724c135e67-kube-api-access-7h94g\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w\" (UID: \"a4f9d047-270a-45eb-91b0-aa724c135e67\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w"
Apr 24 21:35:52.809031 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.809003 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z"
Apr 24 21:35:52.825689 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.825665 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5"]
Apr 24 21:35:52.828295 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:35:52.828230 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22efcd6c_cc94_448c_9b22_7617f762b5bf.slice/crio-f9c688cd3444117be572e1aa82c74467a222cf4e7298c31b9a082a6a14c363c8 WatchSource:0}: Error finding container f9c688cd3444117be572e1aa82c74467a222cf4e7298c31b9a082a6a14c363c8: Status 404 returned error can't find the container with id f9c688cd3444117be572e1aa82c74467a222cf4e7298c31b9a082a6a14c363c8
Apr 24 21:35:52.908393 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.908369 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w"
Apr 24 21:35:52.936166 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.936137 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z"]
Apr 24 21:35:52.939030 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:35:52.938993 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod208b118a_a5a4_47fc_bfae_6f2c53aaf505.slice/crio-bac80fab82ef3bf62615fa42b105778ca8ed902499a9e76af3ef7456965cbca8 WatchSource:0}: Error finding container bac80fab82ef3bf62615fa42b105778ca8ed902499a9e76af3ef7456965cbca8: Status 404 returned error can't find the container with id bac80fab82ef3bf62615fa42b105778ca8ed902499a9e76af3ef7456965cbca8
Apr 24 21:35:52.940776 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:52.940752 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff"]
Apr 24 21:35:52.942386 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:35:52.942349 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3db84943_4ae7_431d_8fce_14be8a082e9e.slice/crio-80ffde35100dee30804d6f41eb406f49dc22393d21a2464b7d2e910fda285ded WatchSource:0}: Error finding container 80ffde35100dee30804d6f41eb406f49dc22393d21a2464b7d2e910fda285ded: Status 404 returned error can't find the container with id 80ffde35100dee30804d6f41eb406f49dc22393d21a2464b7d2e910fda285ded
Apr 24 21:35:53.040250 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:53.040220 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w"]
Apr 24 21:35:53.041823 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:35:53.041790 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4f9d047_270a_45eb_91b0_aa724c135e67.slice/crio-d3b18a230dff179cefde238aa2d5d4a4e89181060979e1bfe2afefbf2b75ee9b WatchSource:0}: Error finding container d3b18a230dff179cefde238aa2d5d4a4e89181060979e1bfe2afefbf2b75ee9b: Status 404 returned error can't find the container with id d3b18a230dff179cefde238aa2d5d4a4e89181060979e1bfe2afefbf2b75ee9b
Apr 24 21:35:53.056422 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:53.056364 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff" event={"ID":"3db84943-4ae7-431d-8fce-14be8a082e9e","Type":"ContainerStarted","Data":"0a8ecd7c35ee257e6db1ddb7e72903cb640081493f764f521c688dfd7ee9ee68"}
Apr 24 21:35:53.056422 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:53.056402 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff" event={"ID":"3db84943-4ae7-431d-8fce-14be8a082e9e","Type":"ContainerStarted","Data":"80ffde35100dee30804d6f41eb406f49dc22393d21a2464b7d2e910fda285ded"}
Apr 24 21:35:53.057550 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:53.057527 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w" event={"ID":"a4f9d047-270a-45eb-91b0-aa724c135e67","Type":"ContainerStarted","Data":"d3b18a230dff179cefde238aa2d5d4a4e89181060979e1bfe2afefbf2b75ee9b"}
Apr 24 21:35:53.058921 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:53.058897 2567 generic.go:358] "Generic (PLEG): container finished" podID="22efcd6c-cc94-448c-9b22-7617f762b5bf" containerID="3ba85392815ad16ebd1c9b9a5b6d2ce084ec736257dab6206d889764c873bfaa" exitCode=0
Apr 24 21:35:53.058998 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:53.058944 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5" event={"ID":"22efcd6c-cc94-448c-9b22-7617f762b5bf","Type":"ContainerDied","Data":"3ba85392815ad16ebd1c9b9a5b6d2ce084ec736257dab6206d889764c873bfaa"}
Apr 24 21:35:53.058998 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:53.058978 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5" event={"ID":"22efcd6c-cc94-448c-9b22-7617f762b5bf","Type":"ContainerStarted","Data":"f9c688cd3444117be572e1aa82c74467a222cf4e7298c31b9a082a6a14c363c8"}
Apr 24 21:35:53.060806 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:53.060740 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z" event={"ID":"208b118a-a5a4-47fc-bfae-6f2c53aaf505","Type":"ContainerStarted","Data":"2a5704113f1848fc17a2ad769bbef6fdcf2211089eac3df71f69922a4aea34ab"}
Apr 24 21:35:53.060806 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:53.060766 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z" event={"ID":"208b118a-a5a4-47fc-bfae-6f2c53aaf505","Type":"ContainerStarted","Data":"bac80fab82ef3bf62615fa42b105778ca8ed902499a9e76af3ef7456965cbca8"}
Apr 24 21:35:54.065690 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:54.065576 2567 generic.go:358] "Generic (PLEG): container finished" podID="3db84943-4ae7-431d-8fce-14be8a082e9e" containerID="0a8ecd7c35ee257e6db1ddb7e72903cb640081493f764f521c688dfd7ee9ee68" exitCode=0
Apr 24 21:35:54.066089 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:54.065682 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff" event={"ID":"3db84943-4ae7-431d-8fce-14be8a082e9e","Type":"ContainerDied","Data":"0a8ecd7c35ee257e6db1ddb7e72903cb640081493f764f521c688dfd7ee9ee68"}
Apr 24 21:35:54.067124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:54.067092 2567 generic.go:358] "Generic (PLEG): container finished" podID="a4f9d047-270a-45eb-91b0-aa724c135e67" containerID="eb8b0ceb81a8cb841f9712870b321f8fa393ca6f67e4a26dc1fa4696cd31d798" exitCode=0
Apr 24 21:35:54.067194 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:54.067161 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w" event={"ID":"a4f9d047-270a-45eb-91b0-aa724c135e67","Type":"ContainerDied","Data":"eb8b0ceb81a8cb841f9712870b321f8fa393ca6f67e4a26dc1fa4696cd31d798"}
Apr 24 21:35:54.068837 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:54.068812 2567 generic.go:358] "Generic (PLEG): container finished" podID="22efcd6c-cc94-448c-9b22-7617f762b5bf" containerID="9f1d009297b200e3518d574216ca23dd94e5af4efe8b3ce83725ad45adaaca7f" exitCode=0
Apr 24 21:35:54.068962 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:54.068896 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5" event={"ID":"22efcd6c-cc94-448c-9b22-7617f762b5bf","Type":"ContainerDied","Data":"9f1d009297b200e3518d574216ca23dd94e5af4efe8b3ce83725ad45adaaca7f"}
Apr 24 21:35:54.070374 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:54.070350 2567 generic.go:358] "Generic (PLEG): container finished" podID="208b118a-a5a4-47fc-bfae-6f2c53aaf505" containerID="2a5704113f1848fc17a2ad769bbef6fdcf2211089eac3df71f69922a4aea34ab" exitCode=0
Apr 24 21:35:54.070469 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:54.070377 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z" event={"ID":"208b118a-a5a4-47fc-bfae-6f2c53aaf505","Type":"ContainerDied","Data":"2a5704113f1848fc17a2ad769bbef6fdcf2211089eac3df71f69922a4aea34ab"}
Apr 24 21:35:55.075767 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:55.075725 2567 generic.go:358] "Generic (PLEG): container finished" podID="3db84943-4ae7-431d-8fce-14be8a082e9e" containerID="aadbe67c55239a88a4fbdd738195214712274997bb2ebaf649ddbcbe8586d755" exitCode=0
Apr 24 21:35:55.076192 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:55.075815 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff" event={"ID":"3db84943-4ae7-431d-8fce-14be8a082e9e","Type":"ContainerDied","Data":"aadbe67c55239a88a4fbdd738195214712274997bb2ebaf649ddbcbe8586d755"}
Apr 24 21:35:55.077414 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:55.077395 2567 generic.go:358] "Generic (PLEG): container finished" podID="a4f9d047-270a-45eb-91b0-aa724c135e67" containerID="3943cd68abb5bace9ed926b4bce1cf8140e9f34e51dcd37c47f05edc5df148b8" exitCode=0
Apr 24 21:35:55.077486 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:55.077468 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w" event={"ID":"a4f9d047-270a-45eb-91b0-aa724c135e67","Type":"ContainerDied","Data":"3943cd68abb5bace9ed926b4bce1cf8140e9f34e51dcd37c47f05edc5df148b8"}
Apr 24 21:35:55.079334 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:55.079318 2567 generic.go:358] "Generic (PLEG): container finished" podID="22efcd6c-cc94-448c-9b22-7617f762b5bf" containerID="04de209b583562f8088dc4d4b0d98d1231e23f2d05513349405b49e4df83cd47" exitCode=0
Apr 24 21:35:55.079422 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:55.079382 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5" event={"ID":"22efcd6c-cc94-448c-9b22-7617f762b5bf","Type":"ContainerDied","Data":"04de209b583562f8088dc4d4b0d98d1231e23f2d05513349405b49e4df83cd47"}
Apr 24 21:35:55.081099 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:55.081077 2567 generic.go:358] "Generic (PLEG): container finished" podID="208b118a-a5a4-47fc-bfae-6f2c53aaf505" containerID="6a30aa6b47175cecd07912ea9dd6fed947a307744e1f3f1e23e1acd2afbbc363" exitCode=0
Apr 24 21:35:55.081186 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:55.081133 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z" event={"ID":"208b118a-a5a4-47fc-bfae-6f2c53aaf505","Type":"ContainerDied","Data":"6a30aa6b47175cecd07912ea9dd6fed947a307744e1f3f1e23e1acd2afbbc363"}
Apr 24 21:35:56.086324 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:56.086292 2567 generic.go:358] "Generic (PLEG): container finished" podID="a4f9d047-270a-45eb-91b0-aa724c135e67" containerID="240a1cc8cf79e5ee8b17eb68d023286b95eac1f8374aa40ab48860d8b7dfb710" exitCode=0
Apr 24 21:35:56.086739 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:56.086372 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w" event={"ID":"a4f9d047-270a-45eb-91b0-aa724c135e67","Type":"ContainerDied","Data":"240a1cc8cf79e5ee8b17eb68d023286b95eac1f8374aa40ab48860d8b7dfb710"}
Apr 24 21:35:56.088241 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:56.088219 2567 generic.go:358] "Generic (PLEG): container finished" podID="208b118a-a5a4-47fc-bfae-6f2c53aaf505" containerID="b936dc587008482cef0d641b019af32cc4f2a93f1712f31c9d81b119c7f111b1" exitCode=0
Apr 24 21:35:56.088351 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:56.088298 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z" event={"ID":"208b118a-a5a4-47fc-bfae-6f2c53aaf505","Type":"ContainerDied","Data":"b936dc587008482cef0d641b019af32cc4f2a93f1712f31c9d81b119c7f111b1"}
Apr 24 21:35:56.089982 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:56.089960 2567 generic.go:358] "Generic (PLEG): container finished" podID="3db84943-4ae7-431d-8fce-14be8a082e9e" containerID="982790e83715ded1b2e73d6719f6e8baca1197de1554e9fb45c3c2bf1d8e58b8" exitCode=0
Apr 24 21:35:56.090135 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:56.090033 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff" event={"ID":"3db84943-4ae7-431d-8fce-14be8a082e9e","Type":"ContainerDied","Data":"982790e83715ded1b2e73d6719f6e8baca1197de1554e9fb45c3c2bf1d8e58b8"}
Apr 24 21:35:56.213039 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:56.213014 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5"
Apr 24 21:35:56.310219 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:56.310191 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wrps\" (UniqueName: \"kubernetes.io/projected/22efcd6c-cc94-448c-9b22-7617f762b5bf-kube-api-access-8wrps\") pod \"22efcd6c-cc94-448c-9b22-7617f762b5bf\" (UID: \"22efcd6c-cc94-448c-9b22-7617f762b5bf\") "
Apr 24 21:35:56.310365 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:56.310240 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22efcd6c-cc94-448c-9b22-7617f762b5bf-util\") pod \"22efcd6c-cc94-448c-9b22-7617f762b5bf\" (UID: \"22efcd6c-cc94-448c-9b22-7617f762b5bf\") "
Apr 24 21:35:56.310365 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:56.310259 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22efcd6c-cc94-448c-9b22-7617f762b5bf-bundle\") pod \"22efcd6c-cc94-448c-9b22-7617f762b5bf\" (UID: \"22efcd6c-cc94-448c-9b22-7617f762b5bf\") "
Apr 24 21:35:56.310739 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:56.310712 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22efcd6c-cc94-448c-9b22-7617f762b5bf-bundle" (OuterVolumeSpecName: "bundle") pod "22efcd6c-cc94-448c-9b22-7617f762b5bf" (UID: "22efcd6c-cc94-448c-9b22-7617f762b5bf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:35:56.312363 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:56.312338 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22efcd6c-cc94-448c-9b22-7617f762b5bf-kube-api-access-8wrps" (OuterVolumeSpecName: "kube-api-access-8wrps") pod "22efcd6c-cc94-448c-9b22-7617f762b5bf" (UID: "22efcd6c-cc94-448c-9b22-7617f762b5bf"). InnerVolumeSpecName "kube-api-access-8wrps". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:35:56.315234 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:56.315213 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22efcd6c-cc94-448c-9b22-7617f762b5bf-util" (OuterVolumeSpecName: "util") pod "22efcd6c-cc94-448c-9b22-7617f762b5bf" (UID: "22efcd6c-cc94-448c-9b22-7617f762b5bf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:35:56.410864 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:56.410837 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22efcd6c-cc94-448c-9b22-7617f762b5bf-util\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:35:56.410987 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:56.410885 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22efcd6c-cc94-448c-9b22-7617f762b5bf-bundle\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:35:56.410987 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:56.410895 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8wrps\" (UniqueName: \"kubernetes.io/projected/22efcd6c-cc94-448c-9b22-7617f762b5bf-kube-api-access-8wrps\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:35:57.095474 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.095415 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5"
Apr 24 21:35:57.095474 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.095447 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bjqft5" event={"ID":"22efcd6c-cc94-448c-9b22-7617f762b5bf","Type":"ContainerDied","Data":"f9c688cd3444117be572e1aa82c74467a222cf4e7298c31b9a082a6a14c363c8"}
Apr 24 21:35:57.095474 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.095479 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9c688cd3444117be572e1aa82c74467a222cf4e7298c31b9a082a6a14c363c8"
Apr 24 21:35:57.234767 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.234739 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z"
Apr 24 21:35:57.281187 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.281164 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff"
Apr 24 21:35:57.284219 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.284201 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w"
Apr 24 21:35:57.318276 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.318249 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/208b118a-a5a4-47fc-bfae-6f2c53aaf505-bundle\") pod \"208b118a-a5a4-47fc-bfae-6f2c53aaf505\" (UID: \"208b118a-a5a4-47fc-bfae-6f2c53aaf505\") "
Apr 24 21:35:57.318416 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.318288 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4f9d047-270a-45eb-91b0-aa724c135e67-util\") pod \"a4f9d047-270a-45eb-91b0-aa724c135e67\" (UID: \"a4f9d047-270a-45eb-91b0-aa724c135e67\") "
Apr 24 21:35:57.318416 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.318313 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3db84943-4ae7-431d-8fce-14be8a082e9e-util\") pod \"3db84943-4ae7-431d-8fce-14be8a082e9e\" (UID: \"3db84943-4ae7-431d-8fce-14be8a082e9e\") "
Apr 24 21:35:57.318416 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.318327 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4f9d047-270a-45eb-91b0-aa724c135e67-bundle\") pod \"a4f9d047-270a-45eb-91b0-aa724c135e67\" (UID: \"a4f9d047-270a-45eb-91b0-aa724c135e67\") "
Apr 24 21:35:57.318416 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.318377 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3db84943-4ae7-431d-8fce-14be8a082e9e-bundle\") pod \"3db84943-4ae7-431d-8fce-14be8a082e9e\" (UID: \"3db84943-4ae7-431d-8fce-14be8a082e9e\") "
Apr 24 21:35:57.318416 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.318395 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/208b118a-a5a4-47fc-bfae-6f2c53aaf505-util\") pod \"208b118a-a5a4-47fc-bfae-6f2c53aaf505\" (UID: \"208b118a-a5a4-47fc-bfae-6f2c53aaf505\") "
Apr 24 21:35:57.318684 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.318427 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h94g\" (UniqueName: \"kubernetes.io/projected/a4f9d047-270a-45eb-91b0-aa724c135e67-kube-api-access-7h94g\") pod \"a4f9d047-270a-45eb-91b0-aa724c135e67\" (UID: \"a4f9d047-270a-45eb-91b0-aa724c135e67\") "
Apr 24 21:35:57.318684 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.318453 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7jxs\" (UniqueName: \"kubernetes.io/projected/3db84943-4ae7-431d-8fce-14be8a082e9e-kube-api-access-g7jxs\") pod \"3db84943-4ae7-431d-8fce-14be8a082e9e\" (UID: \"3db84943-4ae7-431d-8fce-14be8a082e9e\") "
Apr 24 21:35:57.318684 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.318505 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8wd7\" (UniqueName: \"kubernetes.io/projected/208b118a-a5a4-47fc-bfae-6f2c53aaf505-kube-api-access-h8wd7\") pod \"208b118a-a5a4-47fc-bfae-6f2c53aaf505\" (UID: \"208b118a-a5a4-47fc-bfae-6f2c53aaf505\") "
Apr 24 21:35:57.319396 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.319363 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/208b118a-a5a4-47fc-bfae-6f2c53aaf505-bundle" (OuterVolumeSpecName: "bundle") pod "208b118a-a5a4-47fc-bfae-6f2c53aaf505" (UID: "208b118a-a5a4-47fc-bfae-6f2c53aaf505"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:35:57.319731 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.319683 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3db84943-4ae7-431d-8fce-14be8a082e9e-bundle" (OuterVolumeSpecName: "bundle") pod "3db84943-4ae7-431d-8fce-14be8a082e9e" (UID: "3db84943-4ae7-431d-8fce-14be8a082e9e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:35:57.320996 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.320814 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4f9d047-270a-45eb-91b0-aa724c135e67-bundle" (OuterVolumeSpecName: "bundle") pod "a4f9d047-270a-45eb-91b0-aa724c135e67" (UID: "a4f9d047-270a-45eb-91b0-aa724c135e67"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:35:57.327338 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.327311 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4f9d047-270a-45eb-91b0-aa724c135e67-util" (OuterVolumeSpecName: "util") pod "a4f9d047-270a-45eb-91b0-aa724c135e67" (UID: "a4f9d047-270a-45eb-91b0-aa724c135e67"). InnerVolumeSpecName "util".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:35:57.327882 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.327745 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3db84943-4ae7-431d-8fce-14be8a082e9e-util" (OuterVolumeSpecName: "util") pod "3db84943-4ae7-431d-8fce-14be8a082e9e" (UID: "3db84943-4ae7-431d-8fce-14be8a082e9e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:35:57.327882 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.327822 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db84943-4ae7-431d-8fce-14be8a082e9e-kube-api-access-g7jxs" (OuterVolumeSpecName: "kube-api-access-g7jxs") pod "3db84943-4ae7-431d-8fce-14be8a082e9e" (UID: "3db84943-4ae7-431d-8fce-14be8a082e9e"). InnerVolumeSpecName "kube-api-access-g7jxs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:35:57.327882 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.327836 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/208b118a-a5a4-47fc-bfae-6f2c53aaf505-kube-api-access-h8wd7" (OuterVolumeSpecName: "kube-api-access-h8wd7") pod "208b118a-a5a4-47fc-bfae-6f2c53aaf505" (UID: "208b118a-a5a4-47fc-bfae-6f2c53aaf505"). InnerVolumeSpecName "kube-api-access-h8wd7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:35:57.327882 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.327851 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f9d047-270a-45eb-91b0-aa724c135e67-kube-api-access-7h94g" (OuterVolumeSpecName: "kube-api-access-7h94g") pod "a4f9d047-270a-45eb-91b0-aa724c135e67" (UID: "a4f9d047-270a-45eb-91b0-aa724c135e67"). InnerVolumeSpecName "kube-api-access-7h94g". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:35:57.331196 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.331171 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/208b118a-a5a4-47fc-bfae-6f2c53aaf505-util" (OuterVolumeSpecName: "util") pod "208b118a-a5a4-47fc-bfae-6f2c53aaf505" (UID: "208b118a-a5a4-47fc-bfae-6f2c53aaf505"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:35:57.419557 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.419522 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3db84943-4ae7-431d-8fce-14be8a082e9e-bundle\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:35:57.419557 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.419554 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/208b118a-a5a4-47fc-bfae-6f2c53aaf505-util\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:35:57.419766 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.419565 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7h94g\" (UniqueName: \"kubernetes.io/projected/a4f9d047-270a-45eb-91b0-aa724c135e67-kube-api-access-7h94g\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:35:57.419766 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.419602 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g7jxs\" (UniqueName: \"kubernetes.io/projected/3db84943-4ae7-431d-8fce-14be8a082e9e-kube-api-access-g7jxs\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:35:57.419766 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.419611 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h8wd7\" (UniqueName: \"kubernetes.io/projected/208b118a-a5a4-47fc-bfae-6f2c53aaf505-kube-api-access-h8wd7\") on 
node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:35:57.419766 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.419621 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/208b118a-a5a4-47fc-bfae-6f2c53aaf505-bundle\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:35:57.419766 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.419629 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4f9d047-270a-45eb-91b0-aa724c135e67-util\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:35:57.419766 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.419636 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3db84943-4ae7-431d-8fce-14be8a082e9e-util\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:35:57.419766 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:57.419645 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4f9d047-270a-45eb-91b0-aa724c135e67-bundle\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:35:58.100809 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:58.100778 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff" Apr 24 21:35:58.101205 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:58.100782 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503gfhff" event={"ID":"3db84943-4ae7-431d-8fce-14be8a082e9e","Type":"ContainerDied","Data":"80ffde35100dee30804d6f41eb406f49dc22393d21a2464b7d2e910fda285ded"} Apr 24 21:35:58.101205 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:58.100885 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80ffde35100dee30804d6f41eb406f49dc22393d21a2464b7d2e910fda285ded" Apr 24 21:35:58.102557 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:58.102533 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w" event={"ID":"a4f9d047-270a-45eb-91b0-aa724c135e67","Type":"ContainerDied","Data":"d3b18a230dff179cefde238aa2d5d4a4e89181060979e1bfe2afefbf2b75ee9b"} Apr 24 21:35:58.102699 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:58.102562 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3b18a230dff179cefde238aa2d5d4a4e89181060979e1bfe2afefbf2b75ee9b" Apr 24 21:35:58.102699 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:58.102537 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88hwq6w" Apr 24 21:35:58.104291 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:58.104268 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z" event={"ID":"208b118a-a5a4-47fc-bfae-6f2c53aaf505","Type":"ContainerDied","Data":"bac80fab82ef3bf62615fa42b105778ca8ed902499a9e76af3ef7456965cbca8"} Apr 24 21:35:58.104291 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:58.104290 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304wq4z" Apr 24 21:35:58.104470 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:35:58.104291 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bac80fab82ef3bf62615fa42b105778ca8ed902499a9e76af3ef7456965cbca8" Apr 24 21:36:03.476693 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.476661 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-6kpsg"] Apr 24 21:36:03.477058 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.476955 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22efcd6c-cc94-448c-9b22-7617f762b5bf" containerName="util" Apr 24 21:36:03.477058 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.476966 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="22efcd6c-cc94-448c-9b22-7617f762b5bf" containerName="util" Apr 24 21:36:03.477058 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.476975 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4f9d047-270a-45eb-91b0-aa724c135e67" containerName="pull" Apr 24 21:36:03.477058 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.476980 2567 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a4f9d047-270a-45eb-91b0-aa724c135e67" containerName="pull" Apr 24 21:36:03.477058 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.476987 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4f9d047-270a-45eb-91b0-aa724c135e67" containerName="util" Apr 24 21:36:03.477058 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.476993 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f9d047-270a-45eb-91b0-aa724c135e67" containerName="util" Apr 24 21:36:03.477058 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477005 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22efcd6c-cc94-448c-9b22-7617f762b5bf" containerName="pull" Apr 24 21:36:03.477058 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477010 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="22efcd6c-cc94-448c-9b22-7617f762b5bf" containerName="pull" Apr 24 21:36:03.477058 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477019 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22efcd6c-cc94-448c-9b22-7617f762b5bf" containerName="extract" Apr 24 21:36:03.477058 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477024 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="22efcd6c-cc94-448c-9b22-7617f762b5bf" containerName="extract" Apr 24 21:36:03.477058 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477031 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4f9d047-270a-45eb-91b0-aa724c135e67" containerName="extract" Apr 24 21:36:03.477058 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477036 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f9d047-270a-45eb-91b0-aa724c135e67" containerName="extract" Apr 24 21:36:03.477058 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477047 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3db84943-4ae7-431d-8fce-14be8a082e9e" 
containerName="util" Apr 24 21:36:03.477058 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477052 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db84943-4ae7-431d-8fce-14be8a082e9e" containerName="util" Apr 24 21:36:03.477058 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477057 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3db84943-4ae7-431d-8fce-14be8a082e9e" containerName="pull" Apr 24 21:36:03.477058 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477061 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db84943-4ae7-431d-8fce-14be8a082e9e" containerName="pull" Apr 24 21:36:03.477058 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477068 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="208b118a-a5a4-47fc-bfae-6f2c53aaf505" containerName="util" Apr 24 21:36:03.477545 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477073 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="208b118a-a5a4-47fc-bfae-6f2c53aaf505" containerName="util" Apr 24 21:36:03.477545 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477078 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="208b118a-a5a4-47fc-bfae-6f2c53aaf505" containerName="pull" Apr 24 21:36:03.477545 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477082 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="208b118a-a5a4-47fc-bfae-6f2c53aaf505" containerName="pull" Apr 24 21:36:03.477545 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477088 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="208b118a-a5a4-47fc-bfae-6f2c53aaf505" containerName="extract" Apr 24 21:36:03.477545 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477093 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="208b118a-a5a4-47fc-bfae-6f2c53aaf505" containerName="extract" Apr 24 21:36:03.477545 ip-10-0-142-242 kubenswrapper[2567]: 
I0424 21:36:03.477097 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3db84943-4ae7-431d-8fce-14be8a082e9e" containerName="extract" Apr 24 21:36:03.477545 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477102 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db84943-4ae7-431d-8fce-14be8a082e9e" containerName="extract" Apr 24 21:36:03.477545 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477146 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="208b118a-a5a4-47fc-bfae-6f2c53aaf505" containerName="extract" Apr 24 21:36:03.477545 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477154 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="22efcd6c-cc94-448c-9b22-7617f762b5bf" containerName="extract" Apr 24 21:36:03.477545 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477160 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="3db84943-4ae7-431d-8fce-14be8a082e9e" containerName="extract" Apr 24 21:36:03.477545 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.477166 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4f9d047-270a-45eb-91b0-aa724c135e67" containerName="extract" Apr 24 21:36:03.479842 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.479827 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-6kpsg" Apr 24 21:36:03.483749 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.483726 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-djqn7\"" Apr 24 21:36:03.483908 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.483826 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 24 21:36:03.483908 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.483824 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 24 21:36:03.483908 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.483854 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 24 21:36:03.490600 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.490559 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-6kpsg"] Apr 24 21:36:03.568250 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.568223 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr7sc\" (UniqueName: \"kubernetes.io/projected/b3a28f01-9bb8-4227-9be7-74768d03fc42-kube-api-access-dr7sc\") pod \"dns-operator-controller-manager-844548ff4c-6kpsg\" (UID: \"b3a28f01-9bb8-4227-9be7-74768d03fc42\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-6kpsg" Apr 24 21:36:03.668676 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.668634 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dr7sc\" (UniqueName: \"kubernetes.io/projected/b3a28f01-9bb8-4227-9be7-74768d03fc42-kube-api-access-dr7sc\") pod 
\"dns-operator-controller-manager-844548ff4c-6kpsg\" (UID: \"b3a28f01-9bb8-4227-9be7-74768d03fc42\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-6kpsg" Apr 24 21:36:03.682803 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.682777 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr7sc\" (UniqueName: \"kubernetes.io/projected/b3a28f01-9bb8-4227-9be7-74768d03fc42-kube-api-access-dr7sc\") pod \"dns-operator-controller-manager-844548ff4c-6kpsg\" (UID: \"b3a28f01-9bb8-4227-9be7-74768d03fc42\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-6kpsg" Apr 24 21:36:03.790726 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.790647 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-6kpsg" Apr 24 21:36:03.922121 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:03.922090 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-6kpsg"] Apr 24 21:36:03.923948 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:36:03.923919 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3a28f01_9bb8_4227_9be7_74768d03fc42.slice/crio-65d56712559da78481c17e0dc7c3c75d599233778bf71d8c1416044042ec7a61 WatchSource:0}: Error finding container 65d56712559da78481c17e0dc7c3c75d599233778bf71d8c1416044042ec7a61: Status 404 returned error can't find the container with id 65d56712559da78481c17e0dc7c3c75d599233778bf71d8c1416044042ec7a61 Apr 24 21:36:04.130907 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:04.130824 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-6kpsg" event={"ID":"b3a28f01-9bb8-4227-9be7-74768d03fc42","Type":"ContainerStarted","Data":"65d56712559da78481c17e0dc7c3c75d599233778bf71d8c1416044042ec7a61"} Apr 24 
21:36:05.575731 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:05.575701 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-grrpw"] Apr 24 21:36:05.580952 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:05.580929 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-grrpw" Apr 24 21:36:05.583197 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:05.583181 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-4twft\"" Apr 24 21:36:05.588987 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:05.588962 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-grrpw"] Apr 24 21:36:05.684206 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:05.684164 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxd9v\" (UniqueName: \"kubernetes.io/projected/194e2ea0-37e3-4ca8-81ad-841ff703c06f-kube-api-access-zxd9v\") pod \"limitador-operator-controller-manager-c7fb4c8d5-grrpw\" (UID: \"194e2ea0-37e3-4ca8-81ad-841ff703c06f\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-grrpw" Apr 24 21:36:05.785719 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:05.785684 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxd9v\" (UniqueName: \"kubernetes.io/projected/194e2ea0-37e3-4ca8-81ad-841ff703c06f-kube-api-access-zxd9v\") pod \"limitador-operator-controller-manager-c7fb4c8d5-grrpw\" (UID: \"194e2ea0-37e3-4ca8-81ad-841ff703c06f\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-grrpw" Apr 24 21:36:05.794990 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:05.794969 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zxd9v\" (UniqueName: \"kubernetes.io/projected/194e2ea0-37e3-4ca8-81ad-841ff703c06f-kube-api-access-zxd9v\") pod \"limitador-operator-controller-manager-c7fb4c8d5-grrpw\" (UID: \"194e2ea0-37e3-4ca8-81ad-841ff703c06f\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-grrpw" Apr 24 21:36:05.894857 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:05.894762 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-grrpw" Apr 24 21:36:06.627728 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:06.627705 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-grrpw"] Apr 24 21:36:06.628724 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:36:06.628689 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194e2ea0_37e3_4ca8_81ad_841ff703c06f.slice/crio-01c1ca111900c141ef10e2a708457fec0dec5848b98323201dbbeccde9f9f075 WatchSource:0}: Error finding container 01c1ca111900c141ef10e2a708457fec0dec5848b98323201dbbeccde9f9f075: Status 404 returned error can't find the container with id 01c1ca111900c141ef10e2a708457fec0dec5848b98323201dbbeccde9f9f075 Apr 24 21:36:07.144970 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:07.144931 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-6kpsg" event={"ID":"b3a28f01-9bb8-4227-9be7-74768d03fc42","Type":"ContainerStarted","Data":"c858485e2c91a744d44c5422adc62a1fd202fc6953b5df610b5b187879cf3621"} Apr 24 21:36:07.145166 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:07.145069 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-6kpsg" Apr 24 21:36:07.146084 ip-10-0-142-242 kubenswrapper[2567]: I0424 
21:36:07.146061 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-grrpw" event={"ID":"194e2ea0-37e3-4ca8-81ad-841ff703c06f","Type":"ContainerStarted","Data":"01c1ca111900c141ef10e2a708457fec0dec5848b98323201dbbeccde9f9f075"} Apr 24 21:36:07.218132 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:07.218086 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-6kpsg" podStartSLOduration=1.595217356 podStartE2EDuration="4.218069999s" podCreationTimestamp="2026-04-24 21:36:03 +0000 UTC" firstStartedPulling="2026-04-24 21:36:03.926114514 +0000 UTC m=+502.024527005" lastFinishedPulling="2026-04-24 21:36:06.548967141 +0000 UTC m=+504.647379648" observedRunningTime="2026-04-24 21:36:07.216339191 +0000 UTC m=+505.314751706" watchObservedRunningTime="2026-04-24 21:36:07.218069999 +0000 UTC m=+505.316482513" Apr 24 21:36:09.155350 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:09.155304 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-grrpw" event={"ID":"194e2ea0-37e3-4ca8-81ad-841ff703c06f","Type":"ContainerStarted","Data":"a66dad1a507f74771a7a3f903a0c56dc68e3f58c73b6ebf6fa569036b75f0703"} Apr 24 21:36:09.155350 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:09.155364 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-grrpw" Apr 24 21:36:09.271705 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:09.271656 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-grrpw" podStartSLOduration=2.392884876 podStartE2EDuration="4.271640275s" podCreationTimestamp="2026-04-24 21:36:05 +0000 UTC" firstStartedPulling="2026-04-24 21:36:06.631329658 +0000 UTC m=+504.729742164" 
lastFinishedPulling="2026-04-24 21:36:08.510085072 +0000 UTC m=+506.608497563" observedRunningTime="2026-04-24 21:36:09.268223016 +0000 UTC m=+507.366635529" watchObservedRunningTime="2026-04-24 21:36:09.271640275 +0000 UTC m=+507.370052788" Apr 24 21:36:11.956977 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:11.956946 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-z9p66"] Apr 24 21:36:11.960283 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:11.960264 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-z9p66" Apr 24 21:36:11.962438 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:11.962418 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-8rmwx\"" Apr 24 21:36:11.971628 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:11.971604 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-z9p66"] Apr 24 21:36:12.043721 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:12.043694 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmpsr\" (UniqueName: \"kubernetes.io/projected/8cd01911-c5b5-43d0-884c-5a9a8f2b0c94-kube-api-access-qmpsr\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-z9p66\" (UID: \"8cd01911-c5b5-43d0-884c-5a9a8f2b0c94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-z9p66" Apr 24 21:36:12.043867 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:12.043728 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8cd01911-c5b5-43d0-884c-5a9a8f2b0c94-extensions-socket-volume\") pod 
\"kuadrant-operator-controller-manager-6ddf9554fc-z9p66\" (UID: \"8cd01911-c5b5-43d0-884c-5a9a8f2b0c94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-z9p66" Apr 24 21:36:12.144858 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:12.144830 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmpsr\" (UniqueName: \"kubernetes.io/projected/8cd01911-c5b5-43d0-884c-5a9a8f2b0c94-kube-api-access-qmpsr\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-z9p66\" (UID: \"8cd01911-c5b5-43d0-884c-5a9a8f2b0c94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-z9p66" Apr 24 21:36:12.144987 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:12.144866 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8cd01911-c5b5-43d0-884c-5a9a8f2b0c94-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-z9p66\" (UID: \"8cd01911-c5b5-43d0-884c-5a9a8f2b0c94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-z9p66" Apr 24 21:36:12.145287 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:12.145269 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8cd01911-c5b5-43d0-884c-5a9a8f2b0c94-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-z9p66\" (UID: \"8cd01911-c5b5-43d0-884c-5a9a8f2b0c94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-z9p66" Apr 24 21:36:12.162535 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:12.162514 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmpsr\" (UniqueName: \"kubernetes.io/projected/8cd01911-c5b5-43d0-884c-5a9a8f2b0c94-kube-api-access-qmpsr\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-z9p66\" (UID: 
\"8cd01911-c5b5-43d0-884c-5a9a8f2b0c94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-z9p66" Apr 24 21:36:12.270749 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:12.270674 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-z9p66" Apr 24 21:36:12.398691 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:12.398665 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-z9p66"] Apr 24 21:36:12.400762 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:36:12.400735 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cd01911_c5b5_43d0_884c_5a9a8f2b0c94.slice/crio-6657ad06f55661b22b539a12f7b9b44b3d8188cd298057e431ce9c43101722ae WatchSource:0}: Error finding container 6657ad06f55661b22b539a12f7b9b44b3d8188cd298057e431ce9c43101722ae: Status 404 returned error can't find the container with id 6657ad06f55661b22b539a12f7b9b44b3d8188cd298057e431ce9c43101722ae Apr 24 21:36:13.177456 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:13.177420 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-z9p66" event={"ID":"8cd01911-c5b5-43d0-884c-5a9a8f2b0c94","Type":"ContainerStarted","Data":"6657ad06f55661b22b539a12f7b9b44b3d8188cd298057e431ce9c43101722ae"} Apr 24 21:36:18.152676 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:18.152640 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-6kpsg" Apr 24 21:36:19.202843 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:19.202808 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-z9p66" 
event={"ID":"8cd01911-c5b5-43d0-884c-5a9a8f2b0c94","Type":"ContainerStarted","Data":"4089941c89ac1af42e8e411e91582d730808d10a409792b1a27ef7e8071d80d6"} Apr 24 21:36:19.203352 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:19.203026 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-z9p66" Apr 24 21:36:19.227242 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:19.227198 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-z9p66" podStartSLOduration=2.445906075 podStartE2EDuration="8.227184s" podCreationTimestamp="2026-04-24 21:36:11 +0000 UTC" firstStartedPulling="2026-04-24 21:36:12.403208124 +0000 UTC m=+510.501620617" lastFinishedPulling="2026-04-24 21:36:18.184486049 +0000 UTC m=+516.282898542" observedRunningTime="2026-04-24 21:36:19.224570901 +0000 UTC m=+517.322983413" watchObservedRunningTime="2026-04-24 21:36:19.227184 +0000 UTC m=+517.325596512" Apr 24 21:36:20.161539 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:20.161507 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-grrpw" Apr 24 21:36:30.209227 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:36:30.209192 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-z9p66" Apr 24 21:37:36.118086 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.118054 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n"] Apr 24 21:37:36.121522 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.121502 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.133503 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.133480 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n"] Apr 24 21:37:36.228703 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.228666 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/5669e63d-6e50-438f-8114-00a0c65b7492-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.228703 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.228704 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/5669e63d-6e50-438f-8114-00a0c65b7492-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.228967 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.228724 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5669e63d-6e50-438f-8114-00a0c65b7492-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.228967 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.228803 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5669e63d-6e50-438f-8114-00a0c65b7492-istio-kubeconfig\") pod 
\"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.228967 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.228832 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2t2c\" (UniqueName: \"kubernetes.io/projected/5669e63d-6e50-438f-8114-00a0c65b7492-kube-api-access-k2t2c\") pod \"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.228967 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.228866 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/5669e63d-6e50-438f-8114-00a0c65b7492-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.228967 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.228882 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/5669e63d-6e50-438f-8114-00a0c65b7492-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.329738 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.329699 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/5669e63d-6e50-438f-8114-00a0c65b7492-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 
24 21:37:36.329738 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.329746 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/5669e63d-6e50-438f-8114-00a0c65b7492-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.330003 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.329875 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5669e63d-6e50-438f-8114-00a0c65b7492-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.330003 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.329977 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5669e63d-6e50-438f-8114-00a0c65b7492-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.330360 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.330009 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2t2c\" (UniqueName: \"kubernetes.io/projected/5669e63d-6e50-438f-8114-00a0c65b7492-kube-api-access-k2t2c\") pod \"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.330508 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.330416 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: 
\"kubernetes.io/configmap/5669e63d-6e50-438f-8114-00a0c65b7492-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.331220 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.331171 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/5669e63d-6e50-438f-8114-00a0c65b7492-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.331367 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.331351 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/5669e63d-6e50-438f-8114-00a0c65b7492-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.333530 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.333495 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/5669e63d-6e50-438f-8114-00a0c65b7492-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.334501 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.334476 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/5669e63d-6e50-438f-8114-00a0c65b7492-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.334693 
ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.334660 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5669e63d-6e50-438f-8114-00a0c65b7492-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.337178 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.337156 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/5669e63d-6e50-438f-8114-00a0c65b7492-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.339941 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.339919 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5669e63d-6e50-438f-8114-00a0c65b7492-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.340561 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.340540 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2t2c\" (UniqueName: \"kubernetes.io/projected/5669e63d-6e50-438f-8114-00a0c65b7492-kube-api-access-k2t2c\") pod \"istiod-openshift-gateway-55ff986f96-nzp8n\" (UID: \"5669e63d-6e50-438f-8114-00a0c65b7492\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.431146 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.431116 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:36.587363 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.587333 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n"] Apr 24 21:37:36.589325 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:37:36.589298 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5669e63d_6e50_438f_8114_00a0c65b7492.slice/crio-f4f288603b15a66c3b743e420eb66d54fc5fdfc39e3762b532ace3a77c461d99 WatchSource:0}: Error finding container f4f288603b15a66c3b743e420eb66d54fc5fdfc39e3762b532ace3a77c461d99: Status 404 returned error can't find the container with id f4f288603b15a66c3b743e420eb66d54fc5fdfc39e3762b532ace3a77c461d99 Apr 24 21:37:36.592569 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.591973 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 24 21:37:36.592569 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:36.592045 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 24 21:37:37.501089 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.501052 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" event={"ID":"5669e63d-6e50-438f-8114-00a0c65b7492","Type":"ContainerStarted","Data":"ac54301f75feba46572b3a7792e928fca8edb928ff4d24b7347e4109cbfc2c17"} Apr 24 21:37:37.501089 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.501092 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" 
event={"ID":"5669e63d-6e50-438f-8114-00a0c65b7492","Type":"ContainerStarted","Data":"f4f288603b15a66c3b743e420eb66d54fc5fdfc39e3762b532ace3a77c461d99"} Apr 24 21:37:37.501626 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.501181 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:37.503022 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.503000 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" Apr 24 21:37:37.544327 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.544288 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nzp8n" podStartSLOduration=1.5442766049999999 podStartE2EDuration="1.544276605s" podCreationTimestamp="2026-04-24 21:37:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:37:37.542541544 +0000 UTC m=+595.640954083" watchObservedRunningTime="2026-04-24 21:37:37.544276605 +0000 UTC m=+595.642689117" Apr 24 21:37:37.611932 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.610628 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg"] Apr 24 21:37:37.611932 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.610908 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" podUID="89c3bdd4-d367-45d6-bbd4-9dd1af4ed827" containerName="discovery" containerID="cri-o://e645665460b10f27a185e9a1f534b613a3057e4e335fe1121e2441b71daaca96" gracePeriod=30 Apr 24 21:37:37.865323 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.865300 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:37:37.943884 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.943852 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-local-certs\") pod \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " Apr 24 21:37:37.944065 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.943899 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-token\") pod \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " Apr 24 21:37:37.944065 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.943932 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-csr-ca-configmap\") pod \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " Apr 24 21:37:37.944065 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.943948 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-csr-dns-cert\") pod \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " Apr 24 21:37:37.944065 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.943980 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-cacerts\") pod \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " Apr 24 21:37:37.944065 ip-10-0-142-242 
kubenswrapper[2567]: I0424 21:37:37.944015 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2497\" (UniqueName: \"kubernetes.io/projected/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-kube-api-access-x2497\") pod \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " Apr 24 21:37:37.944065 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.944051 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-kubeconfig\") pod \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\" (UID: \"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827\") " Apr 24 21:37:37.944428 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.944375 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "89c3bdd4-d367-45d6-bbd4-9dd1af4ed827" (UID: "89c3bdd4-d367-45d6-bbd4-9dd1af4ed827"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:37:37.944886 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.944859 2567 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-csr-ca-configmap\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:37:37.946648 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.946574 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-kube-api-access-x2497" (OuterVolumeSpecName: "kube-api-access-x2497") pod "89c3bdd4-d367-45d6-bbd4-9dd1af4ed827" (UID: "89c3bdd4-d367-45d6-bbd4-9dd1af4ed827"). InnerVolumeSpecName "kube-api-access-x2497". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:37:37.946780 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.946691 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "89c3bdd4-d367-45d6-bbd4-9dd1af4ed827" (UID: "89c3bdd4-d367-45d6-bbd4-9dd1af4ed827"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:37:37.947043 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.946820 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-local-certs" (OuterVolumeSpecName: "local-certs") pod "89c3bdd4-d367-45d6-bbd4-9dd1af4ed827" (UID: "89c3bdd4-d367-45d6-bbd4-9dd1af4ed827"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:37:37.947158 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.947052 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "89c3bdd4-d367-45d6-bbd4-9dd1af4ed827" (UID: "89c3bdd4-d367-45d6-bbd4-9dd1af4ed827"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:37:37.947158 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.947055 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-token" (OuterVolumeSpecName: "istio-token") pod "89c3bdd4-d367-45d6-bbd4-9dd1af4ed827" (UID: "89c3bdd4-d367-45d6-bbd4-9dd1af4ed827"). InnerVolumeSpecName "istio-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:37:37.947922 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:37.947898 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-cacerts" (OuterVolumeSpecName: "cacerts") pod "89c3bdd4-d367-45d6-bbd4-9dd1af4ed827" (UID: "89c3bdd4-d367-45d6-bbd4-9dd1af4ed827"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:37:38.046245 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:38.046157 2567 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-local-certs\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:37:38.046245 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:38.046188 2567 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-token\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:37:38.046245 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:38.046202 2567 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-csr-dns-cert\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:37:38.046245 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:38.046219 2567 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-cacerts\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:37:38.046245 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:38.046230 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x2497\" (UniqueName: \"kubernetes.io/projected/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-kube-api-access-x2497\") on node 
\"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:37:38.046245 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:38.046242 2567 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827-istio-kubeconfig\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:37:38.505574 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:38.505540 2567 generic.go:358] "Generic (PLEG): container finished" podID="89c3bdd4-d367-45d6-bbd4-9dd1af4ed827" containerID="e645665460b10f27a185e9a1f534b613a3057e4e335fe1121e2441b71daaca96" exitCode=0 Apr 24 21:37:38.505962 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:38.505627 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" Apr 24 21:37:38.505962 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:38.505632 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" event={"ID":"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827","Type":"ContainerDied","Data":"e645665460b10f27a185e9a1f534b613a3057e4e335fe1121e2441b71daaca96"} Apr 24 21:37:38.505962 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:38.505669 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg" event={"ID":"89c3bdd4-d367-45d6-bbd4-9dd1af4ed827","Type":"ContainerDied","Data":"f4c54daa7afeb081c5ee6c9df8ae52f1ce87f6f9fa3ab404234a134d452fbe1c"} Apr 24 21:37:38.505962 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:38.505685 2567 scope.go:117] "RemoveContainer" containerID="e645665460b10f27a185e9a1f534b613a3057e4e335fe1121e2441b71daaca96" Apr 24 21:37:38.513803 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:38.513786 2567 scope.go:117] "RemoveContainer" containerID="e645665460b10f27a185e9a1f534b613a3057e4e335fe1121e2441b71daaca96" Apr 24 21:37:38.514068 
ip-10-0-142-242 kubenswrapper[2567]: E0424 21:37:38.514047 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e645665460b10f27a185e9a1f534b613a3057e4e335fe1121e2441b71daaca96\": container with ID starting with e645665460b10f27a185e9a1f534b613a3057e4e335fe1121e2441b71daaca96 not found: ID does not exist" containerID="e645665460b10f27a185e9a1f534b613a3057e4e335fe1121e2441b71daaca96" Apr 24 21:37:38.514130 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:38.514075 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e645665460b10f27a185e9a1f534b613a3057e4e335fe1121e2441b71daaca96"} err="failed to get container status \"e645665460b10f27a185e9a1f534b613a3057e4e335fe1121e2441b71daaca96\": rpc error: code = NotFound desc = could not find container \"e645665460b10f27a185e9a1f534b613a3057e4e335fe1121e2441b71daaca96\": container with ID starting with e645665460b10f27a185e9a1f534b613a3057e4e335fe1121e2441b71daaca96 not found: ID does not exist" Apr 24 21:37:38.535456 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:38.535433 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg"] Apr 24 21:37:38.537759 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:38.537737 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-qkhkg"] Apr 24 21:37:40.420614 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:40.420561 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89c3bdd4-d367-45d6-bbd4-9dd1af4ed827" path="/var/lib/kubelet/pods/89c3bdd4-d367-45d6-bbd4-9dd1af4ed827/volumes" Apr 24 21:37:46.536356 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:46.536325 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-6mhj8"] Apr 24 21:37:46.538905 ip-10-0-142-242 kubenswrapper[2567]: I0424 
21:37:46.536822 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89c3bdd4-d367-45d6-bbd4-9dd1af4ed827" containerName="discovery" Apr 24 21:37:46.538905 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:46.536837 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c3bdd4-d367-45d6-bbd4-9dd1af4ed827" containerName="discovery" Apr 24 21:37:46.538905 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:46.536903 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="89c3bdd4-d367-45d6-bbd4-9dd1af4ed827" containerName="discovery" Apr 24 21:37:46.539962 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:46.539937 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-6mhj8" Apr 24 21:37:46.542362 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:46.542340 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:37:46.542729 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:46.542707 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:37:46.542839 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:46.542725 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 21:37:46.543227 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:46.543210 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-bxpd9\"" Apr 24 21:37:46.549169 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:46.549149 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-6mhj8"] Apr 24 21:37:46.721615 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:46.721566 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff4jz\" (UniqueName: 
\"kubernetes.io/projected/7e10191e-1f10-4866-b32c-e0e834643954-kube-api-access-ff4jz\") pod \"seaweedfs-86cc847c5c-6mhj8\" (UID: \"7e10191e-1f10-4866-b32c-e0e834643954\") " pod="kserve/seaweedfs-86cc847c5c-6mhj8" Apr 24 21:37:46.721790 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:46.721625 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7e10191e-1f10-4866-b32c-e0e834643954-data\") pod \"seaweedfs-86cc847c5c-6mhj8\" (UID: \"7e10191e-1f10-4866-b32c-e0e834643954\") " pod="kserve/seaweedfs-86cc847c5c-6mhj8" Apr 24 21:37:46.822992 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:46.822909 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ff4jz\" (UniqueName: \"kubernetes.io/projected/7e10191e-1f10-4866-b32c-e0e834643954-kube-api-access-ff4jz\") pod \"seaweedfs-86cc847c5c-6mhj8\" (UID: \"7e10191e-1f10-4866-b32c-e0e834643954\") " pod="kserve/seaweedfs-86cc847c5c-6mhj8" Apr 24 21:37:46.822992 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:46.822949 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7e10191e-1f10-4866-b32c-e0e834643954-data\") pod \"seaweedfs-86cc847c5c-6mhj8\" (UID: \"7e10191e-1f10-4866-b32c-e0e834643954\") " pod="kserve/seaweedfs-86cc847c5c-6mhj8" Apr 24 21:37:46.823336 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:46.823315 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7e10191e-1f10-4866-b32c-e0e834643954-data\") pod \"seaweedfs-86cc847c5c-6mhj8\" (UID: \"7e10191e-1f10-4866-b32c-e0e834643954\") " pod="kserve/seaweedfs-86cc847c5c-6mhj8" Apr 24 21:37:46.832970 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:46.832944 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff4jz\" (UniqueName: 
\"kubernetes.io/projected/7e10191e-1f10-4866-b32c-e0e834643954-kube-api-access-ff4jz\") pod \"seaweedfs-86cc847c5c-6mhj8\" (UID: \"7e10191e-1f10-4866-b32c-e0e834643954\") " pod="kserve/seaweedfs-86cc847c5c-6mhj8" Apr 24 21:37:46.852619 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:46.852596 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-6mhj8" Apr 24 21:37:46.978343 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:46.978320 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-6mhj8"] Apr 24 21:37:46.980216 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:37:46.980190 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e10191e_1f10_4866_b32c_e0e834643954.slice/crio-b0644d937c5f856757205b282c859dc4eb8a1f1350c74edfd35f420fa95f5acc WatchSource:0}: Error finding container b0644d937c5f856757205b282c859dc4eb8a1f1350c74edfd35f420fa95f5acc: Status 404 returned error can't find the container with id b0644d937c5f856757205b282c859dc4eb8a1f1350c74edfd35f420fa95f5acc Apr 24 21:37:47.541573 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:47.541531 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-6mhj8" event={"ID":"7e10191e-1f10-4866-b32c-e0e834643954","Type":"ContainerStarted","Data":"b0644d937c5f856757205b282c859dc4eb8a1f1350c74edfd35f420fa95f5acc"} Apr 24 21:37:49.551944 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:49.551905 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-6mhj8" event={"ID":"7e10191e-1f10-4866-b32c-e0e834643954","Type":"ContainerStarted","Data":"650a7deda9b7f89d332794a99bf90753caa5ed9b4876ed9b932dd6ab499e5c1e"} Apr 24 21:37:49.552398 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:49.552062 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve/seaweedfs-86cc847c5c-6mhj8" Apr 24 21:37:55.558059 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:55.558028 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-6mhj8" Apr 24 21:37:55.577425 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:37:55.577371 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-6mhj8" podStartSLOduration=7.108212764 podStartE2EDuration="9.577358939s" podCreationTimestamp="2026-04-24 21:37:46 +0000 UTC" firstStartedPulling="2026-04-24 21:37:46.98160618 +0000 UTC m=+605.080018683" lastFinishedPulling="2026-04-24 21:37:49.45075235 +0000 UTC m=+607.549164858" observedRunningTime="2026-04-24 21:37:49.572959262 +0000 UTC m=+607.671371776" watchObservedRunningTime="2026-04-24 21:37:55.577358939 +0000 UTC m=+613.675771452" Apr 24 21:38:56.891846 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:56.891814 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-cm4l4"] Apr 24 21:38:56.900843 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:56.900814 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-cm4l4" Apr 24 21:38:56.904907 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:56.904372 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-9vv7j\"" Apr 24 21:38:56.905047 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:56.904901 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 24 21:38:56.905446 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:56.905170 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-cm4l4"] Apr 24 21:38:56.908276 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:56.908253 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-72h9j"] Apr 24 21:38:56.911625 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:56.911606 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-72h9j" Apr 24 21:38:56.914024 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:56.913865 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 24 21:38:56.914549 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:56.914264 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-68xkk\"" Apr 24 21:38:56.921353 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:56.921337 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-72h9j"] Apr 24 21:38:56.976750 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:56.976723 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81162468-e2f1-4fcc-bd24-61e23bdfd1c6-cert\") pod \"odh-model-controller-696fc77849-72h9j\" (UID: 
\"81162468-e2f1-4fcc-bd24-61e23bdfd1c6\") " pod="kserve/odh-model-controller-696fc77849-72h9j" Apr 24 21:38:56.976911 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:56.976757 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm8tp\" (UniqueName: \"kubernetes.io/projected/81162468-e2f1-4fcc-bd24-61e23bdfd1c6-kube-api-access-lm8tp\") pod \"odh-model-controller-696fc77849-72h9j\" (UID: \"81162468-e2f1-4fcc-bd24-61e23bdfd1c6\") " pod="kserve/odh-model-controller-696fc77849-72h9j" Apr 24 21:38:56.976911 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:56.976881 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc-tls-certs\") pod \"model-serving-api-86f7b4b499-cm4l4\" (UID: \"d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc\") " pod="kserve/model-serving-api-86f7b4b499-cm4l4" Apr 24 21:38:56.977005 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:56.976922 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6dwq\" (UniqueName: \"kubernetes.io/projected/d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc-kube-api-access-z6dwq\") pod \"model-serving-api-86f7b4b499-cm4l4\" (UID: \"d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc\") " pod="kserve/model-serving-api-86f7b4b499-cm4l4" Apr 24 21:38:57.077575 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:57.077543 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81162468-e2f1-4fcc-bd24-61e23bdfd1c6-cert\") pod \"odh-model-controller-696fc77849-72h9j\" (UID: \"81162468-e2f1-4fcc-bd24-61e23bdfd1c6\") " pod="kserve/odh-model-controller-696fc77849-72h9j" Apr 24 21:38:57.077738 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:57.077575 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lm8tp\" (UniqueName: \"kubernetes.io/projected/81162468-e2f1-4fcc-bd24-61e23bdfd1c6-kube-api-access-lm8tp\") pod \"odh-model-controller-696fc77849-72h9j\" (UID: \"81162468-e2f1-4fcc-bd24-61e23bdfd1c6\") " pod="kserve/odh-model-controller-696fc77849-72h9j" Apr 24 21:38:57.077738 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:57.077666 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc-tls-certs\") pod \"model-serving-api-86f7b4b499-cm4l4\" (UID: \"d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc\") " pod="kserve/model-serving-api-86f7b4b499-cm4l4" Apr 24 21:38:57.077738 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:57.077698 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6dwq\" (UniqueName: \"kubernetes.io/projected/d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc-kube-api-access-z6dwq\") pod \"model-serving-api-86f7b4b499-cm4l4\" (UID: \"d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc\") " pod="kserve/model-serving-api-86f7b4b499-cm4l4" Apr 24 21:38:57.077738 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:38:57.077709 2567 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 24 21:38:57.077958 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:38:57.077785 2567 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 24 21:38:57.077958 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:38:57.077788 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81162468-e2f1-4fcc-bd24-61e23bdfd1c6-cert podName:81162468-e2f1-4fcc-bd24-61e23bdfd1c6 nodeName:}" failed. No retries permitted until 2026-04-24 21:38:57.577767402 +0000 UTC m=+675.676179900 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/81162468-e2f1-4fcc-bd24-61e23bdfd1c6-cert") pod "odh-model-controller-696fc77849-72h9j" (UID: "81162468-e2f1-4fcc-bd24-61e23bdfd1c6") : secret "odh-model-controller-webhook-cert" not found Apr 24 21:38:57.077958 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:38:57.077870 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc-tls-certs podName:d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc nodeName:}" failed. No retries permitted until 2026-04-24 21:38:57.57785172 +0000 UTC m=+675.676264216 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc-tls-certs") pod "model-serving-api-86f7b4b499-cm4l4" (UID: "d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc") : secret "model-serving-api-tls" not found Apr 24 21:38:57.090700 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:57.090675 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm8tp\" (UniqueName: \"kubernetes.io/projected/81162468-e2f1-4fcc-bd24-61e23bdfd1c6-kube-api-access-lm8tp\") pod \"odh-model-controller-696fc77849-72h9j\" (UID: \"81162468-e2f1-4fcc-bd24-61e23bdfd1c6\") " pod="kserve/odh-model-controller-696fc77849-72h9j" Apr 24 21:38:57.092367 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:57.092342 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6dwq\" (UniqueName: \"kubernetes.io/projected/d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc-kube-api-access-z6dwq\") pod \"model-serving-api-86f7b4b499-cm4l4\" (UID: \"d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc\") " pod="kserve/model-serving-api-86f7b4b499-cm4l4" Apr 24 21:38:57.582532 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:57.582502 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc-tls-certs\") pod \"model-serving-api-86f7b4b499-cm4l4\" (UID: \"d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc\") " pod="kserve/model-serving-api-86f7b4b499-cm4l4" Apr 24 21:38:57.582710 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:57.582563 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81162468-e2f1-4fcc-bd24-61e23bdfd1c6-cert\") pod \"odh-model-controller-696fc77849-72h9j\" (UID: \"81162468-e2f1-4fcc-bd24-61e23bdfd1c6\") " pod="kserve/odh-model-controller-696fc77849-72h9j" Apr 24 21:38:57.584846 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:57.584819 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc-tls-certs\") pod \"model-serving-api-86f7b4b499-cm4l4\" (UID: \"d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc\") " pod="kserve/model-serving-api-86f7b4b499-cm4l4" Apr 24 21:38:57.584959 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:57.584856 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81162468-e2f1-4fcc-bd24-61e23bdfd1c6-cert\") pod \"odh-model-controller-696fc77849-72h9j\" (UID: \"81162468-e2f1-4fcc-bd24-61e23bdfd1c6\") " pod="kserve/odh-model-controller-696fc77849-72h9j" Apr 24 21:38:57.815455 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:57.815427 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-cm4l4" Apr 24 21:38:57.827102 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:57.827076 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-72h9j" Apr 24 21:38:57.950609 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:57.950560 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-cm4l4"] Apr 24 21:38:57.954133 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:38:57.954097 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2b3a923_58c7_4ed9_8771_ad1bcc7de4fc.slice/crio-2b56030c7dd41ad85b84836f11948839875207b30b3c79993df73c452c4d1d3c WatchSource:0}: Error finding container 2b56030c7dd41ad85b84836f11948839875207b30b3c79993df73c452c4d1d3c: Status 404 returned error can't find the container with id 2b56030c7dd41ad85b84836f11948839875207b30b3c79993df73c452c4d1d3c Apr 24 21:38:57.956329 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:57.956313 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:38:57.975751 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:57.975730 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-72h9j"] Apr 24 21:38:57.976458 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:38:57.976438 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81162468_e2f1_4fcc_bd24_61e23bdfd1c6.slice/crio-45dc3193a32fe0579fc5af0ac34dcbb81614cb96f9d2999dc4388e3d54511dea WatchSource:0}: Error finding container 45dc3193a32fe0579fc5af0ac34dcbb81614cb96f9d2999dc4388e3d54511dea: Status 404 returned error can't find the container with id 45dc3193a32fe0579fc5af0ac34dcbb81614cb96f9d2999dc4388e3d54511dea Apr 24 21:38:58.813381 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:58.813344 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-cm4l4" 
event={"ID":"d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc","Type":"ContainerStarted","Data":"2b56030c7dd41ad85b84836f11948839875207b30b3c79993df73c452c4d1d3c"} Apr 24 21:38:58.816335 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:38:58.816282 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-72h9j" event={"ID":"81162468-e2f1-4fcc-bd24-61e23bdfd1c6","Type":"ContainerStarted","Data":"45dc3193a32fe0579fc5af0ac34dcbb81614cb96f9d2999dc4388e3d54511dea"} Apr 24 21:39:01.830523 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:01.830416 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-cm4l4" event={"ID":"d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc","Type":"ContainerStarted","Data":"a2e6f4c9dea96f67aeb73dd98fdef76c8710b127db81a1c0718ee71d86b5842d"} Apr 24 21:39:01.830523 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:01.830508 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-cm4l4" Apr 24 21:39:01.831833 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:01.831810 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-72h9j" event={"ID":"81162468-e2f1-4fcc-bd24-61e23bdfd1c6","Type":"ContainerStarted","Data":"5d22797cd9497bb4051bd78ed0bd3faf888588cb5b0be47717d56e04834a1d0a"} Apr 24 21:39:01.831929 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:01.831897 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-72h9j" Apr 24 21:39:01.864719 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:01.864660 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-72h9j" podStartSLOduration=2.271875668 podStartE2EDuration="5.864642212s" podCreationTimestamp="2026-04-24 21:38:56 +0000 UTC" firstStartedPulling="2026-04-24 21:38:57.979492599 +0000 UTC m=+676.077905099" 
lastFinishedPulling="2026-04-24 21:39:01.572259149 +0000 UTC m=+679.670671643" observedRunningTime="2026-04-24 21:39:01.864566087 +0000 UTC m=+679.962978600" watchObservedRunningTime="2026-04-24 21:39:01.864642212 +0000 UTC m=+679.963054730" Apr 24 21:39:01.867660 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:01.867601 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-cm4l4" podStartSLOduration=2.194901088 podStartE2EDuration="5.867565119s" podCreationTimestamp="2026-04-24 21:38:56 +0000 UTC" firstStartedPulling="2026-04-24 21:38:57.956429626 +0000 UTC m=+676.054842117" lastFinishedPulling="2026-04-24 21:39:01.629093643 +0000 UTC m=+679.727506148" observedRunningTime="2026-04-24 21:39:01.848375846 +0000 UTC m=+679.946788359" watchObservedRunningTime="2026-04-24 21:39:01.867565119 +0000 UTC m=+679.965977633" Apr 24 21:39:12.839635 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:12.839576 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-72h9j" Apr 24 21:39:12.841553 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:12.841532 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-cm4l4" Apr 24 21:39:34.282523 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.282428 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6"] Apr 24 21:39:34.291249 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.291227 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.296539 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.296512 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-7ptgn\"" Apr 24 21:39:34.296950 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.296930 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:39:34.297433 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.297414 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 24 21:39:34.297906 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.297889 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:39:34.304454 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.304430 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6"] Apr 24 21:39:34.414427 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.414385 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/7d5accb4-791f-4ad1-abed-f31b4c4de217-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.414627 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.414471 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7d5accb4-791f-4ad1-abed-f31b4c4de217-istio-token\") pod 
\"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.414627 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.414504 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr9tj\" (UniqueName: \"kubernetes.io/projected/7d5accb4-791f-4ad1-abed-f31b4c4de217-kube-api-access-zr9tj\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.414627 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.414575 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/7d5accb4-791f-4ad1-abed-f31b4c4de217-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.414817 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.414642 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/7d5accb4-791f-4ad1-abed-f31b4c4de217-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.414817 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.414734 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/7d5accb4-791f-4ad1-abed-f31b4c4de217-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: 
\"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.414817 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.414774 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/7d5accb4-791f-4ad1-abed-f31b4c4de217-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.414951 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.414826 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/7d5accb4-791f-4ad1-abed-f31b4c4de217-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.414951 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.414854 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/7d5accb4-791f-4ad1-abed-f31b4c4de217-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.516337 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.516299 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/7d5accb4-791f-4ad1-abed-f31b4c4de217-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.516515 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.516349 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/7d5accb4-791f-4ad1-abed-f31b4c4de217-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.516515 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.516391 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/7d5accb4-791f-4ad1-abed-f31b4c4de217-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.516515 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.516439 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7d5accb4-791f-4ad1-abed-f31b4c4de217-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.516515 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.516464 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zr9tj\" (UniqueName: \"kubernetes.io/projected/7d5accb4-791f-4ad1-abed-f31b4c4de217-kube-api-access-zr9tj\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.516515 ip-10-0-142-242 kubenswrapper[2567]: 
I0424 21:39:34.516495 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/7d5accb4-791f-4ad1-abed-f31b4c4de217-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.516816 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.516532 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/7d5accb4-791f-4ad1-abed-f31b4c4de217-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.516816 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.516575 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/7d5accb4-791f-4ad1-abed-f31b4c4de217-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.516816 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.516627 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/7d5accb4-791f-4ad1-abed-f31b4c4de217-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.516816 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.516757 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/7d5accb4-791f-4ad1-abed-f31b4c4de217-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.517092 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.516945 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/7d5accb4-791f-4ad1-abed-f31b4c4de217-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.517179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.517158 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/7d5accb4-791f-4ad1-abed-f31b4c4de217-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.517288 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.517265 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/7d5accb4-791f-4ad1-abed-f31b4c4de217-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.517814 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.517788 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/7d5accb4-791f-4ad1-abed-f31b4c4de217-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: 
\"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.518961 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.518935 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/7d5accb4-791f-4ad1-abed-f31b4c4de217-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.519220 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.519200 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/7d5accb4-791f-4ad1-abed-f31b4c4de217-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.525763 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.525718 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7d5accb4-791f-4ad1-abed-f31b4c4de217-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.526087 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.526069 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr9tj\" (UniqueName: \"kubernetes.io/projected/7d5accb4-791f-4ad1-abed-f31b4c4de217-kube-api-access-zr9tj\") pod \"router-gateway-1-openshift-default-6c59fbf55c-ngwj6\" (UID: \"7d5accb4-791f-4ad1-abed-f31b4c4de217\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.605164 ip-10-0-142-242 
kubenswrapper[2567]: I0424 21:39:34.605078 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:34.761179 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.761143 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6"] Apr 24 21:39:34.761925 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:39:34.761896 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d5accb4_791f_4ad1_abed_f31b4c4de217.slice/crio-e5f3d3dcf77a539c9a0fe40876484faa4a58ae0dbfbd1b5a6709560bc3d454e6 WatchSource:0}: Error finding container e5f3d3dcf77a539c9a0fe40876484faa4a58ae0dbfbd1b5a6709560bc3d454e6: Status 404 returned error can't find the container with id e5f3d3dcf77a539c9a0fe40876484faa4a58ae0dbfbd1b5a6709560bc3d454e6 Apr 24 21:39:34.764387 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.764353 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 24 21:39:34.764476 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.764416 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 24 21:39:34.764476 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.764445 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 24 21:39:34.957909 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.957865 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" 
event={"ID":"7d5accb4-791f-4ad1-abed-f31b4c4de217","Type":"ContainerStarted","Data":"f0a944222edfe33bfe974820ff20e50343cc80250ff30fc9c7b4d6dde350184c"} Apr 24 21:39:34.958054 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.957915 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" event={"ID":"7d5accb4-791f-4ad1-abed-f31b4c4de217","Type":"ContainerStarted","Data":"e5f3d3dcf77a539c9a0fe40876484faa4a58ae0dbfbd1b5a6709560bc3d454e6"} Apr 24 21:39:34.982767 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:34.982677 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" podStartSLOduration=0.982658475 podStartE2EDuration="982.658475ms" podCreationTimestamp="2026-04-24 21:39:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:39:34.980913547 +0000 UTC m=+713.079326059" watchObservedRunningTime="2026-04-24 21:39:34.982658475 +0000 UTC m=+713.081070989" Apr 24 21:39:35.605598 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:35.605558 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:35.610457 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:35.610432 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:35.963937 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:35.963901 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:35.965664 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:35.965640 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-ngwj6" Apr 24 21:39:45.519125 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.519086 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl"] Apr 24 21:39:45.521667 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.521650 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.523972 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.523949 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-bbrxr\"" Apr 24 21:39:45.524898 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.524878 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"gw-sec0c69dceeb48768325d1a53a749e65786-kserve-self-signed-certs\"" Apr 24 21:39:45.536730 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.536710 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl"] Apr 24 21:39:45.613397 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.613356 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-model-cache\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.613554 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.613413 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2e4543c8-70a8-4d33-9648-6b01a5c819ff-tls-certs\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.613554 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.613490 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-tmp-dir\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.613554 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.613531 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk57c\" (UniqueName: \"kubernetes.io/projected/2e4543c8-70a8-4d33-9648-6b01a5c819ff-kube-api-access-sk57c\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.613689 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.613573 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-dshm\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.613689 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.613606 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-kserve-provision-location\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.613689 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.613629 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-home\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.714429 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.714387 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-model-cache\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.714649 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.714452 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e4543c8-70a8-4d33-9648-6b01a5c819ff-tls-certs\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.714649 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.714512 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-tmp-dir\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.714649 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.714556 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sk57c\" (UniqueName: \"kubernetes.io/projected/2e4543c8-70a8-4d33-9648-6b01a5c819ff-kube-api-access-sk57c\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.714649 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.714642 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-dshm\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.714889 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.714668 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-kserve-provision-location\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.714889 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.714701 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-home\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.714889 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.714840 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-model-cache\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.715065 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.714898 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-tmp-dir\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.715065 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.715023 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-home\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.715233 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.715209 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-kserve-provision-location\") pod 
\"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.716904 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.716879 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-dshm\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.717074 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.717054 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e4543c8-70a8-4d33-9648-6b01a5c819ff-tls-certs\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.727139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.727117 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk57c\" (UniqueName: \"kubernetes.io/projected/2e4543c8-70a8-4d33-9648-6b01a5c819ff-kube-api-access-sk57c\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.831806 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.831715 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:39:45.967800 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:45.967770 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl"] Apr 24 21:39:45.969007 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:39:45.968977 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e4543c8_70a8_4d33_9648_6b01a5c819ff.slice/crio-da5a49d4690d6c03369b7320e5752b370ef016a67cf32cce3b64af74019c2d18 WatchSource:0}: Error finding container da5a49d4690d6c03369b7320e5752b370ef016a67cf32cce3b64af74019c2d18: Status 404 returned error can't find the container with id da5a49d4690d6c03369b7320e5752b370ef016a67cf32cce3b64af74019c2d18 Apr 24 21:39:46.000927 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:46.000887 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" event={"ID":"2e4543c8-70a8-4d33-9648-6b01a5c819ff","Type":"ContainerStarted","Data":"da5a49d4690d6c03369b7320e5752b370ef016a67cf32cce3b64af74019c2d18"} Apr 24 21:39:50.026550 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:50.026502 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" event={"ID":"2e4543c8-70a8-4d33-9648-6b01a5c819ff","Type":"ContainerStarted","Data":"704e810e15f786d7b521820d025d17c4e3e9f18c4542a3d729333353f158600d"} Apr 24 21:39:57.053869 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:57.053833 2567 generic.go:358] "Generic (PLEG): container finished" podID="2e4543c8-70a8-4d33-9648-6b01a5c819ff" containerID="704e810e15f786d7b521820d025d17c4e3e9f18c4542a3d729333353f158600d" exitCode=0 Apr 24 21:39:57.054277 ip-10-0-142-242 kubenswrapper[2567]: I0424 
21:39:57.053908 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" event={"ID":"2e4543c8-70a8-4d33-9648-6b01a5c819ff","Type":"ContainerDied","Data":"704e810e15f786d7b521820d025d17c4e3e9f18c4542a3d729333353f158600d"} Apr 24 21:39:59.064232 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:59.064197 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" event={"ID":"2e4543c8-70a8-4d33-9648-6b01a5c819ff","Type":"ContainerStarted","Data":"14f04a58f21eebfbbaab632c24128303df4972a4bcc93e05b4730991a85da858"} Apr 24 21:39:59.088225 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:39:59.088177 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" podStartSLOduration=1.8875123679999999 podStartE2EDuration="14.0881624s" podCreationTimestamp="2026-04-24 21:39:45 +0000 UTC" firstStartedPulling="2026-04-24 21:39:45.971017597 +0000 UTC m=+724.069430088" lastFinishedPulling="2026-04-24 21:39:58.171667629 +0000 UTC m=+736.270080120" observedRunningTime="2026-04-24 21:39:59.086013144 +0000 UTC m=+737.184425656" watchObservedRunningTime="2026-04-24 21:39:59.0881624 +0000 UTC m=+737.186574917" Apr 24 21:40:02.187008 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.186976 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl"] Apr 24 21:40:02.187443 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.187256 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" podUID="2e4543c8-70a8-4d33-9648-6b01a5c819ff" containerName="main" 
containerID="cri-o://14f04a58f21eebfbbaab632c24128303df4972a4bcc93e05b4730991a85da858" gracePeriod=30 Apr 24 21:40:02.508511 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.508487 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:40:02.565721 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.565690 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-kserve-provision-location\") pod \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " Apr 24 21:40:02.565870 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.565748 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e4543c8-70a8-4d33-9648-6b01a5c819ff-tls-certs\") pod \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " Apr 24 21:40:02.565870 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.565784 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-tmp-dir\") pod \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " Apr 24 21:40:02.565870 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.565852 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-home\") pod \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " Apr 24 21:40:02.566024 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.565904 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-model-cache\") pod \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " Apr 24 21:40:02.566024 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.565931 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk57c\" (UniqueName: \"kubernetes.io/projected/2e4543c8-70a8-4d33-9648-6b01a5c819ff-kube-api-access-sk57c\") pod \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " Apr 24 21:40:02.566024 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.565966 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-dshm\") pod \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\" (UID: \"2e4543c8-70a8-4d33-9648-6b01a5c819ff\") " Apr 24 21:40:02.566178 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.566037 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-home" (OuterVolumeSpecName: "home") pod "2e4543c8-70a8-4d33-9648-6b01a5c819ff" (UID: "2e4543c8-70a8-4d33-9648-6b01a5c819ff"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:02.566178 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.566069 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "2e4543c8-70a8-4d33-9648-6b01a5c819ff" (UID: "2e4543c8-70a8-4d33-9648-6b01a5c819ff"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:02.566274 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.566174 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-model-cache" (OuterVolumeSpecName: "model-cache") pod "2e4543c8-70a8-4d33-9648-6b01a5c819ff" (UID: "2e4543c8-70a8-4d33-9648-6b01a5c819ff"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:02.566573 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.566335 2567 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-tmp-dir\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:40:02.566573 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.566358 2567 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-home\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:40:02.566573 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.566371 2567 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-model-cache\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:40:02.568428 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.568407 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-dshm" (OuterVolumeSpecName: "dshm") pod "2e4543c8-70a8-4d33-9648-6b01a5c819ff" (UID: "2e4543c8-70a8-4d33-9648-6b01a5c819ff"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:02.568509 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.568447 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e4543c8-70a8-4d33-9648-6b01a5c819ff-kube-api-access-sk57c" (OuterVolumeSpecName: "kube-api-access-sk57c") pod "2e4543c8-70a8-4d33-9648-6b01a5c819ff" (UID: "2e4543c8-70a8-4d33-9648-6b01a5c819ff"). InnerVolumeSpecName "kube-api-access-sk57c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:40:02.569009 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.568990 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e4543c8-70a8-4d33-9648-6b01a5c819ff-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2e4543c8-70a8-4d33-9648-6b01a5c819ff" (UID: "2e4543c8-70a8-4d33-9648-6b01a5c819ff"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:40:02.664388 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.664352 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2e4543c8-70a8-4d33-9648-6b01a5c819ff" (UID: "2e4543c8-70a8-4d33-9648-6b01a5c819ff"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:02.666920 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.666899 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sk57c\" (UniqueName: \"kubernetes.io/projected/2e4543c8-70a8-4d33-9648-6b01a5c819ff-kube-api-access-sk57c\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:40:02.667015 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.666922 2567 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-dshm\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:40:02.667015 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.666935 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e4543c8-70a8-4d33-9648-6b01a5c819ff-kserve-provision-location\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:40:02.667015 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:02.666945 2567 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e4543c8-70a8-4d33-9648-6b01a5c819ff-tls-certs\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:40:03.088535 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:03.088504 2567 generic.go:358] "Generic (PLEG): container finished" podID="2e4543c8-70a8-4d33-9648-6b01a5c819ff" containerID="14f04a58f21eebfbbaab632c24128303df4972a4bcc93e05b4730991a85da858" exitCode=0 Apr 24 21:40:03.088726 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:03.088572 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" Apr 24 21:40:03.088726 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:03.088611 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" event={"ID":"2e4543c8-70a8-4d33-9648-6b01a5c819ff","Type":"ContainerDied","Data":"14f04a58f21eebfbbaab632c24128303df4972a4bcc93e05b4730991a85da858"} Apr 24 21:40:03.088726 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:03.088653 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl" event={"ID":"2e4543c8-70a8-4d33-9648-6b01a5c819ff","Type":"ContainerDied","Data":"da5a49d4690d6c03369b7320e5752b370ef016a67cf32cce3b64af74019c2d18"} Apr 24 21:40:03.088726 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:03.088672 2567 scope.go:117] "RemoveContainer" containerID="14f04a58f21eebfbbaab632c24128303df4972a4bcc93e05b4730991a85da858" Apr 24 21:40:03.102603 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:03.102512 2567 scope.go:117] "RemoveContainer" containerID="704e810e15f786d7b521820d025d17c4e3e9f18c4542a3d729333353f158600d" Apr 24 21:40:03.127616 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:03.127573 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl"] Apr 24 21:40:03.130627 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:03.130606 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-7668df57cfn4lgl"] Apr 24 21:40:03.222384 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:03.222248 2567 scope.go:117] "RemoveContainer" containerID="14f04a58f21eebfbbaab632c24128303df4972a4bcc93e05b4730991a85da858" Apr 24 21:40:03.222894 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:40:03.222868 2567 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14f04a58f21eebfbbaab632c24128303df4972a4bcc93e05b4730991a85da858\": container with ID starting with 14f04a58f21eebfbbaab632c24128303df4972a4bcc93e05b4730991a85da858 not found: ID does not exist" containerID="14f04a58f21eebfbbaab632c24128303df4972a4bcc93e05b4730991a85da858" Apr 24 21:40:03.223046 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:03.222930 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14f04a58f21eebfbbaab632c24128303df4972a4bcc93e05b4730991a85da858"} err="failed to get container status \"14f04a58f21eebfbbaab632c24128303df4972a4bcc93e05b4730991a85da858\": rpc error: code = NotFound desc = could not find container \"14f04a58f21eebfbbaab632c24128303df4972a4bcc93e05b4730991a85da858\": container with ID starting with 14f04a58f21eebfbbaab632c24128303df4972a4bcc93e05b4730991a85da858 not found: ID does not exist" Apr 24 21:40:03.223046 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:03.222992 2567 scope.go:117] "RemoveContainer" containerID="704e810e15f786d7b521820d025d17c4e3e9f18c4542a3d729333353f158600d" Apr 24 21:40:03.223334 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:40:03.223318 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"704e810e15f786d7b521820d025d17c4e3e9f18c4542a3d729333353f158600d\": container with ID starting with 704e810e15f786d7b521820d025d17c4e3e9f18c4542a3d729333353f158600d not found: ID does not exist" containerID="704e810e15f786d7b521820d025d17c4e3e9f18c4542a3d729333353f158600d" Apr 24 21:40:03.223411 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:03.223378 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"704e810e15f786d7b521820d025d17c4e3e9f18c4542a3d729333353f158600d"} err="failed to get container status 
\"704e810e15f786d7b521820d025d17c4e3e9f18c4542a3d729333353f158600d\": rpc error: code = NotFound desc = could not find container \"704e810e15f786d7b521820d025d17c4e3e9f18c4542a3d729333353f158600d\": container with ID starting with 704e810e15f786d7b521820d025d17c4e3e9f18c4542a3d729333353f158600d not found: ID does not exist" Apr 24 21:40:04.418158 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:04.418121 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e4543c8-70a8-4d33-9648-6b01a5c819ff" path="/var/lib/kubelet/pods/2e4543c8-70a8-4d33-9648-6b01a5c819ff/volumes" Apr 24 21:40:15.214284 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.214253 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p"] Apr 24 21:40:15.214665 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.214651 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e4543c8-70a8-4d33-9648-6b01a5c819ff" containerName="storage-initializer" Apr 24 21:40:15.214713 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.214666 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4543c8-70a8-4d33-9648-6b01a5c819ff" containerName="storage-initializer" Apr 24 21:40:15.214713 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.214691 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e4543c8-70a8-4d33-9648-6b01a5c819ff" containerName="main" Apr 24 21:40:15.214713 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.214697 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4543c8-70a8-4d33-9648-6b01a5c819ff" containerName="main" Apr 24 21:40:15.214802 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.214766 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e4543c8-70a8-4d33-9648-6b01a5c819ff" containerName="main" Apr 24 21:40:15.218774 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.218758 
2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.221967 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.221946 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-bbrxr\"" Apr 24 21:40:15.222085 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.221973 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"gw-sec2774c263d49959f50d9eebc552e13bf9-kserve-self-signed-certs\"" Apr 24 21:40:15.227487 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.227462 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p"] Apr 24 21:40:15.378800 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.378768 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-model-cache\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.378979 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.378811 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-home\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.378979 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.378848 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-kserve-provision-location\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.378979 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.378876 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzrfn\" (UniqueName: \"kubernetes.io/projected/21e9df7f-2383-40cc-8982-14ab1eba448c-kube-api-access-lzrfn\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.378979 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.378896 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/21e9df7f-2383-40cc-8982-14ab1eba448c-tls-certs\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.379155 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.379009 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-tmp-dir\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.379155 ip-10-0-142-242 kubenswrapper[2567]: I0424 
21:40:15.379084 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-dshm\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.480111 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.480028 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-dshm\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.480111 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.480095 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-model-cache\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.480342 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.480125 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-home\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.480342 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.480147 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-kserve-provision-location\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.480342 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.480177 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzrfn\" (UniqueName: \"kubernetes.io/projected/21e9df7f-2383-40cc-8982-14ab1eba448c-kube-api-access-lzrfn\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.480342 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.480199 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/21e9df7f-2383-40cc-8982-14ab1eba448c-tls-certs\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.480342 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.480235 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-tmp-dir\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.480664 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.480631 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-model-cache\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.480785 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.480713 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-home\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.480842 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.480784 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-tmp-dir\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.480882 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.480848 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-kserve-provision-location\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.482475 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.482456 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-dshm\") pod 
\"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.482869 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.482849 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/21e9df7f-2383-40cc-8982-14ab1eba448c-tls-certs\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.488529 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.488505 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzrfn\" (UniqueName: \"kubernetes.io/projected/21e9df7f-2383-40cc-8982-14ab1eba448c-kube-api-access-lzrfn\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.529572 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.529548 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:15.664534 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:15.664503 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p"] Apr 24 21:40:15.666464 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:40:15.666439 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e9df7f_2383_40cc_8982_14ab1eba448c.slice/crio-0713375b037a8af90a46ab4f10ef1c70a301e5793a465b18b6520b0e7fc74843 WatchSource:0}: Error finding container 0713375b037a8af90a46ab4f10ef1c70a301e5793a465b18b6520b0e7fc74843: Status 404 returned error can't find the container with id 0713375b037a8af90a46ab4f10ef1c70a301e5793a465b18b6520b0e7fc74843 Apr 24 21:40:16.146246 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:16.146158 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" event={"ID":"21e9df7f-2383-40cc-8982-14ab1eba448c","Type":"ContainerStarted","Data":"0fae51a5a1f06fa4c6e3ea28404af2b05b91251ec5e7c06f59fc7dc698db71b3"} Apr 24 21:40:16.146246 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:16.146195 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" event={"ID":"21e9df7f-2383-40cc-8982-14ab1eba448c","Type":"ContainerStarted","Data":"0713375b037a8af90a46ab4f10ef1c70a301e5793a465b18b6520b0e7fc74843"} Apr 24 21:40:24.828001 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:24.827965 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p"] Apr 24 21:40:24.828420 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:24.828223 2567 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" podUID="21e9df7f-2383-40cc-8982-14ab1eba448c" containerName="storage-initializer" containerID="cri-o://0fae51a5a1f06fa4c6e3ea28404af2b05b91251ec5e7c06f59fc7dc698db71b3" gracePeriod=30 Apr 24 21:40:54.999010 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:54.998985 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p_21e9df7f-2383-40cc-8982-14ab1eba448c/storage-initializer/0.log" Apr 24 21:40:54.999353 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:54.999057 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:55.118508 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.118429 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-home\") pod \"21e9df7f-2383-40cc-8982-14ab1eba448c\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " Apr 24 21:40:55.118508 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.118461 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-kserve-provision-location\") pod \"21e9df7f-2383-40cc-8982-14ab1eba448c\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " Apr 24 21:40:55.118508 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.118490 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-tmp-dir\") pod \"21e9df7f-2383-40cc-8982-14ab1eba448c\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " Apr 24 21:40:55.118861 
ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.118526 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzrfn\" (UniqueName: \"kubernetes.io/projected/21e9df7f-2383-40cc-8982-14ab1eba448c-kube-api-access-lzrfn\") pod \"21e9df7f-2383-40cc-8982-14ab1eba448c\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " Apr 24 21:40:55.118861 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.118546 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-dshm\") pod \"21e9df7f-2383-40cc-8982-14ab1eba448c\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " Apr 24 21:40:55.118861 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.118612 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/21e9df7f-2383-40cc-8982-14ab1eba448c-tls-certs\") pod \"21e9df7f-2383-40cc-8982-14ab1eba448c\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " Apr 24 21:40:55.118861 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.118644 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-model-cache\") pod \"21e9df7f-2383-40cc-8982-14ab1eba448c\" (UID: \"21e9df7f-2383-40cc-8982-14ab1eba448c\") " Apr 24 21:40:55.118861 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.118755 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-home" (OuterVolumeSpecName: "home") pod "21e9df7f-2383-40cc-8982-14ab1eba448c" (UID: "21e9df7f-2383-40cc-8982-14ab1eba448c"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:55.119091 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.118883 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "21e9df7f-2383-40cc-8982-14ab1eba448c" (UID: "21e9df7f-2383-40cc-8982-14ab1eba448c"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:55.119091 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.118913 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-model-cache" (OuterVolumeSpecName: "model-cache") pod "21e9df7f-2383-40cc-8982-14ab1eba448c" (UID: "21e9df7f-2383-40cc-8982-14ab1eba448c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:55.119091 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.119009 2567 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-home\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:40:55.119091 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.119032 2567 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-tmp-dir\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:40:55.120856 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.120821 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e9df7f-2383-40cc-8982-14ab1eba448c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "21e9df7f-2383-40cc-8982-14ab1eba448c" (UID: "21e9df7f-2383-40cc-8982-14ab1eba448c"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:40:55.120982 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.120864 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-dshm" (OuterVolumeSpecName: "dshm") pod "21e9df7f-2383-40cc-8982-14ab1eba448c" (UID: "21e9df7f-2383-40cc-8982-14ab1eba448c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:55.120982 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.120964 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21e9df7f-2383-40cc-8982-14ab1eba448c-kube-api-access-lzrfn" (OuterVolumeSpecName: "kube-api-access-lzrfn") pod "21e9df7f-2383-40cc-8982-14ab1eba448c" (UID: "21e9df7f-2383-40cc-8982-14ab1eba448c"). InnerVolumeSpecName "kube-api-access-lzrfn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:40:55.137316 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.137286 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "21e9df7f-2383-40cc-8982-14ab1eba448c" (UID: "21e9df7f-2383-40cc-8982-14ab1eba448c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:55.220053 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.220013 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lzrfn\" (UniqueName: \"kubernetes.io/projected/21e9df7f-2383-40cc-8982-14ab1eba448c-kube-api-access-lzrfn\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:40:55.220053 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.220047 2567 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-dshm\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:40:55.220253 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.220062 2567 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/21e9df7f-2383-40cc-8982-14ab1eba448c-tls-certs\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:40:55.220253 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.220075 2567 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-model-cache\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:40:55.220253 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.220089 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21e9df7f-2383-40cc-8982-14ab1eba448c-kserve-provision-location\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:40:55.299827 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.299796 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p_21e9df7f-2383-40cc-8982-14ab1eba448c/storage-initializer/0.log" Apr 24 21:40:55.299984 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.299843 2567 
generic.go:358] "Generic (PLEG): container finished" podID="21e9df7f-2383-40cc-8982-14ab1eba448c" containerID="0fae51a5a1f06fa4c6e3ea28404af2b05b91251ec5e7c06f59fc7dc698db71b3" exitCode=137 Apr 24 21:40:55.299984 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.299908 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" Apr 24 21:40:55.299984 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.299931 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" event={"ID":"21e9df7f-2383-40cc-8982-14ab1eba448c","Type":"ContainerDied","Data":"0fae51a5a1f06fa4c6e3ea28404af2b05b91251ec5e7c06f59fc7dc698db71b3"} Apr 24 21:40:55.299984 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.299972 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p" event={"ID":"21e9df7f-2383-40cc-8982-14ab1eba448c","Type":"ContainerDied","Data":"0713375b037a8af90a46ab4f10ef1c70a301e5793a465b18b6520b0e7fc74843"} Apr 24 21:40:55.300205 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.299988 2567 scope.go:117] "RemoveContainer" containerID="0fae51a5a1f06fa4c6e3ea28404af2b05b91251ec5e7c06f59fc7dc698db71b3" Apr 24 21:40:55.331020 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.330982 2567 scope.go:117] "RemoveContainer" containerID="0fae51a5a1f06fa4c6e3ea28404af2b05b91251ec5e7c06f59fc7dc698db71b3" Apr 24 21:40:55.331952 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:40:55.331925 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fae51a5a1f06fa4c6e3ea28404af2b05b91251ec5e7c06f59fc7dc698db71b3\": container with ID starting with 0fae51a5a1f06fa4c6e3ea28404af2b05b91251ec5e7c06f59fc7dc698db71b3 not found: ID does not exist" 
containerID="0fae51a5a1f06fa4c6e3ea28404af2b05b91251ec5e7c06f59fc7dc698db71b3" Apr 24 21:40:55.332068 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.331986 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fae51a5a1f06fa4c6e3ea28404af2b05b91251ec5e7c06f59fc7dc698db71b3"} err="failed to get container status \"0fae51a5a1f06fa4c6e3ea28404af2b05b91251ec5e7c06f59fc7dc698db71b3\": rpc error: code = NotFound desc = could not find container \"0fae51a5a1f06fa4c6e3ea28404af2b05b91251ec5e7c06f59fc7dc698db71b3\": container with ID starting with 0fae51a5a1f06fa4c6e3ea28404af2b05b91251ec5e7c06f59fc7dc698db71b3 not found: ID does not exist" Apr 24 21:40:55.342704 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.342681 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p"] Apr 24 21:40:55.350277 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:55.350253 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-6bfb88bbd4zdg9p"] Apr 24 21:40:56.417195 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:40:56.417161 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21e9df7f-2383-40cc-8982-14ab1eba448c" path="/var/lib/kubelet/pods/21e9df7f-2383-40cc-8982-14ab1eba448c/volumes" Apr 24 21:42:37.021880 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.021804 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"] Apr 24 21:42:37.022357 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.022155 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21e9df7f-2383-40cc-8982-14ab1eba448c" containerName="storage-initializer" Apr 24 21:42:37.022357 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.022166 2567 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="21e9df7f-2383-40cc-8982-14ab1eba448c" containerName="storage-initializer" Apr 24 21:42:37.022357 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.022226 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="21e9df7f-2383-40cc-8982-14ab1eba448c" containerName="storage-initializer" Apr 24 21:42:37.025377 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.025355 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9" Apr 24 21:42:37.028788 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.028764 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-bbrxr\"" Apr 24 21:42:37.028900 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.028789 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 24 21:42:37.033822 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.033800 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"] Apr 24 21:42:37.078667 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.078624 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-dshm\") pod \"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9" Apr 24 21:42:37.078826 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.078688 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/402562a4-dd8c-4ce1-9a54-c600c760b4bb-tls-certs\") pod 
\"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.078826 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.078739 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.078965 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.078813 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-home\") pod \"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.078965 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.078865 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-model-cache\") pod \"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.078965 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.078891 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4fr8\" (UniqueName: \"kubernetes.io/projected/402562a4-dd8c-4ce1-9a54-c600c760b4bb-kube-api-access-n4fr8\") pod \"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.078965 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.078916 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.179405 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.179366 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-model-cache\") pod \"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.179619 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.179423 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4fr8\" (UniqueName: \"kubernetes.io/projected/402562a4-dd8c-4ce1-9a54-c600c760b4bb-kube-api-access-n4fr8\") pod \"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.179619 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.179455 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.179619 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.179491 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-dshm\") pod \"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.179619 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.179610 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/402562a4-dd8c-4ce1-9a54-c600c760b4bb-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.179965 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.179935 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.180135 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.179972 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.180135 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.180050 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-home\") pod \"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.180454 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.180433 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-home\") pod \"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.180629 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.180448 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-model-cache\") pod \"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.180832 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.180802 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.182752 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.182734 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-dshm\") pod \"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.183015 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.182999 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/402562a4-dd8c-4ce1-9a54-c600c760b4bb-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.189793 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.189774 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4fr8\" (UniqueName: \"kubernetes.io/projected/402562a4-dd8c-4ce1-9a54-c600c760b4bb-kube-api-access-n4fr8\") pod \"scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.361080 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.360991 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:42:37.698385 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:37.698360 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"]
Apr 24 21:42:37.699924 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:42:37.699893 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod402562a4_dd8c_4ce1_9a54_c600c760b4bb.slice/crio-5d69e97d20a7d135b4f9a48337604f26200a62c75627144265576159fda010c0 WatchSource:0}: Error finding container 5d69e97d20a7d135b4f9a48337604f26200a62c75627144265576159fda010c0: Status 404 returned error can't find the container with id 5d69e97d20a7d135b4f9a48337604f26200a62c75627144265576159fda010c0
Apr 24 21:42:38.684699 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:38.684612 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9" event={"ID":"402562a4-dd8c-4ce1-9a54-c600c760b4bb","Type":"ContainerStarted","Data":"bd68d05da8f02b0b9955fba8e3966dfb9efa751a60fc91991803cafaee3cf9ad"}
Apr 24 21:42:38.684699 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:42:38.684656 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9" event={"ID":"402562a4-dd8c-4ce1-9a54-c600c760b4bb","Type":"ContainerStarted","Data":"5d69e97d20a7d135b4f9a48337604f26200a62c75627144265576159fda010c0"}
Apr 24 21:43:45.949537 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:43:45.949443 2567 generic.go:358] "Generic (PLEG): container finished" podID="402562a4-dd8c-4ce1-9a54-c600c760b4bb" containerID="bd68d05da8f02b0b9955fba8e3966dfb9efa751a60fc91991803cafaee3cf9ad" exitCode=0
Apr 24 21:43:45.949537 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:43:45.949514 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9" event={"ID":"402562a4-dd8c-4ce1-9a54-c600c760b4bb","Type":"ContainerDied","Data":"bd68d05da8f02b0b9955fba8e3966dfb9efa751a60fc91991803cafaee3cf9ad"}
Apr 24 21:43:46.956901 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:43:46.956864 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9" event={"ID":"402562a4-dd8c-4ce1-9a54-c600c760b4bb","Type":"ContainerStarted","Data":"c1d6e30718e72e4275051f677f42a5e0182ba1a06fd537e4b2ee1de24e30219f"}
Apr 24 21:43:46.981214 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:43:46.981164 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9" podStartSLOduration=69.981149354 podStartE2EDuration="1m9.981149354s" podCreationTimestamp="2026-04-24 21:42:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:43:46.979057032 +0000 UTC m=+965.077469546" watchObservedRunningTime="2026-04-24 21:43:46.981149354 +0000 UTC m=+965.079561927"
Apr 24 21:43:47.361611 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:43:47.361498 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:43:47.361611 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:43:47.361546 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:43:47.373984 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:43:47.373957 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:43:47.972152 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:43:47.972122 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:44:11.303227 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.303153 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"]
Apr 24 21:44:11.303663 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.303426 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9" podUID="402562a4-dd8c-4ce1-9a54-c600c760b4bb" containerName="main" containerID="cri-o://c1d6e30718e72e4275051f677f42a5e0182ba1a06fd537e4b2ee1de24e30219f" gracePeriod=30
Apr 24 21:44:11.552331 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.552309 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:44:11.676987 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.676958 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-dshm\") pod \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") "
Apr 24 21:44:11.676987 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.676990 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-kserve-provision-location\") pod \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") "
Apr 24 21:44:11.677235 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.677013 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/402562a4-dd8c-4ce1-9a54-c600c760b4bb-tls-certs\") pod \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") "
Apr 24 21:44:11.677235 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.677033 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-tmp-dir\") pod \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") "
Apr 24 21:44:11.677235 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.677088 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-model-cache\") pod \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") "
Apr 24 21:44:11.677235 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.677115 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-home\") pod \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") "
Apr 24 21:44:11.677235 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.677141 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4fr8\" (UniqueName: \"kubernetes.io/projected/402562a4-dd8c-4ce1-9a54-c600c760b4bb-kube-api-access-n4fr8\") pod \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\" (UID: \"402562a4-dd8c-4ce1-9a54-c600c760b4bb\") "
Apr 24 21:44:11.677497 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.677362 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "402562a4-dd8c-4ce1-9a54-c600c760b4bb" (UID: "402562a4-dd8c-4ce1-9a54-c600c760b4bb"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:44:11.677497 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.677356 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-model-cache" (OuterVolumeSpecName: "model-cache") pod "402562a4-dd8c-4ce1-9a54-c600c760b4bb" (UID: "402562a4-dd8c-4ce1-9a54-c600c760b4bb"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:44:11.677497 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.677383 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-home" (OuterVolumeSpecName: "home") pod "402562a4-dd8c-4ce1-9a54-c600c760b4bb" (UID: "402562a4-dd8c-4ce1-9a54-c600c760b4bb"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:44:11.677661 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.677502 2567 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-tmp-dir\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:44:11.677661 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.677524 2567 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-model-cache\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:44:11.677661 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.677540 2567 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-home\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:44:11.679238 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.679205 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/402562a4-dd8c-4ce1-9a54-c600c760b4bb-kube-api-access-n4fr8" (OuterVolumeSpecName: "kube-api-access-n4fr8") pod "402562a4-dd8c-4ce1-9a54-c600c760b4bb" (UID: "402562a4-dd8c-4ce1-9a54-c600c760b4bb"). InnerVolumeSpecName "kube-api-access-n4fr8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:44:11.679646 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.679630 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402562a4-dd8c-4ce1-9a54-c600c760b4bb-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "402562a4-dd8c-4ce1-9a54-c600c760b4bb" (UID: "402562a4-dd8c-4ce1-9a54-c600c760b4bb"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:44:11.679722 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.679645 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-dshm" (OuterVolumeSpecName: "dshm") pod "402562a4-dd8c-4ce1-9a54-c600c760b4bb" (UID: "402562a4-dd8c-4ce1-9a54-c600c760b4bb"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:44:11.734073 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.734035 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "402562a4-dd8c-4ce1-9a54-c600c760b4bb" (UID: "402562a4-dd8c-4ce1-9a54-c600c760b4bb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:44:11.778914 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.778885 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n4fr8\" (UniqueName: \"kubernetes.io/projected/402562a4-dd8c-4ce1-9a54-c600c760b4bb-kube-api-access-n4fr8\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:44:11.778914 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.778911 2567 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-dshm\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:44:11.778914 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.778922 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/402562a4-dd8c-4ce1-9a54-c600c760b4bb-kserve-provision-location\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:44:11.779247 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:11.778933 2567 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/402562a4-dd8c-4ce1-9a54-c600c760b4bb-tls-certs\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 21:44:12.053611 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:12.053505 2567 generic.go:358] "Generic (PLEG): container finished" podID="402562a4-dd8c-4ce1-9a54-c600c760b4bb" containerID="c1d6e30718e72e4275051f677f42a5e0182ba1a06fd537e4b2ee1de24e30219f" exitCode=0
Apr 24 21:44:12.053611 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:12.053569 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"
Apr 24 21:44:12.053611 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:12.053577 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9" event={"ID":"402562a4-dd8c-4ce1-9a54-c600c760b4bb","Type":"ContainerDied","Data":"c1d6e30718e72e4275051f677f42a5e0182ba1a06fd537e4b2ee1de24e30219f"}
Apr 24 21:44:12.053870 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:12.053632 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9" event={"ID":"402562a4-dd8c-4ce1-9a54-c600c760b4bb","Type":"ContainerDied","Data":"5d69e97d20a7d135b4f9a48337604f26200a62c75627144265576159fda010c0"}
Apr 24 21:44:12.053870 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:12.053648 2567 scope.go:117] "RemoveContainer" containerID="c1d6e30718e72e4275051f677f42a5e0182ba1a06fd537e4b2ee1de24e30219f"
Apr 24 21:44:12.064520 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:12.064499 2567 scope.go:117] "RemoveContainer" containerID="bd68d05da8f02b0b9955fba8e3966dfb9efa751a60fc91991803cafaee3cf9ad"
Apr 24 21:44:12.080350 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:12.080330 2567 scope.go:117] "RemoveContainer" containerID="c1d6e30718e72e4275051f677f42a5e0182ba1a06fd537e4b2ee1de24e30219f"
Apr 24 21:44:12.081049 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:44:12.081021 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1d6e30718e72e4275051f677f42a5e0182ba1a06fd537e4b2ee1de24e30219f\": container with ID starting with c1d6e30718e72e4275051f677f42a5e0182ba1a06fd537e4b2ee1de24e30219f not found: ID does not exist" containerID="c1d6e30718e72e4275051f677f42a5e0182ba1a06fd537e4b2ee1de24e30219f"
Apr 24 21:44:12.081133 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:12.081058 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1d6e30718e72e4275051f677f42a5e0182ba1a06fd537e4b2ee1de24e30219f"} err="failed to get container status \"c1d6e30718e72e4275051f677f42a5e0182ba1a06fd537e4b2ee1de24e30219f\": rpc error: code = NotFound desc = could not find container \"c1d6e30718e72e4275051f677f42a5e0182ba1a06fd537e4b2ee1de24e30219f\": container with ID starting with c1d6e30718e72e4275051f677f42a5e0182ba1a06fd537e4b2ee1de24e30219f not found: ID does not exist"
Apr 24 21:44:12.081133 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:12.081078 2567 scope.go:117] "RemoveContainer" containerID="bd68d05da8f02b0b9955fba8e3966dfb9efa751a60fc91991803cafaee3cf9ad"
Apr 24 21:44:12.081378 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:44:12.081355 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd68d05da8f02b0b9955fba8e3966dfb9efa751a60fc91991803cafaee3cf9ad\": container with ID starting with bd68d05da8f02b0b9955fba8e3966dfb9efa751a60fc91991803cafaee3cf9ad not found: ID does not exist" containerID="bd68d05da8f02b0b9955fba8e3966dfb9efa751a60fc91991803cafaee3cf9ad"
Apr 24 21:44:12.081436 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:12.081387 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd68d05da8f02b0b9955fba8e3966dfb9efa751a60fc91991803cafaee3cf9ad"} err="failed to get container status \"bd68d05da8f02b0b9955fba8e3966dfb9efa751a60fc91991803cafaee3cf9ad\": rpc error: code = NotFound desc = could not find container \"bd68d05da8f02b0b9955fba8e3966dfb9efa751a60fc91991803cafaee3cf9ad\": container with ID starting with bd68d05da8f02b0b9955fba8e3966dfb9efa751a60fc91991803cafaee3cf9ad not found: ID does not exist"
Apr 24 21:44:12.084080 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:12.084060 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"]
Apr 24 21:44:12.089511 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:12.089491 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7ffc767776-r8gp9"]
Apr 24 21:44:12.417276 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:12.417245 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="402562a4-dd8c-4ce1-9a54-c600c760b4bb" path="/var/lib/kubelet/pods/402562a4-dd8c-4ce1-9a54-c600c760b4bb/volumes"
Apr 24 21:44:38.207499 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.207456 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"]
Apr 24 21:44:38.208000 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.207920 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="402562a4-dd8c-4ce1-9a54-c600c760b4bb" containerName="storage-initializer"
Apr 24 21:44:38.208000 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.207943 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="402562a4-dd8c-4ce1-9a54-c600c760b4bb" containerName="storage-initializer"
Apr 24 21:44:38.208000 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.207954 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="402562a4-dd8c-4ce1-9a54-c600c760b4bb" containerName="main"
Apr 24 21:44:38.208000 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.207960 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="402562a4-dd8c-4ce1-9a54-c600c760b4bb" containerName="main"
Apr 24 21:44:38.208193 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.208010 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="402562a4-dd8c-4ce1-9a54-c600c760b4bb" containerName="main"
Apr 24 21:44:38.210146 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.210119 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.212864 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.212845 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-bbrxr\""
Apr 24 21:44:38.213673 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.213652 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 24 21:44:38.221522 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.221499 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"]
Apr 24 21:44:38.287846 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.287816 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-home\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.287846 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.287849 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6bc8893f-774e-4c75-95e8-fbf39884d0a7-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.288039 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.287924 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-model-cache\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.288039 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.287957 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-dshm\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.288039 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.287998 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.288039 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.288023 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddztp\" (UniqueName: \"kubernetes.io/projected/6bc8893f-774e-4c75-95e8-fbf39884d0a7-kube-api-access-ddztp\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.288205 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.288058 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.388804 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.388765 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-home\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.388804 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.388807 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6bc8893f-774e-4c75-95e8-fbf39884d0a7-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.389056 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.388862 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-model-cache\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.389056 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.388887 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-dshm\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.389056 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.388935 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.389056 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.388969 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddztp\" (UniqueName: \"kubernetes.io/projected/6bc8893f-774e-4c75-95e8-fbf39884d0a7-kube-api-access-ddztp\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.389056 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.389004 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.389314 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.389181 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-home\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.389314 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.389243 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-model-cache\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.389421 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.389319 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.389421 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.389366 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.391157 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.391131 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-dshm\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.391436 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.391415 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6bc8893f-774e-4c75-95e8-fbf39884d0a7-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.398628 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.398573 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddztp\" (UniqueName: \"kubernetes.io/projected/6bc8893f-774e-4c75-95e8-fbf39884d0a7-kube-api-access-ddztp\") pod \"scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.521562 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.521481 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
Apr 24 21:44:38.649434 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.649407 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"]
Apr 24 21:44:38.652872 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:44:38.652825 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bc8893f_774e_4c75_95e8_fbf39884d0a7.slice/crio-e03c74a7515a108fc5983d796acd3e5a984c534d5f9bb629919c1d534188d71c WatchSource:0}: Error finding container e03c74a7515a108fc5983d796acd3e5a984c534d5f9bb629919c1d534188d71c: Status 404 returned error can't find the container with id e03c74a7515a108fc5983d796acd3e5a984c534d5f9bb629919c1d534188d71c
Apr 24 21:44:38.654801 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:38.654771 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:44:39.164027 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:39.163993 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"
event={"ID":"6bc8893f-774e-4c75-95e8-fbf39884d0a7","Type":"ContainerStarted","Data":"d18d600299bd1a7b56f727a3308206025ba3aecae337216ebbc7b2d3f05bc97a"} Apr 24 21:44:39.164027 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:39.164034 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc" event={"ID":"6bc8893f-774e-4c75-95e8-fbf39884d0a7","Type":"ContainerStarted","Data":"e03c74a7515a108fc5983d796acd3e5a984c534d5f9bb629919c1d534188d71c"} Apr 24 21:44:48.200066 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:48.200033 2567 generic.go:358] "Generic (PLEG): container finished" podID="6bc8893f-774e-4c75-95e8-fbf39884d0a7" containerID="d18d600299bd1a7b56f727a3308206025ba3aecae337216ebbc7b2d3f05bc97a" exitCode=0 Apr 24 21:44:48.200473 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:48.200107 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc" event={"ID":"6bc8893f-774e-4c75-95e8-fbf39884d0a7","Type":"ContainerDied","Data":"d18d600299bd1a7b56f727a3308206025ba3aecae337216ebbc7b2d3f05bc97a"} Apr 24 21:44:49.210847 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:49.210812 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc" event={"ID":"6bc8893f-774e-4c75-95e8-fbf39884d0a7","Type":"ContainerStarted","Data":"92e97ecc1e01cb9c9438eb53511c6ea43d2b9fa891a7356bf01d4c25d88b2aa0"} Apr 24 21:44:49.237754 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:49.237714 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc" podStartSLOduration=11.237703014 podStartE2EDuration="11.237703014s" podCreationTimestamp="2026-04-24 21:44:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-24 21:44:49.232862214 +0000 UTC m=+1027.331274739" watchObservedRunningTime="2026-04-24 21:44:49.237703014 +0000 UTC m=+1027.336115527" Apr 24 21:44:58.521782 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:58.521743 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc" Apr 24 21:44:58.521782 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:58.521789 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc" Apr 24 21:44:58.534284 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:58.534260 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc" Apr 24 21:44:59.257641 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:44:59.257613 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc" Apr 24 21:45:11.933005 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:11.932966 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"] Apr 24 21:45:11.933528 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:11.933330 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc" podUID="6bc8893f-774e-4c75-95e8-fbf39884d0a7" containerName="main" containerID="cri-o://92e97ecc1e01cb9c9438eb53511c6ea43d2b9fa891a7356bf01d4c25d88b2aa0" gracePeriod=30 Apr 24 21:45:12.188054 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.187986 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc" Apr 24 21:45:12.307710 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.307670 2567 generic.go:358] "Generic (PLEG): container finished" podID="6bc8893f-774e-4c75-95e8-fbf39884d0a7" containerID="92e97ecc1e01cb9c9438eb53511c6ea43d2b9fa891a7356bf01d4c25d88b2aa0" exitCode=0 Apr 24 21:45:12.307882 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.307742 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc" Apr 24 21:45:12.307882 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.307755 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc" event={"ID":"6bc8893f-774e-4c75-95e8-fbf39884d0a7","Type":"ContainerDied","Data":"92e97ecc1e01cb9c9438eb53511c6ea43d2b9fa891a7356bf01d4c25d88b2aa0"} Apr 24 21:45:12.307882 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.307799 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc" event={"ID":"6bc8893f-774e-4c75-95e8-fbf39884d0a7","Type":"ContainerDied","Data":"e03c74a7515a108fc5983d796acd3e5a984c534d5f9bb629919c1d534188d71c"} Apr 24 21:45:12.307882 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.307815 2567 scope.go:117] "RemoveContainer" containerID="92e97ecc1e01cb9c9438eb53511c6ea43d2b9fa891a7356bf01d4c25d88b2aa0" Apr 24 21:45:12.316361 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.316333 2567 scope.go:117] "RemoveContainer" containerID="d18d600299bd1a7b56f727a3308206025ba3aecae337216ebbc7b2d3f05bc97a" Apr 24 21:45:12.380695 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.380669 2567 scope.go:117] "RemoveContainer" containerID="92e97ecc1e01cb9c9438eb53511c6ea43d2b9fa891a7356bf01d4c25d88b2aa0" Apr 24 21:45:12.380983 ip-10-0-142-242 kubenswrapper[2567]: E0424 
21:45:12.380963 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92e97ecc1e01cb9c9438eb53511c6ea43d2b9fa891a7356bf01d4c25d88b2aa0\": container with ID starting with 92e97ecc1e01cb9c9438eb53511c6ea43d2b9fa891a7356bf01d4c25d88b2aa0 not found: ID does not exist" containerID="92e97ecc1e01cb9c9438eb53511c6ea43d2b9fa891a7356bf01d4c25d88b2aa0" Apr 24 21:45:12.381033 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.380992 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92e97ecc1e01cb9c9438eb53511c6ea43d2b9fa891a7356bf01d4c25d88b2aa0"} err="failed to get container status \"92e97ecc1e01cb9c9438eb53511c6ea43d2b9fa891a7356bf01d4c25d88b2aa0\": rpc error: code = NotFound desc = could not find container \"92e97ecc1e01cb9c9438eb53511c6ea43d2b9fa891a7356bf01d4c25d88b2aa0\": container with ID starting with 92e97ecc1e01cb9c9438eb53511c6ea43d2b9fa891a7356bf01d4c25d88b2aa0 not found: ID does not exist" Apr 24 21:45:12.381033 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.381013 2567 scope.go:117] "RemoveContainer" containerID="d18d600299bd1a7b56f727a3308206025ba3aecae337216ebbc7b2d3f05bc97a" Apr 24 21:45:12.381273 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:45:12.381251 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d18d600299bd1a7b56f727a3308206025ba3aecae337216ebbc7b2d3f05bc97a\": container with ID starting with d18d600299bd1a7b56f727a3308206025ba3aecae337216ebbc7b2d3f05bc97a not found: ID does not exist" containerID="d18d600299bd1a7b56f727a3308206025ba3aecae337216ebbc7b2d3f05bc97a" Apr 24 21:45:12.381310 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.381281 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d18d600299bd1a7b56f727a3308206025ba3aecae337216ebbc7b2d3f05bc97a"} err="failed to get container status 
\"d18d600299bd1a7b56f727a3308206025ba3aecae337216ebbc7b2d3f05bc97a\": rpc error: code = NotFound desc = could not find container \"d18d600299bd1a7b56f727a3308206025ba3aecae337216ebbc7b2d3f05bc97a\": container with ID starting with d18d600299bd1a7b56f727a3308206025ba3aecae337216ebbc7b2d3f05bc97a not found: ID does not exist" Apr 24 21:45:12.384571 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.384555 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-home\") pod \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " Apr 24 21:45:12.384688 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.384608 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6bc8893f-774e-4c75-95e8-fbf39884d0a7-tls-certs\") pod \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " Apr 24 21:45:12.384688 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.384633 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-dshm\") pod \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " Apr 24 21:45:12.384688 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.384648 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-kserve-provision-location\") pod \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " Apr 24 21:45:12.384854 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.384697 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-tmp-dir\") pod \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " Apr 24 21:45:12.384854 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.384734 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-model-cache\") pod \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " Apr 24 21:45:12.384854 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.384762 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddztp\" (UniqueName: \"kubernetes.io/projected/6bc8893f-774e-4c75-95e8-fbf39884d0a7-kube-api-access-ddztp\") pod \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\" (UID: \"6bc8893f-774e-4c75-95e8-fbf39884d0a7\") " Apr 24 21:45:12.384854 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.384802 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-home" (OuterVolumeSpecName: "home") pod "6bc8893f-774e-4c75-95e8-fbf39884d0a7" (UID: "6bc8893f-774e-4c75-95e8-fbf39884d0a7"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:12.385073 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.384980 2567 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-home\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:45:12.385073 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.385021 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "6bc8893f-774e-4c75-95e8-fbf39884d0a7" (UID: "6bc8893f-774e-4c75-95e8-fbf39884d0a7"). 
InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:12.385073 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.385039 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-model-cache" (OuterVolumeSpecName: "model-cache") pod "6bc8893f-774e-4c75-95e8-fbf39884d0a7" (UID: "6bc8893f-774e-4c75-95e8-fbf39884d0a7"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:12.386768 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.386717 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bc8893f-774e-4c75-95e8-fbf39884d0a7-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6bc8893f-774e-4c75-95e8-fbf39884d0a7" (UID: "6bc8893f-774e-4c75-95e8-fbf39884d0a7"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:45:12.386874 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.386788 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-dshm" (OuterVolumeSpecName: "dshm") pod "6bc8893f-774e-4c75-95e8-fbf39884d0a7" (UID: "6bc8893f-774e-4c75-95e8-fbf39884d0a7"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:12.386874 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.386826 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bc8893f-774e-4c75-95e8-fbf39884d0a7-kube-api-access-ddztp" (OuterVolumeSpecName: "kube-api-access-ddztp") pod "6bc8893f-774e-4c75-95e8-fbf39884d0a7" (UID: "6bc8893f-774e-4c75-95e8-fbf39884d0a7"). InnerVolumeSpecName "kube-api-access-ddztp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:45:12.444947 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.444851 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6bc8893f-774e-4c75-95e8-fbf39884d0a7" (UID: "6bc8893f-774e-4c75-95e8-fbf39884d0a7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:12.486156 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.486131 2567 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6bc8893f-774e-4c75-95e8-fbf39884d0a7-tls-certs\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:45:12.486156 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.486158 2567 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-dshm\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:45:12.486334 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.486173 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-kserve-provision-location\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:45:12.486334 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.486183 2567 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-tmp-dir\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:45:12.486334 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.486192 2567 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/6bc8893f-774e-4c75-95e8-fbf39884d0a7-model-cache\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:45:12.486334 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.486200 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ddztp\" (UniqueName: \"kubernetes.io/projected/6bc8893f-774e-4c75-95e8-fbf39884d0a7-kube-api-access-ddztp\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:45:12.631262 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.631145 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"] Apr 24 21:45:12.633553 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:12.633529 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-69bd5f69cb-gf8gc"] Apr 24 21:45:14.417926 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:14.417891 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bc8893f-774e-4c75-95e8-fbf39884d0a7" path="/var/lib/kubelet/pods/6bc8893f-774e-4c75-95e8-fbf39884d0a7/volumes" Apr 24 21:45:40.988932 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:40.988853 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl"] Apr 24 21:45:40.989353 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:40.989198 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bc8893f-774e-4c75-95e8-fbf39884d0a7" containerName="main" Apr 24 21:45:40.989353 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:40.989209 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc8893f-774e-4c75-95e8-fbf39884d0a7" containerName="main" Apr 24 21:45:40.989353 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:40.989221 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="6bc8893f-774e-4c75-95e8-fbf39884d0a7" containerName="storage-initializer" Apr 24 21:45:40.989353 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:40.989226 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc8893f-774e-4c75-95e8-fbf39884d0a7" containerName="storage-initializer" Apr 24 21:45:40.989353 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:40.989296 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6bc8893f-774e-4c75-95e8-fbf39884d0a7" containerName="main" Apr 24 21:45:40.992434 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:40.992412 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:40.995737 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:40.995712 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 24 21:45:40.995870 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:40.995763 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-bbrxr\"" Apr 24 21:45:41.002433 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.002413 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl"] Apr 24 21:45:41.126275 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.126239 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-tls-certs\") pod \"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.126469 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.126295 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-home\") pod \"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.126469 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.126353 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-model-cache\") pod \"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.126469 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.126374 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-tmp-dir\") pod \"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.126469 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.126397 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.126469 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.126425 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5ssg\" 
(UniqueName: \"kubernetes.io/projected/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-kube-api-access-r5ssg\") pod \"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.126693 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.126500 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-dshm\") pod \"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.227714 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.227673 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-tls-certs\") pod \"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.227714 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.227721 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-home\") pod \"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.227970 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.227742 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-model-cache\") pod \"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: 
\"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.227970 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.227758 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-tmp-dir\") pod \"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.227970 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.227774 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.227970 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.227796 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5ssg\" (UniqueName: \"kubernetes.io/projected/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-kube-api-access-r5ssg\") pod \"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.227970 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.227835 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-dshm\") pod \"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.228263 ip-10-0-142-242 
kubenswrapper[2567]: I0424 21:45:41.228149 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-home\") pod \"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.228263 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.228197 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.228263 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.228205 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-model-cache\") pod \"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.228263 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.228256 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-tmp-dir\") pod \"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.230105 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.230077 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-dshm\") pod 
\"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.230266 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.230248 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-tls-certs\") pod \"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.236704 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.236684 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5ssg\" (UniqueName: \"kubernetes.io/projected/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-kube-api-access-r5ssg\") pod \"precise-prefix-cache-test-kserve-659d8476f4-2gsxl\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.304942 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.304862 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:45:41.460290 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:41.460267 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl"] Apr 24 21:45:41.462026 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:45:41.461991 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d7850bd_1e25_4e1c_aae2_fddb2fec4962.slice/crio-665b4476445acbda7f58364aab1de062158a7573c1d09c2cfdf0b096c35f39d4 WatchSource:0}: Error finding container 665b4476445acbda7f58364aab1de062158a7573c1d09c2cfdf0b096c35f39d4: Status 404 returned error can't find the container with id 665b4476445acbda7f58364aab1de062158a7573c1d09c2cfdf0b096c35f39d4 Apr 24 21:45:42.426808 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:42.426766 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" event={"ID":"3d7850bd-1e25-4e1c-aae2-fddb2fec4962","Type":"ContainerStarted","Data":"0db1187170a1270082aabbeba6ec0f8fd52cb9ed73f6f4a472d9f0cd409739e7"} Apr 24 21:45:42.427124 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:45:42.426815 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" event={"ID":"3d7850bd-1e25-4e1c-aae2-fddb2fec4962","Type":"ContainerStarted","Data":"665b4476445acbda7f58364aab1de062158a7573c1d09c2cfdf0b096c35f39d4"} Apr 24 21:47:50.928684 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:47:50.928646 2567 generic.go:358] "Generic (PLEG): container finished" podID="3d7850bd-1e25-4e1c-aae2-fddb2fec4962" containerID="0db1187170a1270082aabbeba6ec0f8fd52cb9ed73f6f4a472d9f0cd409739e7" exitCode=0 Apr 24 21:47:50.929167 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:47:50.928726 2567 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" event={"ID":"3d7850bd-1e25-4e1c-aae2-fddb2fec4962","Type":"ContainerDied","Data":"0db1187170a1270082aabbeba6ec0f8fd52cb9ed73f6f4a472d9f0cd409739e7"} Apr 24 21:47:51.934045 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:47:51.934007 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" event={"ID":"3d7850bd-1e25-4e1c-aae2-fddb2fec4962","Type":"ContainerStarted","Data":"fdb60ca44637c68e95b0cc0be37255a66a9ef200b32eff67c1c3100277d97c09"} Apr 24 21:47:51.957028 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:47:51.956972 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" podStartSLOduration=131.956957388 podStartE2EDuration="2m11.956957388s" podCreationTimestamp="2026-04-24 21:45:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:47:51.953180648 +0000 UTC m=+1210.051593166" watchObservedRunningTime="2026-04-24 21:47:51.956957388 +0000 UTC m=+1210.055369929" Apr 24 21:48:01.305699 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:01.305656 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:48:01.305699 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:01.305701 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:48:01.318410 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:01.318381 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:48:01.984660 ip-10-0-142-242 kubenswrapper[2567]: I0424 
21:48:01.984627 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:48:03.445087 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:03.445048 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl"] Apr 24 21:48:03.980476 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:03.980425 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" podUID="3d7850bd-1e25-4e1c-aae2-fddb2fec4962" containerName="main" containerID="cri-o://fdb60ca44637c68e95b0cc0be37255a66a9ef200b32eff67c1c3100277d97c09" gracePeriod=30 Apr 24 21:48:04.230461 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.230436 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:48:04.263369 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.263280 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-home\") pod \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " Apr 24 21:48:04.263369 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.263314 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-dshm\") pod \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " Apr 24 21:48:04.263369 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.263333 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-tls-certs\") pod 
\"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " Apr 24 21:48:04.263369 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.263358 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-model-cache\") pod \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " Apr 24 21:48:04.263752 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.263389 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5ssg\" (UniqueName: \"kubernetes.io/projected/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-kube-api-access-r5ssg\") pod \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " Apr 24 21:48:04.263752 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.263418 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-kserve-provision-location\") pod \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " Apr 24 21:48:04.263752 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.263449 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-tmp-dir\") pod \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\" (UID: \"3d7850bd-1e25-4e1c-aae2-fddb2fec4962\") " Apr 24 21:48:04.263752 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.263609 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-home" (OuterVolumeSpecName: "home") pod "3d7850bd-1e25-4e1c-aae2-fddb2fec4962" (UID: "3d7850bd-1e25-4e1c-aae2-fddb2fec4962"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:04.263752 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.263660 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-model-cache" (OuterVolumeSpecName: "model-cache") pod "3d7850bd-1e25-4e1c-aae2-fddb2fec4962" (UID: "3d7850bd-1e25-4e1c-aae2-fddb2fec4962"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:04.264028 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.263822 2567 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-home\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:48:04.264028 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.263842 2567 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-model-cache\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:48:04.264028 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.263837 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "3d7850bd-1e25-4e1c-aae2-fddb2fec4962" (UID: "3d7850bd-1e25-4e1c-aae2-fddb2fec4962"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:04.265508 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.265476 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-dshm" (OuterVolumeSpecName: "dshm") pod "3d7850bd-1e25-4e1c-aae2-fddb2fec4962" (UID: "3d7850bd-1e25-4e1c-aae2-fddb2fec4962"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:04.265690 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.265666 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3d7850bd-1e25-4e1c-aae2-fddb2fec4962" (UID: "3d7850bd-1e25-4e1c-aae2-fddb2fec4962"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:48:04.265793 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.265773 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-kube-api-access-r5ssg" (OuterVolumeSpecName: "kube-api-access-r5ssg") pod "3d7850bd-1e25-4e1c-aae2-fddb2fec4962" (UID: "3d7850bd-1e25-4e1c-aae2-fddb2fec4962"). InnerVolumeSpecName "kube-api-access-r5ssg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:48:04.322819 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.322778 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3d7850bd-1e25-4e1c-aae2-fddb2fec4962" (UID: "3d7850bd-1e25-4e1c-aae2-fddb2fec4962"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:04.364228 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.364198 2567 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-dshm\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:48:04.364228 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.364224 2567 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-tls-certs\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:48:04.364228 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.364235 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r5ssg\" (UniqueName: \"kubernetes.io/projected/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-kube-api-access-r5ssg\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:48:04.364462 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.364244 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-kserve-provision-location\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:48:04.364462 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.364275 2567 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3d7850bd-1e25-4e1c-aae2-fddb2fec4962-tmp-dir\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 21:48:04.985007 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.984970 2567 generic.go:358] "Generic (PLEG): container finished" podID="3d7850bd-1e25-4e1c-aae2-fddb2fec4962" containerID="fdb60ca44637c68e95b0cc0be37255a66a9ef200b32eff67c1c3100277d97c09" exitCode=0 Apr 24 21:48:04.985431 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.985046 2567 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" Apr 24 21:48:04.985431 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.985049 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" event={"ID":"3d7850bd-1e25-4e1c-aae2-fddb2fec4962","Type":"ContainerDied","Data":"fdb60ca44637c68e95b0cc0be37255a66a9ef200b32eff67c1c3100277d97c09"} Apr 24 21:48:04.985431 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.985091 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl" event={"ID":"3d7850bd-1e25-4e1c-aae2-fddb2fec4962","Type":"ContainerDied","Data":"665b4476445acbda7f58364aab1de062158a7573c1d09c2cfdf0b096c35f39d4"} Apr 24 21:48:04.985431 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.985108 2567 scope.go:117] "RemoveContainer" containerID="fdb60ca44637c68e95b0cc0be37255a66a9ef200b32eff67c1c3100277d97c09" Apr 24 21:48:04.993276 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:04.993222 2567 scope.go:117] "RemoveContainer" containerID="0db1187170a1270082aabbeba6ec0f8fd52cb9ed73f6f4a472d9f0cd409739e7" Apr 24 21:48:05.006097 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:05.006073 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl"] Apr 24 21:48:05.010679 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:05.010655 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-2gsxl"] Apr 24 21:48:05.060701 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:05.060674 2567 scope.go:117] "RemoveContainer" containerID="fdb60ca44637c68e95b0cc0be37255a66a9ef200b32eff67c1c3100277d97c09" Apr 24 21:48:05.061058 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:48:05.061032 2567 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"fdb60ca44637c68e95b0cc0be37255a66a9ef200b32eff67c1c3100277d97c09\": container with ID starting with fdb60ca44637c68e95b0cc0be37255a66a9ef200b32eff67c1c3100277d97c09 not found: ID does not exist" containerID="fdb60ca44637c68e95b0cc0be37255a66a9ef200b32eff67c1c3100277d97c09" Apr 24 21:48:05.061172 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:05.061063 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdb60ca44637c68e95b0cc0be37255a66a9ef200b32eff67c1c3100277d97c09"} err="failed to get container status \"fdb60ca44637c68e95b0cc0be37255a66a9ef200b32eff67c1c3100277d97c09\": rpc error: code = NotFound desc = could not find container \"fdb60ca44637c68e95b0cc0be37255a66a9ef200b32eff67c1c3100277d97c09\": container with ID starting with fdb60ca44637c68e95b0cc0be37255a66a9ef200b32eff67c1c3100277d97c09 not found: ID does not exist" Apr 24 21:48:05.061172 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:05.061084 2567 scope.go:117] "RemoveContainer" containerID="0db1187170a1270082aabbeba6ec0f8fd52cb9ed73f6f4a472d9f0cd409739e7" Apr 24 21:48:05.061385 ip-10-0-142-242 kubenswrapper[2567]: E0424 21:48:05.061367 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0db1187170a1270082aabbeba6ec0f8fd52cb9ed73f6f4a472d9f0cd409739e7\": container with ID starting with 0db1187170a1270082aabbeba6ec0f8fd52cb9ed73f6f4a472d9f0cd409739e7 not found: ID does not exist" containerID="0db1187170a1270082aabbeba6ec0f8fd52cb9ed73f6f4a472d9f0cd409739e7" Apr 24 21:48:05.061433 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:05.061391 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0db1187170a1270082aabbeba6ec0f8fd52cb9ed73f6f4a472d9f0cd409739e7"} err="failed to get container status \"0db1187170a1270082aabbeba6ec0f8fd52cb9ed73f6f4a472d9f0cd409739e7\": 
rpc error: code = NotFound desc = could not find container \"0db1187170a1270082aabbeba6ec0f8fd52cb9ed73f6f4a472d9f0cd409739e7\": container with ID starting with 0db1187170a1270082aabbeba6ec0f8fd52cb9ed73f6f4a472d9f0cd409739e7 not found: ID does not exist" Apr 24 21:48:06.418312 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:48:06.418283 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d7850bd-1e25-4e1c-aae2-fddb2fec4962" path="/var/lib/kubelet/pods/3d7850bd-1e25-4e1c-aae2-fddb2fec4962/volumes" Apr 24 21:52:29.162276 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:29.162233 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-6ff58d5594-fk9lc"] Apr 24 21:52:29.162800 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:29.162633 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d7850bd-1e25-4e1c-aae2-fddb2fec4962" containerName="storage-initializer" Apr 24 21:52:29.162800 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:29.162646 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7850bd-1e25-4e1c-aae2-fddb2fec4962" containerName="storage-initializer" Apr 24 21:52:29.162800 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:29.162659 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d7850bd-1e25-4e1c-aae2-fddb2fec4962" containerName="main" Apr 24 21:52:29.162800 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:29.162665 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7850bd-1e25-4e1c-aae2-fddb2fec4962" containerName="main" Apr 24 21:52:29.162800 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:29.162728 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d7850bd-1e25-4e1c-aae2-fddb2fec4962" containerName="main" Apr 24 21:52:29.165713 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:29.165697 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-6ff58d5594-fk9lc" Apr 24 21:52:29.168839 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:29.168818 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-8gbxr\"" Apr 24 21:52:29.168960 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:29.168880 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 24 21:52:29.175611 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:29.175571 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6ff58d5594-fk9lc"] Apr 24 21:52:29.294227 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:29.294184 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt5dn\" (UniqueName: \"kubernetes.io/projected/2b925942-e3d0-42fc-9b55-537e7da940f4-kube-api-access-lt5dn\") pod \"llmisvc-controller-manager-6ff58d5594-fk9lc\" (UID: \"2b925942-e3d0-42fc-9b55-537e7da940f4\") " pod="kserve/llmisvc-controller-manager-6ff58d5594-fk9lc" Apr 24 21:52:29.294412 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:29.294240 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b925942-e3d0-42fc-9b55-537e7da940f4-cert\") pod \"llmisvc-controller-manager-6ff58d5594-fk9lc\" (UID: \"2b925942-e3d0-42fc-9b55-537e7da940f4\") " pod="kserve/llmisvc-controller-manager-6ff58d5594-fk9lc" Apr 24 21:52:29.395159 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:29.395122 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt5dn\" (UniqueName: \"kubernetes.io/projected/2b925942-e3d0-42fc-9b55-537e7da940f4-kube-api-access-lt5dn\") pod \"llmisvc-controller-manager-6ff58d5594-fk9lc\" (UID: \"2b925942-e3d0-42fc-9b55-537e7da940f4\") " 
pod="kserve/llmisvc-controller-manager-6ff58d5594-fk9lc" Apr 24 21:52:29.395367 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:29.395174 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b925942-e3d0-42fc-9b55-537e7da940f4-cert\") pod \"llmisvc-controller-manager-6ff58d5594-fk9lc\" (UID: \"2b925942-e3d0-42fc-9b55-537e7da940f4\") " pod="kserve/llmisvc-controller-manager-6ff58d5594-fk9lc" Apr 24 21:52:29.397417 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:29.397396 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b925942-e3d0-42fc-9b55-537e7da940f4-cert\") pod \"llmisvc-controller-manager-6ff58d5594-fk9lc\" (UID: \"2b925942-e3d0-42fc-9b55-537e7da940f4\") " pod="kserve/llmisvc-controller-manager-6ff58d5594-fk9lc" Apr 24 21:52:29.404114 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:29.404093 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt5dn\" (UniqueName: \"kubernetes.io/projected/2b925942-e3d0-42fc-9b55-537e7da940f4-kube-api-access-lt5dn\") pod \"llmisvc-controller-manager-6ff58d5594-fk9lc\" (UID: \"2b925942-e3d0-42fc-9b55-537e7da940f4\") " pod="kserve/llmisvc-controller-manager-6ff58d5594-fk9lc" Apr 24 21:52:29.476168 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:29.476073 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-6ff58d5594-fk9lc" Apr 24 21:52:29.599395 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:29.599371 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6ff58d5594-fk9lc"] Apr 24 21:52:29.602060 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:52:29.602033 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2b925942_e3d0_42fc_9b55_537e7da940f4.slice/crio-8bb51c5aebc039c7341d5f22095f7d95e82908b793bf2db8056f6e1edc87a95e WatchSource:0}: Error finding container 8bb51c5aebc039c7341d5f22095f7d95e82908b793bf2db8056f6e1edc87a95e: Status 404 returned error can't find the container with id 8bb51c5aebc039c7341d5f22095f7d95e82908b793bf2db8056f6e1edc87a95e Apr 24 21:52:29.603416 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:29.603397 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:52:30.022677 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:30.022638 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6ff58d5594-fk9lc" event={"ID":"2b925942-e3d0-42fc-9b55-537e7da940f4","Type":"ContainerStarted","Data":"8bb51c5aebc039c7341d5f22095f7d95e82908b793bf2db8056f6e1edc87a95e"} Apr 24 21:52:33.036048 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:33.036010 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6ff58d5594-fk9lc" event={"ID":"2b925942-e3d0-42fc-9b55-537e7da940f4","Type":"ContainerStarted","Data":"61ede3c3fe463918ffb3bf8ecea42a8b3a96764cd550c3ca643491a473f8bb8e"} Apr 24 21:52:33.036449 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:33.036107 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-6ff58d5594-fk9lc" Apr 24 21:52:33.055780 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:52:33.055734 2567 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-6ff58d5594-fk9lc" podStartSLOduration=1.023024524 podStartE2EDuration="4.055720936s" podCreationTimestamp="2026-04-24 21:52:29 +0000 UTC" firstStartedPulling="2026-04-24 21:52:29.603609346 +0000 UTC m=+1487.702021836" lastFinishedPulling="2026-04-24 21:52:32.636305743 +0000 UTC m=+1490.734718248" observedRunningTime="2026-04-24 21:52:33.053485198 +0000 UTC m=+1491.151897711" watchObservedRunningTime="2026-04-24 21:52:33.055720936 +0000 UTC m=+1491.154133448" Apr 24 21:53:04.042165 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:53:04.042093 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-6ff58d5594-fk9lc" Apr 24 21:58:00.234205 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.234170 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 24 21:58:00.236646 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.236629 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 21:58:00.239606 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.239568 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-2xflz\"" Apr 24 21:58:00.239976 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.239959 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-bbrxr\"" Apr 24 21:58:00.240830 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.240810 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 24 21:58:00.247068 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.247046 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 24 21:58:00.334165 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.334134 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 21:58:00.334332 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.334182 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 21:58:00.334332 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.334218 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sh89\" (UniqueName: \"kubernetes.io/projected/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-kube-api-access-2sh89\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 21:58:00.334332 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.334250 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 21:58:00.334332 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.334287 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 21:58:00.334332 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.334317 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 21:58:00.334508 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.334338 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 21:58:00.435139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.435105 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 21:58:00.435139 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.435139 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sh89\" (UniqueName: \"kubernetes.io/projected/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-kube-api-access-2sh89\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 21:58:00.435357 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.435172 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 21:58:00.435357 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.435204 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 21:58:00.435357 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.435234 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 21:58:00.435357 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.435254 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 21:58:00.435357 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.435285 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 21:58:00.435622 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.435522 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 21:58:00.435751 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.435681 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 21:58:00.435804 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.435740 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 21:58:00.435861 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.435811 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 21:58:00.437773 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.437752 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 21:58:00.437900 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.437886 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 21:58:00.444623 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.444601 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sh89\" (UniqueName: \"kubernetes.io/projected/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-kube-api-access-2sh89\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 21:58:00.547801 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.547725 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 21:58:00.683441 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.683415 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 24 21:58:00.685053 ip-10-0-142-242 kubenswrapper[2567]: W0424 21:58:00.685018 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cf01be2_27a7_4962_9b5a_d193c0b8d8f9.slice/crio-99868999d9ed8bed42a48b81bf6b954965d8872c3036adcc5e5890bfc5b521f1 WatchSource:0}: Error finding container 99868999d9ed8bed42a48b81bf6b954965d8872c3036adcc5e5890bfc5b521f1: Status 404 returned error can't find the container with id 99868999d9ed8bed42a48b81bf6b954965d8872c3036adcc5e5890bfc5b521f1
Apr 24 21:58:00.687443 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:00.687426 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:58:01.285267 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:01.285233 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9","Type":"ContainerStarted","Data":"0f4bfa6745ba24c10a9ba0abcf7240cb2e78ea693c97d8bb2083608c526dac80"}
Apr 24 21:58:01.285267 ip-10-0-142-242 kubenswrapper[2567]: I0424 21:58:01.285272 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9","Type":"ContainerStarted","Data":"99868999d9ed8bed42a48b81bf6b954965d8872c3036adcc5e5890bfc5b521f1"}
Apr 24 22:01:05.969832 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:01:05.969800 2567 generic.go:358] "Generic (PLEG): container finished" podID="6cf01be2-27a7-4962-9b5a-d193c0b8d8f9" containerID="0f4bfa6745ba24c10a9ba0abcf7240cb2e78ea693c97d8bb2083608c526dac80" exitCode=0
Apr 24 22:01:05.970190 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:01:05.969860 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9","Type":"ContainerDied","Data":"0f4bfa6745ba24c10a9ba0abcf7240cb2e78ea693c97d8bb2083608c526dac80"}
Apr 24 22:01:34.113660 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:01:34.113550 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9","Type":"ContainerStarted","Data":"810eab884dfe66aa54c66491ae402faaff6970b873c33522bd2eb5815e441376"}
Apr 24 22:01:34.133684 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:01:34.133562 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=186.374849285 podStartE2EDuration="3m34.133543445s" podCreationTimestamp="2026-04-24 21:58:00 +0000 UTC" firstStartedPulling="2026-04-24 22:01:05.971068533 +0000 UTC m=+2004.069481024" lastFinishedPulling="2026-04-24 22:01:33.729762693 +0000 UTC m=+2031.828175184" observedRunningTime="2026-04-24 22:01:34.132293248 +0000 UTC m=+2032.230705762" watchObservedRunningTime="2026-04-24 22:01:34.133543445 +0000 UTC m=+2032.231955959"
Apr 24 22:11:24.427078 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.426989 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"]
Apr 24 22:11:24.430879 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.430854 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.433464 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.433441 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-ptmq9\""
Apr 24 22:11:24.461153 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.461125 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"]
Apr 24 22:11:24.553625 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.553564 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.553625 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.553625 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn426\" (UniqueName: \"kubernetes.io/projected/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-kube-api-access-nn426\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.553880 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.553664 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.553880 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.553697 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.553880 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.553740 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.553880 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.553795 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.553880 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.553837 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.553880 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.553868 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.554190 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.553886 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.655065 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.655031 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.655266 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.655081 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.655266 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.655119 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.655266 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.655141 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.655266 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.655170 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.655266 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.655209 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nn426\" (UniqueName: \"kubernetes.io/projected/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-kube-api-access-nn426\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.655266 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.655254 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.655610 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.655283 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.655610 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.655325 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.655783 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.655720 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.655968 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.655940 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.656047 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.655986 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.656047 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.655993 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.656166 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.656143 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.657833 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.657807 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.658211 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.658193 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.663916 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.663891 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.666836 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.666815 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn426\" (UniqueName: \"kubernetes.io/projected/ae72a1e5-1279-405e-90cc-0be35c5c3f2b-kube-api-access-nn426\") pod \"router-gateway-2-openshift-default-6866b85949-z7zjj\" (UID: \"ae72a1e5-1279-405e-90cc-0be35c5c3f2b\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.744639 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.744531 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:24.899785 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.899743 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"]
Apr 24 22:11:24.901952 ip-10-0-142-242 kubenswrapper[2567]: W0424 22:11:24.901927 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae72a1e5_1279_405e_90cc_0be35c5c3f2b.slice/crio-ce3a2e96b01e8c0e2965360f93844d43a3b97d075a16e8df67d459c151a99da2 WatchSource:0}: Error finding container ce3a2e96b01e8c0e2965360f93844d43a3b97d075a16e8df67d459c151a99da2: Status 404 returned error can't find the container with id ce3a2e96b01e8c0e2965360f93844d43a3b97d075a16e8df67d459c151a99da2
Apr 24 22:11:24.903884 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.903866 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:11:24.904218 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.904194 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 24 22:11:24.904288 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.904251 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 24 22:11:24.904288 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:24.904282 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 24 22:11:25.407011 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:25.406978 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj" event={"ID":"ae72a1e5-1279-405e-90cc-0be35c5c3f2b","Type":"ContainerStarted","Data":"79d2bf936bf19684bc0b92c50e82f706029f4d00c423c928e614343eb0b5848d"}
Apr 24 22:11:25.407011 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:25.407014 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj" event={"ID":"ae72a1e5-1279-405e-90cc-0be35c5c3f2b","Type":"ContainerStarted","Data":"ce3a2e96b01e8c0e2965360f93844d43a3b97d075a16e8df67d459c151a99da2"}
Apr 24 22:11:25.460234 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:25.460165 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj" podStartSLOduration=1.460148227 podStartE2EDuration="1.460148227s" podCreationTimestamp="2026-04-24 22:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:11:25.457889331 +0000 UTC m=+2623.556301844" watchObservedRunningTime="2026-04-24 22:11:25.460148227 +0000 UTC m=+2623.558560742"
Apr 24 22:11:25.745544 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:25.745459 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:25.746997 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:25.746965 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj" podUID="ae72a1e5-1279-405e-90cc-0be35c5c3f2b" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.54:15021/healthz/ready\": dial tcp 10.133.0.54:15021: connect: connection refused"
Apr 24 22:11:26.749893 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:26.749864 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:27.419818 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:27.419782 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:27.493703 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:27.493670 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-z7zjj"
Apr 24 22:11:36.101079 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.101042 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"]
Apr 24 22:11:36.106755 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.106731 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"
Apr 24 22:11:36.109275 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.109248 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-6qxss\""
Apr 24 22:11:36.109275 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.109269 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\""
Apr 24 22:11:36.115756 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.115735 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"]
Apr 24 22:11:36.130557 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.130529 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk"]
Apr 24 22:11:36.135087 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.135066 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk"
Apr 24 22:11:36.147132 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.147106 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk"]
Apr 24 22:11:36.190711 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.190685 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6bh2\" (UniqueName: \"kubernetes.io/projected/6c6a183a-a7cb-4e66-9315-90710ff8a582-kube-api-access-j6bh2\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk"
Apr 24 22:11:36.190834 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.190715 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b3f0a542-ad41-48a7-ab9b-45a2926944ca-tls-certs\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"
Apr 24 22:11:36.190834 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.190738 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-home\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"
Apr 24 22:11:36.190834 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.190779 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk"
Apr 24 22:11:36.190940 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.190831 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk"
Apr 24 22:11:36.190940 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.190863 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6a183a-a7cb-4e66-9315-90710ff8a582-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk"
Apr 24 22:11:36.190940 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.190886 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk"
Apr 24 22:11:36.190940 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.190904 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-home\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk"
Apr 24 22:11:36.190940 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.190925 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"
Apr 24 22:11:36.191154 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.190941 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7ldk\" (UniqueName: \"kubernetes.io/projected/b3f0a542-ad41-48a7-ab9b-45a2926944ca-kube-api-access-m7ldk\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"
Apr 24 22:11:36.191154 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.190975 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-tmp-dir\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"
Apr 24 22:11:36.191154 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.191007 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk"
Apr 24 22:11:36.191154 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.191025 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-model-cache\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"
Apr 24 22:11:36.191154 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.191042 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-dshm\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"
Apr 24 22:11:36.292262 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.292223 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-dshm\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"
Apr 24 22:11:36.292419 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.292377 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6bh2\" (UniqueName: \"kubernetes.io/projected/6c6a183a-a7cb-4e66-9315-90710ff8a582-kube-api-access-j6bh2\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk"
Apr 24 22:11:36.292419 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.292405 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b3f0a542-ad41-48a7-ab9b-45a2926944ca-tls-certs\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"
Apr 24 22:11:36.292496 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.292435 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-home\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"
Apr 24 22:11:36.292496 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.292478 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk"
Apr 24 22:11:36.292620 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.292518 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk"
Apr
24 22:11:36.292620 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.292553 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6a183a-a7cb-4e66-9315-90710ff8a582-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" Apr 24 22:11:36.292620 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.292610 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" Apr 24 22:11:36.292818 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.292638 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-home\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" Apr 24 22:11:36.293059 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.292912 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" Apr 24 22:11:36.293059 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.292912 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" 
(UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-home\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" Apr 24 22:11:36.293059 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.292953 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-home\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" Apr 24 22:11:36.293059 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.293017 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" Apr 24 22:11:36.293059 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.293055 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7ldk\" (UniqueName: \"kubernetes.io/projected/b3f0a542-ad41-48a7-ab9b-45a2926944ca-kube-api-access-m7ldk\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" Apr 24 22:11:36.293402 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.293095 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-tmp-dir\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: 
\"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" Apr 24 22:11:36.293402 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.293124 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" Apr 24 22:11:36.293402 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.293152 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-model-cache\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" Apr 24 22:11:36.293402 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.293191 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" Apr 24 22:11:36.293402 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.293357 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" Apr 24 
22:11:36.293710 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.293457 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-tmp-dir\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" Apr 24 22:11:36.293710 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.293483 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-model-cache\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" Apr 24 22:11:36.293710 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.293620 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" Apr 24 22:11:36.295114 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.295084 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-dshm\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" Apr 24 22:11:36.295550 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.295528 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6c6a183a-a7cb-4e66-9315-90710ff8a582-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" Apr 24 22:11:36.295618 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.295572 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b3f0a542-ad41-48a7-ab9b-45a2926944ca-tls-certs\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" Apr 24 22:11:36.295916 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.295900 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" Apr 24 22:11:36.304965 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.304936 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6bh2\" (UniqueName: \"kubernetes.io/projected/6c6a183a-a7cb-4e66-9315-90710ff8a582-kube-api-access-j6bh2\") pod \"router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" Apr 24 22:11:36.305211 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.305191 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7ldk\" (UniqueName: \"kubernetes.io/projected/b3f0a542-ad41-48a7-ab9b-45a2926944ca-kube-api-access-m7ldk\") pod \"router-with-refs-pd-test-kserve-5f6d44c486-h2n7s\" (UID: 
\"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" Apr 24 22:11:36.418021 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.417986 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" Apr 24 22:11:36.448820 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.448791 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" Apr 24 22:11:36.573759 ip-10-0-142-242 kubenswrapper[2567]: W0424 22:11:36.573720 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3f0a542_ad41_48a7_ab9b_45a2926944ca.slice/crio-80d01f093c733f266bb6cfcbb3992007941e6cd6441ff30ca00138443a00894b WatchSource:0}: Error finding container 80d01f093c733f266bb6cfcbb3992007941e6cd6441ff30ca00138443a00894b: Status 404 returned error can't find the container with id 80d01f093c733f266bb6cfcbb3992007941e6cd6441ff30ca00138443a00894b Apr 24 22:11:36.574033 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.574007 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"] Apr 24 22:11:36.600451 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:36.600426 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk"] Apr 24 22:11:36.602246 ip-10-0-142-242 kubenswrapper[2567]: W0424 22:11:36.602207 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c6a183a_a7cb_4e66_9315_90710ff8a582.slice/crio-ba9156d87e0ca590d6becdf8db85df62d3fdd026b728bf83b6c33b59a46184b4 WatchSource:0}: Error finding container 
ba9156d87e0ca590d6becdf8db85df62d3fdd026b728bf83b6c33b59a46184b4: Status 404 returned error can't find the container with id ba9156d87e0ca590d6becdf8db85df62d3fdd026b728bf83b6c33b59a46184b4 Apr 24 22:11:37.456999 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:37.456968 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" event={"ID":"b3f0a542-ad41-48a7-ab9b-45a2926944ca","Type":"ContainerStarted","Data":"80d01f093c733f266bb6cfcbb3992007941e6cd6441ff30ca00138443a00894b"} Apr 24 22:11:37.458243 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:37.458217 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" event={"ID":"6c6a183a-a7cb-4e66-9315-90710ff8a582","Type":"ContainerStarted","Data":"547248d11e3ded9f1c0bb2c2633deb1bc9413214638ca9544aada7713a446283"} Apr 24 22:11:37.458329 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:37.458251 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" event={"ID":"6c6a183a-a7cb-4e66-9315-90710ff8a582","Type":"ContainerStarted","Data":"ba9156d87e0ca590d6becdf8db85df62d3fdd026b728bf83b6c33b59a46184b4"} Apr 24 22:11:38.463650 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:38.463613 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" event={"ID":"b3f0a542-ad41-48a7-ab9b-45a2926944ca","Type":"ContainerStarted","Data":"b3f1e5b5c97265295cdad4a8e49d6d66f5c3db879be6f428d7f77668d1562384"} Apr 24 22:11:38.464066 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:38.463724 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" Apr 24 22:11:39.469999 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:39.469964 2567 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" event={"ID":"b3f0a542-ad41-48a7-ab9b-45a2926944ca","Type":"ContainerStarted","Data":"a2ef9c7676e7493dc6fc42f3786141a81618391513928c956d857d1299b749f8"} Apr 24 22:11:41.478807 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:41.478775 2567 generic.go:358] "Generic (PLEG): container finished" podID="6c6a183a-a7cb-4e66-9315-90710ff8a582" containerID="547248d11e3ded9f1c0bb2c2633deb1bc9413214638ca9544aada7713a446283" exitCode=0 Apr 24 22:11:41.479192 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:41.478845 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" event={"ID":"6c6a183a-a7cb-4e66-9315-90710ff8a582","Type":"ContainerDied","Data":"547248d11e3ded9f1c0bb2c2633deb1bc9413214638ca9544aada7713a446283"} Apr 24 22:11:42.484872 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:42.484838 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" event={"ID":"6c6a183a-a7cb-4e66-9315-90710ff8a582","Type":"ContainerStarted","Data":"8e65d14d22d9fe9c1cdc931d7ad4e8062d31300a24d932f541d7fa5cddc86aaf"} Apr 24 22:11:42.517332 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:42.517256 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" podStartSLOduration=6.517241308 podStartE2EDuration="6.517241308s" podCreationTimestamp="2026-04-24 22:11:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:11:42.514258428 +0000 UTC m=+2640.612670978" watchObservedRunningTime="2026-04-24 22:11:42.517241308 +0000 UTC m=+2640.615653821" Apr 24 22:11:46.449562 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:46.449526 2567 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" Apr 24 22:11:46.450073 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:46.449574 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" Apr 24 22:11:46.451215 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:46.451182 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" podUID="6c6a183a-a7cb-4e66-9315-90710ff8a582" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused" Apr 24 22:11:50.488926 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:50.488897 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" Apr 24 22:11:56.449376 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:11:56.449330 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" podUID="6c6a183a-a7cb-4e66-9315-90710ff8a582" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused" Apr 24 22:12:06.449187 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:12:06.449144 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" podUID="6c6a183a-a7cb-4e66-9315-90710ff8a582" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused" Apr 24 22:12:16.449398 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:12:16.449357 2567 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" podUID="6c6a183a-a7cb-4e66-9315-90710ff8a582" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused" Apr 24 22:12:26.449210 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:12:26.449162 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" podUID="6c6a183a-a7cb-4e66-9315-90710ff8a582" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused" Apr 24 22:12:36.450003 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:12:36.449962 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" podUID="6c6a183a-a7cb-4e66-9315-90710ff8a582" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused" Apr 24 22:12:46.449967 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:12:46.449920 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" podUID="6c6a183a-a7cb-4e66-9315-90710ff8a582" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused" Apr 24 22:12:48.413091 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:12:48.413063 2567 kubelet_pods.go:1019] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" secret="" err="secret \"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-2xflz\" not found" Apr 24 22:12:48.566889 ip-10-0-142-242 kubenswrapper[2567]: E0424 22:12:48.566853 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 22:12:48.567095 ip-10-0-142-242 kubenswrapper[2567]: E0424 22:12:48.566961 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tls-certs podName:6cf01be2-27a7-4962-9b5a-d193c0b8d8f9 nodeName:}" failed. No retries permitted until 2026-04-24 22:12:49.066934396 +0000 UTC m=+2707.165346904 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "6cf01be2-27a7-4962-9b5a-d193c0b8d8f9") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 22:12:49.071921 ip-10-0-142-242 kubenswrapper[2567]: E0424 22:12:49.071881 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 22:12:49.072115 ip-10-0-142-242 kubenswrapper[2567]: E0424 22:12:49.071960 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tls-certs podName:6cf01be2-27a7-4962-9b5a-d193c0b8d8f9 nodeName:}" failed. No retries permitted until 2026-04-24 22:12:50.071944919 +0000 UTC m=+2708.170357410 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "6cf01be2-27a7-4962-9b5a-d193c0b8d8f9") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 22:12:50.082675 ip-10-0-142-242 kubenswrapper[2567]: E0424 22:12:50.082640 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 22:12:50.083052 ip-10-0-142-242 kubenswrapper[2567]: E0424 22:12:50.082719 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tls-certs podName:6cf01be2-27a7-4962-9b5a-d193c0b8d8f9 nodeName:}" failed. No retries permitted until 2026-04-24 22:12:52.082705521 +0000 UTC m=+2710.181118012 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "6cf01be2-27a7-4962-9b5a-d193c0b8d8f9") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 22:12:52.103566 ip-10-0-142-242 kubenswrapper[2567]: E0424 22:12:52.103533 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 22:12:52.103970 ip-10-0-142-242 kubenswrapper[2567]: E0424 22:12:52.103617 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tls-certs podName:6cf01be2-27a7-4962-9b5a-d193c0b8d8f9 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:12:56.1036023 +0000 UTC m=+2714.202014791 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "6cf01be2-27a7-4962-9b5a-d193c0b8d8f9") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 22:12:56.152791 ip-10-0-142-242 kubenswrapper[2567]: E0424 22:12:56.152750 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 24 22:12:56.153249 ip-10-0-142-242 kubenswrapper[2567]: E0424 22:12:56.152831 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tls-certs podName:6cf01be2-27a7-4962-9b5a-d193c0b8d8f9 nodeName:}" failed. No retries permitted until 2026-04-24 22:13:04.152811911 +0000 UTC m=+2722.251224411 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "6cf01be2-27a7-4962-9b5a-d193c0b8d8f9") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found
Apr 24 22:12:56.449456 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:12:56.449415 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" podUID="6c6a183a-a7cb-4e66-9315-90710ff8a582" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused"
Apr 24 22:13:00.012042 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.012005 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 24 22:13:00.012620 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.012279 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="6cf01be2-27a7-4962-9b5a-d193c0b8d8f9" containerName="main" containerID="cri-o://810eab884dfe66aa54c66491ae402faaff6970b873c33522bd2eb5815e441376" gracePeriod=30
Apr 24 22:13:00.809019 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.808971 2567 generic.go:358] "Generic (PLEG): container finished" podID="6cf01be2-27a7-4962-9b5a-d193c0b8d8f9" containerID="810eab884dfe66aa54c66491ae402faaff6970b873c33522bd2eb5815e441376" exitCode=0
Apr 24 22:13:00.809019 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.809006 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9","Type":"ContainerDied","Data":"810eab884dfe66aa54c66491ae402faaff6970b873c33522bd2eb5815e441376"}
Apr 24 22:13:00.864933 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.864903 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:13:00.894006 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.893970 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-kserve-provision-location\") pod \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") "
Apr 24 22:13:00.894211 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.894066 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tmp-dir\") pod \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") "
Apr 24 22:13:00.894211 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.894157 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tls-certs\") pod \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") "
Apr 24 22:13:00.894211 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.894189 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-dshm\") pod \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") "
Apr 24 22:13:00.894383 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.894250 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-home\") pod \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") "
Apr 24 22:13:00.894383 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.894273 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sh89\" (UniqueName: \"kubernetes.io/projected/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-kube-api-access-2sh89\") pod \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") "
Apr 24 22:13:00.894383 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.894307 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-model-cache\") pod \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\" (UID: \"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9\") "
Apr 24 22:13:00.895006 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.894951 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-model-cache" (OuterVolumeSpecName: "model-cache") pod "6cf01be2-27a7-4962-9b5a-d193c0b8d8f9" (UID: "6cf01be2-27a7-4962-9b5a-d193c0b8d8f9"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:13:00.895244 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.895214 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-home" (OuterVolumeSpecName: "home") pod "6cf01be2-27a7-4962-9b5a-d193c0b8d8f9" (UID: "6cf01be2-27a7-4962-9b5a-d193c0b8d8f9"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:13:00.897773 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.897432 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6cf01be2-27a7-4962-9b5a-d193c0b8d8f9" (UID: "6cf01be2-27a7-4962-9b5a-d193c0b8d8f9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:13:00.897773 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.897534 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-dshm" (OuterVolumeSpecName: "dshm") pod "6cf01be2-27a7-4962-9b5a-d193c0b8d8f9" (UID: "6cf01be2-27a7-4962-9b5a-d193c0b8d8f9"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:13:00.897773 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.897570 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-kube-api-access-2sh89" (OuterVolumeSpecName: "kube-api-access-2sh89") pod "6cf01be2-27a7-4962-9b5a-d193c0b8d8f9" (UID: "6cf01be2-27a7-4962-9b5a-d193c0b8d8f9"). InnerVolumeSpecName "kube-api-access-2sh89". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:13:00.906033 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.906001 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "6cf01be2-27a7-4962-9b5a-d193c0b8d8f9" (UID: "6cf01be2-27a7-4962-9b5a-d193c0b8d8f9"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:13:00.958019 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.957913 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6cf01be2-27a7-4962-9b5a-d193c0b8d8f9" (UID: "6cf01be2-27a7-4962-9b5a-d193c0b8d8f9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:13:00.995711 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.995672 2567 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tls-certs\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 22:13:00.995711 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.995702 2567 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-dshm\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 22:13:00.995711 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.995711 2567 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-home\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 22:13:00.995948 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.995726 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2sh89\" (UniqueName: \"kubernetes.io/projected/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-kube-api-access-2sh89\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 22:13:00.995948 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.995736 2567 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-model-cache\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 22:13:00.995948 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.995745 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-kserve-provision-location\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 22:13:00.995948 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:00.995756 2567 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9-tmp-dir\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\""
Apr 24 22:13:01.815104 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:01.815072 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:13:01.815562 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:01.815071 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"6cf01be2-27a7-4962-9b5a-d193c0b8d8f9","Type":"ContainerDied","Data":"99868999d9ed8bed42a48b81bf6b954965d8872c3036adcc5e5890bfc5b521f1"}
Apr 24 22:13:01.815562 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:01.815201 2567 scope.go:117] "RemoveContainer" containerID="810eab884dfe66aa54c66491ae402faaff6970b873c33522bd2eb5815e441376"
Apr 24 22:13:01.825109 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:01.825090 2567 scope.go:117] "RemoveContainer" containerID="0f4bfa6745ba24c10a9ba0abcf7240cb2e78ea693c97d8bb2083608c526dac80"
Apr 24 22:13:01.846371 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:01.846345 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 24 22:13:01.862532 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:01.862492 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 24 22:13:02.418102 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:02.418068 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cf01be2-27a7-4962-9b5a-d193c0b8d8f9" path="/var/lib/kubelet/pods/6cf01be2-27a7-4962-9b5a-d193c0b8d8f9/volumes"
Apr 24 22:13:06.449451 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:06.449414 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" podUID="6c6a183a-a7cb-4e66-9315-90710ff8a582" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused"
Apr 24 22:13:16.458861 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:16.458828 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk"
Apr 24 22:13:16.466465 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:13:16.466446 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk"
Apr 24 22:14:00.044364 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:14:00.044324 2567 generic.go:358] "Generic (PLEG): container finished" podID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerID="a2ef9c7676e7493dc6fc42f3786141a81618391513928c956d857d1299b749f8" exitCode=0
Apr 24 22:14:00.044793 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:14:00.044398 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" event={"ID":"b3f0a542-ad41-48a7-ab9b-45a2926944ca","Type":"ContainerDied","Data":"a2ef9c7676e7493dc6fc42f3786141a81618391513928c956d857d1299b749f8"}
Apr 24 22:14:01.050157 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:14:01.050119 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" event={"ID":"b3f0a542-ad41-48a7-ab9b-45a2926944ca","Type":"ContainerStarted","Data":"e562960a7e01bbda454ad7c09f3704c0fde9686de7cc05d7a22bc6193c3b94cb"}
Apr 24 22:14:01.077425 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:14:01.077374 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" podStartSLOduration=144.230958462 podStartE2EDuration="2m25.0773589s" podCreationTimestamp="2026-04-24 22:11:36 +0000 UTC" firstStartedPulling="2026-04-24 22:11:36.575706369 +0000 UTC m=+2634.674118860" lastFinishedPulling="2026-04-24 22:11:37.422106807 +0000 UTC m=+2635.520519298" observedRunningTime="2026-04-24 22:14:01.074100601 +0000 UTC m=+2779.172513153" watchObservedRunningTime="2026-04-24 22:14:01.0773589 +0000 UTC m=+2779.175771440"
Apr 24 22:14:06.418764 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:14:06.418732 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"
Apr 24 22:14:06.419291 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:14:06.418872 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"
Apr 24 22:14:06.419291 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:14:06.419137 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" podUID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 24 22:14:16.418568 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:14:16.418510 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" podUID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 24 22:14:26.419042 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:14:26.418986 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" podUID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 24 22:14:36.418639 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:14:36.418596 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" podUID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 24 22:14:46.421939 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:14:46.421891 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" podUID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 24 22:14:56.419434 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:14:56.419390 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" podUID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 24 22:15:06.418818 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:15:06.418762 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" podUID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 24 22:15:16.418782 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:15:16.418737 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" podUID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 24 22:15:26.419325 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:15:26.419281 2567 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" podUID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 24 22:15:36.427745 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:15:36.427663 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"
Apr 24 22:15:36.439900 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:15:36.439874 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"
Apr 24 22:15:48.515113 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:15:48.515081 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk"]
Apr 24 22:15:48.515505 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:15:48.515348 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" podUID="6c6a183a-a7cb-4e66-9315-90710ff8a582" containerName="main" containerID="cri-o://8e65d14d22d9fe9c1cdc931d7ad4e8062d31300a24d932f541d7fa5cddc86aaf" gracePeriod=30
Apr 24 22:15:48.524890 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:15:48.524862 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"]
Apr 24 22:15:48.525203 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:15:48.525158 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" podUID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerName="main" containerID="cri-o://e562960a7e01bbda454ad7c09f3704c0fde9686de7cc05d7a22bc6193c3b94cb" gracePeriod=30
Apr 24 22:16:03.317541 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:03.317502 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ngwj6_7d5accb4-791f-4ad1-abed-f31b4c4de217/istio-proxy/0.log"
Apr 24 22:16:03.330060 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:03.330034 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-z7zjj_ae72a1e5-1279-405e-90cc-0be35c5c3f2b/istio-proxy/0.log"
Apr 24 22:16:03.356410 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:03.356384 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/main/0.log"
Apr 24 22:16:03.366278 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:03.366252 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/llm-d-routing-sidecar/0.log"
Apr 24 22:16:03.380011 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:03.379982 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/storage-initializer/0.log"
Apr 24 22:16:03.399106 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:03.399089 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/main/0.log"
Apr 24 22:16:03.412644 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:03.412619 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/storage-initializer/0.log"
Apr 24 22:16:04.470940 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:04.470909 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ngwj6_7d5accb4-791f-4ad1-abed-f31b4c4de217/istio-proxy/0.log"
Apr 24 22:16:04.486454 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:04.486429 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-z7zjj_ae72a1e5-1279-405e-90cc-0be35c5c3f2b/istio-proxy/0.log"
Apr 24 22:16:04.512196 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:04.512171 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/main/0.log"
Apr 24 22:16:04.521185 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:04.521166 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/llm-d-routing-sidecar/0.log"
Apr 24 22:16:04.535008 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:04.534987 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/storage-initializer/0.log"
Apr 24 22:16:04.555324 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:04.555300 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/main/0.log"
Apr 24 22:16:04.566024 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:04.565999 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/storage-initializer/0.log"
Apr 24 22:16:05.648236 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:05.648203 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ngwj6_7d5accb4-791f-4ad1-abed-f31b4c4de217/istio-proxy/0.log"
Apr 24 22:16:05.661672 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:05.661645 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-z7zjj_ae72a1e5-1279-405e-90cc-0be35c5c3f2b/istio-proxy/0.log"
Apr 24 22:16:05.685273 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:05.685253 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/main/0.log"
Apr 24 22:16:05.695263 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:05.695243 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/llm-d-routing-sidecar/0.log"
Apr 24 22:16:05.707883 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:05.707861 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/storage-initializer/0.log"
Apr 24 22:16:05.727933 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:05.727914 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/main/0.log"
Apr 24 22:16:05.739270 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:05.739237 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/storage-initializer/0.log"
Apr 24 22:16:06.774083 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:06.774055 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ngwj6_7d5accb4-791f-4ad1-abed-f31b4c4de217/istio-proxy/0.log"
Apr 24 22:16:06.787022 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:06.786992 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-z7zjj_ae72a1e5-1279-405e-90cc-0be35c5c3f2b/istio-proxy/0.log"
Apr 24 22:16:06.810470 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:06.810446 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/main/0.log"
Apr 24 22:16:06.819933 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:06.819915 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/llm-d-routing-sidecar/0.log"
Apr 24 22:16:06.833902 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:06.833878 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/storage-initializer/0.log"
Apr 24 22:16:06.856885 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:06.856863 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/main/0.log"
Apr 24 22:16:06.867613 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:06.867593 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/storage-initializer/0.log"
Apr 24 22:16:07.888154 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:07.888120 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ngwj6_7d5accb4-791f-4ad1-abed-f31b4c4de217/istio-proxy/0.log"
Apr 24 22:16:07.900793 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:07.900767 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-z7zjj_ae72a1e5-1279-405e-90cc-0be35c5c3f2b/istio-proxy/0.log"
Apr 24 22:16:07.923986 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:07.923960 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/main/0.log"
Apr 24 22:16:07.933216 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:07.933198 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/llm-d-routing-sidecar/0.log"
Apr 24 22:16:07.946552 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:07.946532 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/storage-initializer/0.log"
Apr 24 22:16:07.968776 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:07.968755 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/main/0.log"
Apr 24 22:16:07.983396 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:07.983374 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/storage-initializer/0.log"
Apr 24 22:16:09.071083 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:09.071047 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ngwj6_7d5accb4-791f-4ad1-abed-f31b4c4de217/istio-proxy/0.log"
Apr 24 22:16:09.092785 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:09.092763 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-z7zjj_ae72a1e5-1279-405e-90cc-0be35c5c3f2b/istio-proxy/0.log"
Apr 24 22:16:09.116135 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:09.116107 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/main/0.log"
Apr 24 22:16:09.125903 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:09.125883 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/llm-d-routing-sidecar/0.log"
Apr 24 22:16:09.138878 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:09.138863 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/storage-initializer/0.log"
Apr 24 22:16:09.157200 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:09.157180 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/main/0.log"
Apr 24 22:16:09.166903 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:09.166883 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/storage-initializer/0.log"
Apr 24 22:16:10.188442 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:10.188417 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ngwj6_7d5accb4-791f-4ad1-abed-f31b4c4de217/istio-proxy/0.log"
Apr 24 22:16:10.201154 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:10.201130 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-z7zjj_ae72a1e5-1279-405e-90cc-0be35c5c3f2b/istio-proxy/0.log"
Apr 24 22:16:10.225807 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:10.225781 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/main/0.log"
Apr 24 22:16:10.235246 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:10.235225 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/llm-d-routing-sidecar/0.log"
Apr 24 22:16:10.248697 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:10.248676 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/storage-initializer/0.log"
Apr 24 22:16:10.268449 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:10.268428 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/main/0.log"
Apr 24 22:16:10.279088 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:10.279070 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/storage-initializer/0.log"
Apr 24 22:16:11.351950 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:11.351918 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ngwj6_7d5accb4-791f-4ad1-abed-f31b4c4de217/istio-proxy/0.log"
Apr 24 22:16:11.390112 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:11.390087 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-z7zjj_ae72a1e5-1279-405e-90cc-0be35c5c3f2b/istio-proxy/0.log"
Apr 24 22:16:11.413912 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:11.413890 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/main/0.log"
Apr 24 22:16:11.423401 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:11.423381 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/llm-d-routing-sidecar/0.log"
Apr 24 22:16:11.435966 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:11.435948 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/storage-initializer/0.log"
Apr 24 22:16:11.457342 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:11.457325 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/main/0.log"
Apr 24 22:16:11.468156 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:11.468137 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/storage-initializer/0.log"
Apr 24 22:16:12.633404 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:12.633369 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ngwj6_7d5accb4-791f-4ad1-abed-f31b4c4de217/istio-proxy/0.log"
Apr 24 22:16:12.646074 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:12.646042 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-z7zjj_ae72a1e5-1279-405e-90cc-0be35c5c3f2b/istio-proxy/0.log"
Apr 24 22:16:12.680717 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:12.680694 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/main/0.log"
Apr 24 22:16:12.690203 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:12.690184 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/llm-d-routing-sidecar/0.log"
Apr 24 22:16:12.703060 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:12.703036 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/storage-initializer/0.log"
Apr 24 22:16:12.725241 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:12.725222 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/main/0.log"
Apr 24 22:16:12.736453 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:12.736431 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/storage-initializer/0.log"
Apr 24 22:16:13.771478 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:13.771450 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ngwj6_7d5accb4-791f-4ad1-abed-f31b4c4de217/istio-proxy/0.log"
Apr 24 22:16:13.785369 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:13.785346 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-z7zjj_ae72a1e5-1279-405e-90cc-0be35c5c3f2b/istio-proxy/0.log"
Apr 24 22:16:13.808140 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:13.808100 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/main/0.log"
Apr 24 22:16:13.818053 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:13.818035 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/llm-d-routing-sidecar/0.log"
Apr 24 22:16:13.831303 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:13.831284 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/storage-initializer/0.log"
Apr 24 22:16:13.851669 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:13.851647 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/main/0.log"
Apr 24 22:16:13.863033 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:13.863015 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/storage-initializer/0.log"
Apr 24 22:16:14.905144 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:14.903073 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ngwj6_7d5accb4-791f-4ad1-abed-f31b4c4de217/istio-proxy/0.log"
Apr 24 22:16:14.916282 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:14.916257 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-z7zjj_ae72a1e5-1279-405e-90cc-0be35c5c3f2b/istio-proxy/0.log"
Apr 24 22:16:14.941900 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:14.941877 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/main/0.log"
Apr 24 22:16:14.954690 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:14.954661 2567 log.go:25] "Finished parsing log file"
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/llm-d-routing-sidecar/0.log" Apr 24 22:16:14.967938 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:14.967916 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/storage-initializer/0.log" Apr 24 22:16:14.997442 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:14.997418 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/main/0.log" Apr 24 22:16:15.013506 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:15.013483 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/storage-initializer/0.log" Apr 24 22:16:16.063460 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:16.063430 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ngwj6_7d5accb4-791f-4ad1-abed-f31b4c4de217/istio-proxy/0.log" Apr 24 22:16:16.076469 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:16.076446 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-z7zjj_ae72a1e5-1279-405e-90cc-0be35c5c3f2b/istio-proxy/0.log" Apr 24 22:16:16.101504 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:16.101483 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/main/0.log" Apr 24 22:16:16.110629 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:16.110605 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/llm-d-routing-sidecar/0.log" Apr 24 22:16:16.123273 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:16.123255 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/storage-initializer/0.log" Apr 24 22:16:16.142473 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:16.142450 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/main/0.log" Apr 24 22:16:16.152973 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:16.152949 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/storage-initializer/0.log" Apr 24 22:16:17.160073 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:17.160046 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ngwj6_7d5accb4-791f-4ad1-abed-f31b4c4de217/istio-proxy/0.log" Apr 24 22:16:17.173097 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:17.173075 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-z7zjj_ae72a1e5-1279-405e-90cc-0be35c5c3f2b/istio-proxy/0.log" Apr 24 22:16:17.197944 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:17.197922 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/main/0.log" Apr 24 22:16:17.213520 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:17.213500 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/llm-d-routing-sidecar/0.log" Apr 24 22:16:17.229516 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:17.229496 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/storage-initializer/0.log" Apr 24 22:16:17.250574 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:17.250543 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/main/0.log" Apr 24 22:16:17.262594 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:17.262566 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/storage-initializer/0.log" Apr 24 22:16:18.282922 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.282892 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-ngwj6_7d5accb4-791f-4ad1-abed-f31b4c4de217/istio-proxy/0.log" Apr 24 22:16:18.302675 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.302649 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-z7zjj_ae72a1e5-1279-405e-90cc-0be35c5c3f2b/istio-proxy/0.log" Apr 24 22:16:18.333019 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.332998 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/main/0.log" Apr 24 22:16:18.346642 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.346623 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/llm-d-routing-sidecar/0.log" Apr 24 22:16:18.367420 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.367399 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/storage-initializer/0.log" Apr 24 22:16:18.393779 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.393755 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/main/0.log" Apr 24 22:16:18.404252 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.404231 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk_6c6a183a-a7cb-4e66-9315-90710ff8a582/storage-initializer/0.log" Apr 24 22:16:18.525628 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.525567 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" podUID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerName="llm-d-routing-sidecar" containerID="cri-o://b3f1e5b5c97265295cdad4a8e49d6d66f5c3db879be6f428d7f77668d1562384" gracePeriod=2 Apr 24 22:16:18.777940 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.777916 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" Apr 24 22:16:18.802414 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.802370 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/main/0.log" Apr 24 22:16:18.803237 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.803205 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" Apr 24 22:16:18.912031 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.911998 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-home\") pod \"6c6a183a-a7cb-4e66-9315-90710ff8a582\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " Apr 24 22:16:18.912216 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.912048 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6bh2\" (UniqueName: \"kubernetes.io/projected/6c6a183a-a7cb-4e66-9315-90710ff8a582-kube-api-access-j6bh2\") pod \"6c6a183a-a7cb-4e66-9315-90710ff8a582\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " Apr 24 22:16:18.912216 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.912089 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-dshm\") pod \"6c6a183a-a7cb-4e66-9315-90710ff8a582\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " Apr 24 22:16:18.912216 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.912105 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-model-cache\") pod 
\"6c6a183a-a7cb-4e66-9315-90710ff8a582\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " Apr 24 22:16:18.912216 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.912120 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7ldk\" (UniqueName: \"kubernetes.io/projected/b3f0a542-ad41-48a7-ab9b-45a2926944ca-kube-api-access-m7ldk\") pod \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " Apr 24 22:16:18.912216 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.912145 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-kserve-provision-location\") pod \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " Apr 24 22:16:18.912216 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.912168 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-kserve-provision-location\") pod \"6c6a183a-a7cb-4e66-9315-90710ff8a582\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " Apr 24 22:16:18.912216 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.912188 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-model-cache\") pod \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " Apr 24 22:16:18.912616 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.912240 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b3f0a542-ad41-48a7-ab9b-45a2926944ca-tls-certs\") pod \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " Apr 
24 22:16:18.912616 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.912265 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6a183a-a7cb-4e66-9315-90710ff8a582-tls-certs\") pod \"6c6a183a-a7cb-4e66-9315-90710ff8a582\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " Apr 24 22:16:18.912616 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.912286 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-tmp-dir\") pod \"6c6a183a-a7cb-4e66-9315-90710ff8a582\" (UID: \"6c6a183a-a7cb-4e66-9315-90710ff8a582\") " Apr 24 22:16:18.912616 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.912316 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-dshm\") pod \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " Apr 24 22:16:18.912616 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.912352 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-home\") pod \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " Apr 24 22:16:18.912616 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.912407 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-tmp-dir\") pod \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\" (UID: \"b3f0a542-ad41-48a7-ab9b-45a2926944ca\") " Apr 24 22:16:18.912616 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.912418 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-model-cache" (OuterVolumeSpecName: "model-cache") pod "6c6a183a-a7cb-4e66-9315-90710ff8a582" (UID: "6c6a183a-a7cb-4e66-9315-90710ff8a582"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:16:18.912970 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.912761 2567 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-model-cache\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 22:16:18.912970 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.912790 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-home" (OuterVolumeSpecName: "home") pod "6c6a183a-a7cb-4e66-9315-90710ff8a582" (UID: "6c6a183a-a7cb-4e66-9315-90710ff8a582"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:16:18.915092 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.914895 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c6a183a-a7cb-4e66-9315-90710ff8a582-kube-api-access-j6bh2" (OuterVolumeSpecName: "kube-api-access-j6bh2") pod "6c6a183a-a7cb-4e66-9315-90710ff8a582" (UID: "6c6a183a-a7cb-4e66-9315-90710ff8a582"). InnerVolumeSpecName "kube-api-access-j6bh2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:16:18.915092 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.914932 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3f0a542-ad41-48a7-ab9b-45a2926944ca-kube-api-access-m7ldk" (OuterVolumeSpecName: "kube-api-access-m7ldk") pod "b3f0a542-ad41-48a7-ab9b-45a2926944ca" (UID: "b3f0a542-ad41-48a7-ab9b-45a2926944ca"). InnerVolumeSpecName "kube-api-access-m7ldk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:16:18.915092 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.915015 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-dshm" (OuterVolumeSpecName: "dshm") pod "b3f0a542-ad41-48a7-ab9b-45a2926944ca" (UID: "b3f0a542-ad41-48a7-ab9b-45a2926944ca"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:16:18.915092 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.915015 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3f0a542-ad41-48a7-ab9b-45a2926944ca-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b3f0a542-ad41-48a7-ab9b-45a2926944ca" (UID: "b3f0a542-ad41-48a7-ab9b-45a2926944ca"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:16:18.915495 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.915422 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-model-cache" (OuterVolumeSpecName: "model-cache") pod "b3f0a542-ad41-48a7-ab9b-45a2926944ca" (UID: "b3f0a542-ad41-48a7-ab9b-45a2926944ca"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:16:18.915616 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.915517 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-home" (OuterVolumeSpecName: "home") pod "b3f0a542-ad41-48a7-ab9b-45a2926944ca" (UID: "b3f0a542-ad41-48a7-ab9b-45a2926944ca"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:16:18.916509 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.916487 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-dshm" (OuterVolumeSpecName: "dshm") pod "6c6a183a-a7cb-4e66-9315-90710ff8a582" (UID: "6c6a183a-a7cb-4e66-9315-90710ff8a582"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:16:18.917084 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.917060 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c6a183a-a7cb-4e66-9315-90710ff8a582-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6c6a183a-a7cb-4e66-9315-90710ff8a582" (UID: "6c6a183a-a7cb-4e66-9315-90710ff8a582"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:16:18.927927 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.927904 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "b3f0a542-ad41-48a7-ab9b-45a2926944ca" (UID: "b3f0a542-ad41-48a7-ab9b-45a2926944ca"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:16:18.930317 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.930290 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "6c6a183a-a7cb-4e66-9315-90710ff8a582" (UID: "6c6a183a-a7cb-4e66-9315-90710ff8a582"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:16:18.976057 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.976015 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6c6a183a-a7cb-4e66-9315-90710ff8a582" (UID: "6c6a183a-a7cb-4e66-9315-90710ff8a582"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:16:18.977712 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:18.977686 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b3f0a542-ad41-48a7-ab9b-45a2926944ca" (UID: "b3f0a542-ad41-48a7-ab9b-45a2926944ca"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:16:19.013304 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.013270 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-kserve-provision-location\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 22:16:19.013304 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.013301 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-kserve-provision-location\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 22:16:19.013495 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.013315 2567 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-model-cache\") on node 
\"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 22:16:19.013495 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.013328 2567 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b3f0a542-ad41-48a7-ab9b-45a2926944ca-tls-certs\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 22:16:19.013495 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.013339 2567 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6a183a-a7cb-4e66-9315-90710ff8a582-tls-certs\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 22:16:19.013495 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.013350 2567 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-tmp-dir\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 22:16:19.013495 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.013363 2567 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-dshm\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 22:16:19.013495 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.013375 2567 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-home\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 22:16:19.013495 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.013386 2567 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b3f0a542-ad41-48a7-ab9b-45a2926944ca-tmp-dir\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 22:16:19.013495 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.013397 2567 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-home\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 22:16:19.013495 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.013411 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j6bh2\" (UniqueName: \"kubernetes.io/projected/6c6a183a-a7cb-4e66-9315-90710ff8a582-kube-api-access-j6bh2\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 22:16:19.013495 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.013422 2567 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6c6a183a-a7cb-4e66-9315-90710ff8a582-dshm\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 22:16:19.013495 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.013436 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m7ldk\" (UniqueName: \"kubernetes.io/projected/b3f0a542-ad41-48a7-ab9b-45a2926944ca-kube-api-access-m7ldk\") on node \"ip-10-0-142-242.ec2.internal\" DevicePath \"\"" Apr 24 22:16:19.410909 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.410877 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-nzp8n_5669e63d-6e50-438f-8114-00a0c65b7492/discovery/0.log" Apr 24 22:16:19.423958 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.423934 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d_2dd33b44-6aab-42e9-bdcf-ee18107e7f04/istio-proxy/0.log" Apr 24 22:16:19.608574 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.608545 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5f6d44c486-h2n7s_b3f0a542-ad41-48a7-ab9b-45a2926944ca/main/0.log" Apr 24 22:16:19.609201 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.609166 2567 generic.go:358] "Generic (PLEG): 
container finished" podID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerID="e562960a7e01bbda454ad7c09f3704c0fde9686de7cc05d7a22bc6193c3b94cb" exitCode=137 Apr 24 22:16:19.609201 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.609197 2567 generic.go:358] "Generic (PLEG): container finished" podID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerID="b3f1e5b5c97265295cdad4a8e49d6d66f5c3db879be6f428d7f77668d1562384" exitCode=0 Apr 24 22:16:19.609374 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.609218 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" event={"ID":"b3f0a542-ad41-48a7-ab9b-45a2926944ca","Type":"ContainerDied","Data":"e562960a7e01bbda454ad7c09f3704c0fde9686de7cc05d7a22bc6193c3b94cb"} Apr 24 22:16:19.609374 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.609253 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" Apr 24 22:16:19.609374 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.609265 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" event={"ID":"b3f0a542-ad41-48a7-ab9b-45a2926944ca","Type":"ContainerDied","Data":"b3f1e5b5c97265295cdad4a8e49d6d66f5c3db879be6f428d7f77668d1562384"} Apr 24 22:16:19.609374 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.609283 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s" event={"ID":"b3f0a542-ad41-48a7-ab9b-45a2926944ca","Type":"ContainerDied","Data":"80d01f093c733f266bb6cfcbb3992007941e6cd6441ff30ca00138443a00894b"} Apr 24 22:16:19.609374 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.609303 2567 scope.go:117] "RemoveContainer" containerID="e562960a7e01bbda454ad7c09f3704c0fde9686de7cc05d7a22bc6193c3b94cb" Apr 24 22:16:19.610888 ip-10-0-142-242 
kubenswrapper[2567]: I0424 22:16:19.610863 2567 generic.go:358] "Generic (PLEG): container finished" podID="6c6a183a-a7cb-4e66-9315-90710ff8a582" containerID="8e65d14d22d9fe9c1cdc931d7ad4e8062d31300a24d932f541d7fa5cddc86aaf" exitCode=137 Apr 24 22:16:19.611016 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.610906 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" event={"ID":"6c6a183a-a7cb-4e66-9315-90710ff8a582","Type":"ContainerDied","Data":"8e65d14d22d9fe9c1cdc931d7ad4e8062d31300a24d932f541d7fa5cddc86aaf"} Apr 24 22:16:19.611016 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.610930 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" event={"ID":"6c6a183a-a7cb-4e66-9315-90710ff8a582","Type":"ContainerDied","Data":"ba9156d87e0ca590d6becdf8db85df62d3fdd026b728bf83b6c33b59a46184b4"} Apr 24 22:16:19.611016 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.610965 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk" Apr 24 22:16:19.619388 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.619376 2567 scope.go:117] "RemoveContainer" containerID="a2ef9c7676e7493dc6fc42f3786141a81618391513928c956d857d1299b749f8" Apr 24 22:16:19.634644 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.634621 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"] Apr 24 22:16:19.635695 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.635681 2567 scope.go:117] "RemoveContainer" containerID="b3f1e5b5c97265295cdad4a8e49d6d66f5c3db879be6f428d7f77668d1562384" Apr 24 22:16:19.641822 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.641801 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5f6d44c486-h2n7s"] Apr 24 22:16:19.645186 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.645164 2567 scope.go:117] "RemoveContainer" containerID="e562960a7e01bbda454ad7c09f3704c0fde9686de7cc05d7a22bc6193c3b94cb" Apr 24 22:16:19.645467 ip-10-0-142-242 kubenswrapper[2567]: E0424 22:16:19.645446 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e562960a7e01bbda454ad7c09f3704c0fde9686de7cc05d7a22bc6193c3b94cb\": container with ID starting with e562960a7e01bbda454ad7c09f3704c0fde9686de7cc05d7a22bc6193c3b94cb not found: ID does not exist" containerID="e562960a7e01bbda454ad7c09f3704c0fde9686de7cc05d7a22bc6193c3b94cb" Apr 24 22:16:19.645598 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.645476 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e562960a7e01bbda454ad7c09f3704c0fde9686de7cc05d7a22bc6193c3b94cb"} err="failed to get container status \"e562960a7e01bbda454ad7c09f3704c0fde9686de7cc05d7a22bc6193c3b94cb\": rpc error: code = NotFound desc = 
could not find container \"e562960a7e01bbda454ad7c09f3704c0fde9686de7cc05d7a22bc6193c3b94cb\": container with ID starting with e562960a7e01bbda454ad7c09f3704c0fde9686de7cc05d7a22bc6193c3b94cb not found: ID does not exist" Apr 24 22:16:19.645598 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.645492 2567 scope.go:117] "RemoveContainer" containerID="a2ef9c7676e7493dc6fc42f3786141a81618391513928c956d857d1299b749f8" Apr 24 22:16:19.645787 ip-10-0-142-242 kubenswrapper[2567]: E0424 22:16:19.645769 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2ef9c7676e7493dc6fc42f3786141a81618391513928c956d857d1299b749f8\": container with ID starting with a2ef9c7676e7493dc6fc42f3786141a81618391513928c956d857d1299b749f8 not found: ID does not exist" containerID="a2ef9c7676e7493dc6fc42f3786141a81618391513928c956d857d1299b749f8" Apr 24 22:16:19.645829 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.645793 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2ef9c7676e7493dc6fc42f3786141a81618391513928c956d857d1299b749f8"} err="failed to get container status \"a2ef9c7676e7493dc6fc42f3786141a81618391513928c956d857d1299b749f8\": rpc error: code = NotFound desc = could not find container \"a2ef9c7676e7493dc6fc42f3786141a81618391513928c956d857d1299b749f8\": container with ID starting with a2ef9c7676e7493dc6fc42f3786141a81618391513928c956d857d1299b749f8 not found: ID does not exist" Apr 24 22:16:19.645829 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.645811 2567 scope.go:117] "RemoveContainer" containerID="b3f1e5b5c97265295cdad4a8e49d6d66f5c3db879be6f428d7f77668d1562384" Apr 24 22:16:19.646054 ip-10-0-142-242 kubenswrapper[2567]: E0424 22:16:19.646040 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3f1e5b5c97265295cdad4a8e49d6d66f5c3db879be6f428d7f77668d1562384\": 
container with ID starting with b3f1e5b5c97265295cdad4a8e49d6d66f5c3db879be6f428d7f77668d1562384 not found: ID does not exist" containerID="b3f1e5b5c97265295cdad4a8e49d6d66f5c3db879be6f428d7f77668d1562384" Apr 24 22:16:19.646100 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.646058 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3f1e5b5c97265295cdad4a8e49d6d66f5c3db879be6f428d7f77668d1562384"} err="failed to get container status \"b3f1e5b5c97265295cdad4a8e49d6d66f5c3db879be6f428d7f77668d1562384\": rpc error: code = NotFound desc = could not find container \"b3f1e5b5c97265295cdad4a8e49d6d66f5c3db879be6f428d7f77668d1562384\": container with ID starting with b3f1e5b5c97265295cdad4a8e49d6d66f5c3db879be6f428d7f77668d1562384 not found: ID does not exist" Apr 24 22:16:19.646100 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.646074 2567 scope.go:117] "RemoveContainer" containerID="e562960a7e01bbda454ad7c09f3704c0fde9686de7cc05d7a22bc6193c3b94cb" Apr 24 22:16:19.646299 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.646275 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e562960a7e01bbda454ad7c09f3704c0fde9686de7cc05d7a22bc6193c3b94cb"} err="failed to get container status \"e562960a7e01bbda454ad7c09f3704c0fde9686de7cc05d7a22bc6193c3b94cb\": rpc error: code = NotFound desc = could not find container \"e562960a7e01bbda454ad7c09f3704c0fde9686de7cc05d7a22bc6193c3b94cb\": container with ID starting with e562960a7e01bbda454ad7c09f3704c0fde9686de7cc05d7a22bc6193c3b94cb not found: ID does not exist" Apr 24 22:16:19.646372 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.646301 2567 scope.go:117] "RemoveContainer" containerID="a2ef9c7676e7493dc6fc42f3786141a81618391513928c956d857d1299b749f8" Apr 24 22:16:19.646559 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.646538 2567 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a2ef9c7676e7493dc6fc42f3786141a81618391513928c956d857d1299b749f8"} err="failed to get container status \"a2ef9c7676e7493dc6fc42f3786141a81618391513928c956d857d1299b749f8\": rpc error: code = NotFound desc = could not find container \"a2ef9c7676e7493dc6fc42f3786141a81618391513928c956d857d1299b749f8\": container with ID starting with a2ef9c7676e7493dc6fc42f3786141a81618391513928c956d857d1299b749f8 not found: ID does not exist" Apr 24 22:16:19.646559 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.646557 2567 scope.go:117] "RemoveContainer" containerID="b3f1e5b5c97265295cdad4a8e49d6d66f5c3db879be6f428d7f77668d1562384" Apr 24 22:16:19.646774 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.646757 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3f1e5b5c97265295cdad4a8e49d6d66f5c3db879be6f428d7f77668d1562384"} err="failed to get container status \"b3f1e5b5c97265295cdad4a8e49d6d66f5c3db879be6f428d7f77668d1562384\": rpc error: code = NotFound desc = could not find container \"b3f1e5b5c97265295cdad4a8e49d6d66f5c3db879be6f428d7f77668d1562384\": container with ID starting with b3f1e5b5c97265295cdad4a8e49d6d66f5c3db879be6f428d7f77668d1562384 not found: ID does not exist" Apr 24 22:16:19.646816 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.646774 2567 scope.go:117] "RemoveContainer" containerID="8e65d14d22d9fe9c1cdc931d7ad4e8062d31300a24d932f541d7fa5cddc86aaf" Apr 24 22:16:19.655472 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.655448 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk"] Apr 24 22:16:19.655744 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.655717 2567 scope.go:117] "RemoveContainer" containerID="547248d11e3ded9f1c0bb2c2633deb1bc9413214638ca9544aada7713a446283" Apr 24 22:16:19.659854 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.659831 2567 kubelet.go:2547] 
"SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5dfd79dc4f-r9ljk"] Apr 24 22:16:19.717670 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.717643 2567 scope.go:117] "RemoveContainer" containerID="8e65d14d22d9fe9c1cdc931d7ad4e8062d31300a24d932f541d7fa5cddc86aaf" Apr 24 22:16:19.717987 ip-10-0-142-242 kubenswrapper[2567]: E0424 22:16:19.717969 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e65d14d22d9fe9c1cdc931d7ad4e8062d31300a24d932f541d7fa5cddc86aaf\": container with ID starting with 8e65d14d22d9fe9c1cdc931d7ad4e8062d31300a24d932f541d7fa5cddc86aaf not found: ID does not exist" containerID="8e65d14d22d9fe9c1cdc931d7ad4e8062d31300a24d932f541d7fa5cddc86aaf" Apr 24 22:16:19.718069 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.717995 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e65d14d22d9fe9c1cdc931d7ad4e8062d31300a24d932f541d7fa5cddc86aaf"} err="failed to get container status \"8e65d14d22d9fe9c1cdc931d7ad4e8062d31300a24d932f541d7fa5cddc86aaf\": rpc error: code = NotFound desc = could not find container \"8e65d14d22d9fe9c1cdc931d7ad4e8062d31300a24d932f541d7fa5cddc86aaf\": container with ID starting with 8e65d14d22d9fe9c1cdc931d7ad4e8062d31300a24d932f541d7fa5cddc86aaf not found: ID does not exist" Apr 24 22:16:19.718069 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.718021 2567 scope.go:117] "RemoveContainer" containerID="547248d11e3ded9f1c0bb2c2633deb1bc9413214638ca9544aada7713a446283" Apr 24 22:16:19.718261 ip-10-0-142-242 kubenswrapper[2567]: E0424 22:16:19.718246 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"547248d11e3ded9f1c0bb2c2633deb1bc9413214638ca9544aada7713a446283\": container with ID starting with 547248d11e3ded9f1c0bb2c2633deb1bc9413214638ca9544aada7713a446283 not found: ID 
does not exist" containerID="547248d11e3ded9f1c0bb2c2633deb1bc9413214638ca9544aada7713a446283" Apr 24 22:16:19.718301 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:19.718263 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547248d11e3ded9f1c0bb2c2633deb1bc9413214638ca9544aada7713a446283"} err="failed to get container status \"547248d11e3ded9f1c0bb2c2633deb1bc9413214638ca9544aada7713a446283\": rpc error: code = NotFound desc = could not find container \"547248d11e3ded9f1c0bb2c2633deb1bc9413214638ca9544aada7713a446283\": container with ID starting with 547248d11e3ded9f1c0bb2c2633deb1bc9413214638ca9544aada7713a446283 not found: ID does not exist" Apr 24 22:16:20.296494 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:20.296462 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-nzp8n_5669e63d-6e50-438f-8114-00a0c65b7492/discovery/0.log" Apr 24 22:16:20.313715 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:20.313689 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d_2dd33b44-6aab-42e9-bdcf-ee18107e7f04/istio-proxy/0.log" Apr 24 22:16:20.418615 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:20.418565 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c6a183a-a7cb-4e66-9315-90710ff8a582" path="/var/lib/kubelet/pods/6c6a183a-a7cb-4e66-9315-90710ff8a582/volumes" Apr 24 22:16:20.419039 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:20.419025 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" path="/var/lib/kubelet/pods/b3f0a542-ad41-48a7-ab9b-45a2926944ca/volumes" Apr 24 22:16:21.192654 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:21.192625 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-6kpsg_b3a28f01-9bb8-4227-9be7-74768d03fc42/manager/0.log" Apr 24 22:16:21.249076 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:21.249050 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-z9p66_8cd01911-c5b5-43d0-884c-5a9a8f2b0c94/manager/0.log" Apr 24 22:16:21.275853 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:21.275826 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-grrpw_194e2ea0-37e3-4ca8-81ad-841ff703c06f/manager/0.log" Apr 24 22:16:26.734064 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:26.734030 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-vwv9h_d846199e-de26-4ab9-80f8-977e44e27d81/global-pull-secret-syncer/0.log" Apr 24 22:16:26.772570 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:26.772545 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-dzcxv_54da82a7-17e0-4f28-b08b-90fd402af6ec/konnectivity-agent/0.log" Apr 24 22:16:26.921489 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:26.921456 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-242.ec2.internal_2b8950f4a211b4f3789a7a4ccb32fafe/haproxy/0.log" Apr 24 22:16:30.955392 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:30.955364 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-6kpsg_b3a28f01-9bb8-4227-9be7-74768d03fc42/manager/0.log" Apr 24 22:16:31.034813 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:31.034785 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-z9p66_8cd01911-c5b5-43d0-884c-5a9a8f2b0c94/manager/0.log" Apr 24 22:16:31.100868 
ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:31.100842 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-grrpw_194e2ea0-37e3-4ca8-81ad-841ff703c06f/manager/0.log" Apr 24 22:16:32.671524 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:32.671491 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vtd9b_079b286b-1432-4fa6-94b3-3b5066076fdf/node-exporter/0.log" Apr 24 22:16:32.698064 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:32.698039 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vtd9b_079b286b-1432-4fa6-94b3-3b5066076fdf/kube-rbac-proxy/0.log" Apr 24 22:16:32.723052 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:32.723028 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vtd9b_079b286b-1432-4fa6-94b3-3b5066076fdf/init-textfile/0.log" Apr 24 22:16:33.058646 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:33.058560 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-569bb_c762af5f-7dc9-4a77-ad83-ac3d3233b5d5/prometheus-operator/0.log" Apr 24 22:16:33.080627 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:33.080607 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-569bb_c762af5f-7dc9-4a77-ad83-ac3d3233b5d5/kube-rbac-proxy/0.log" Apr 24 22:16:33.120174 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:33.120149 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-djlzh_c90817f2-8aeb-4327-b2a3-1d7f4c796dbb/prometheus-operator-admission-webhook/0.log" Apr 24 22:16:34.518401 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:34.518372 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-m7fp2_86168a3b-d0b9-42cf-8c50-cfdab95abfc9/networking-console-plugin/0.log" Apr 24 22:16:35.286376 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.286344 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk"] Apr 24 22:16:35.286763 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.286751 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c6a183a-a7cb-4e66-9315-90710ff8a582" containerName="main" Apr 24 22:16:35.286823 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.286765 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c6a183a-a7cb-4e66-9315-90710ff8a582" containerName="main" Apr 24 22:16:35.286823 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.286775 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerName="llm-d-routing-sidecar" Apr 24 22:16:35.286823 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.286781 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerName="llm-d-routing-sidecar" Apr 24 22:16:35.286823 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.286799 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6cf01be2-27a7-4962-9b5a-d193c0b8d8f9" containerName="storage-initializer" Apr 24 22:16:35.286823 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.286804 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf01be2-27a7-4962-9b5a-d193c0b8d8f9" containerName="storage-initializer" Apr 24 22:16:35.286823 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.286810 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c6a183a-a7cb-4e66-9315-90710ff8a582" containerName="storage-initializer" Apr 24 22:16:35.286823 ip-10-0-142-242 
kubenswrapper[2567]: I0424 22:16:35.286816 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c6a183a-a7cb-4e66-9315-90710ff8a582" containerName="storage-initializer" Apr 24 22:16:35.286823 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.286825 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerName="storage-initializer" Apr 24 22:16:35.287073 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.286831 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerName="storage-initializer" Apr 24 22:16:35.287073 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.286838 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6cf01be2-27a7-4962-9b5a-d193c0b8d8f9" containerName="main" Apr 24 22:16:35.287073 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.286843 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf01be2-27a7-4962-9b5a-d193c0b8d8f9" containerName="main" Apr 24 22:16:35.287073 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.286848 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerName="main" Apr 24 22:16:35.287073 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.286853 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerName="main" Apr 24 22:16:35.287073 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.286915 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c6a183a-a7cb-4e66-9315-90710ff8a582" containerName="main" Apr 24 22:16:35.287073 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.286924 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6cf01be2-27a7-4962-9b5a-d193c0b8d8f9" containerName="main" Apr 24 22:16:35.287073 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.286930 2567 
memory_manager.go:356] "RemoveStaleState removing state" podUID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerName="llm-d-routing-sidecar" Apr 24 22:16:35.287073 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.286938 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="b3f0a542-ad41-48a7-ab9b-45a2926944ca" containerName="main" Apr 24 22:16:35.290077 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.290062 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" Apr 24 22:16:35.292297 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.292277 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wmq6v\"/\"default-dockercfg-4l88c\"" Apr 24 22:16:35.292380 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.292324 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wmq6v\"/\"kube-root-ca.crt\"" Apr 24 22:16:35.293233 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.293218 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wmq6v\"/\"openshift-service-ca.crt\"" Apr 24 22:16:35.297749 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.297167 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk"] Apr 24 22:16:35.456281 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.456251 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc581df9-c37c-4fb4-a471-16fc4b9699b1-lib-modules\") pod \"perf-node-gather-daemonset-nmqdk\" (UID: \"cc581df9-c37c-4fb4-a471-16fc4b9699b1\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" Apr 24 22:16:35.456281 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.456285 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cc581df9-c37c-4fb4-a471-16fc4b9699b1-proc\") pod \"perf-node-gather-daemonset-nmqdk\" (UID: \"cc581df9-c37c-4fb4-a471-16fc4b9699b1\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" Apr 24 22:16:35.456467 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.456312 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx2cs\" (UniqueName: \"kubernetes.io/projected/cc581df9-c37c-4fb4-a471-16fc4b9699b1-kube-api-access-sx2cs\") pod \"perf-node-gather-daemonset-nmqdk\" (UID: \"cc581df9-c37c-4fb4-a471-16fc4b9699b1\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" Apr 24 22:16:35.456467 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.456397 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cc581df9-c37c-4fb4-a471-16fc4b9699b1-podres\") pod \"perf-node-gather-daemonset-nmqdk\" (UID: \"cc581df9-c37c-4fb4-a471-16fc4b9699b1\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" Apr 24 22:16:35.456467 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.456429 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc581df9-c37c-4fb4-a471-16fc4b9699b1-sys\") pod \"perf-node-gather-daemonset-nmqdk\" (UID: \"cc581df9-c37c-4fb4-a471-16fc4b9699b1\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" Apr 24 22:16:35.557358 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.557280 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc581df9-c37c-4fb4-a471-16fc4b9699b1-lib-modules\") pod \"perf-node-gather-daemonset-nmqdk\" (UID: 
\"cc581df9-c37c-4fb4-a471-16fc4b9699b1\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" Apr 24 22:16:35.557358 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.557309 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cc581df9-c37c-4fb4-a471-16fc4b9699b1-proc\") pod \"perf-node-gather-daemonset-nmqdk\" (UID: \"cc581df9-c37c-4fb4-a471-16fc4b9699b1\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" Apr 24 22:16:35.557358 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.557338 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sx2cs\" (UniqueName: \"kubernetes.io/projected/cc581df9-c37c-4fb4-a471-16fc4b9699b1-kube-api-access-sx2cs\") pod \"perf-node-gather-daemonset-nmqdk\" (UID: \"cc581df9-c37c-4fb4-a471-16fc4b9699b1\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" Apr 24 22:16:35.557889 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.557371 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cc581df9-c37c-4fb4-a471-16fc4b9699b1-podres\") pod \"perf-node-gather-daemonset-nmqdk\" (UID: \"cc581df9-c37c-4fb4-a471-16fc4b9699b1\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" Apr 24 22:16:35.557889 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.557390 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc581df9-c37c-4fb4-a471-16fc4b9699b1-sys\") pod \"perf-node-gather-daemonset-nmqdk\" (UID: \"cc581df9-c37c-4fb4-a471-16fc4b9699b1\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" Apr 24 22:16:35.557889 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.557389 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/cc581df9-c37c-4fb4-a471-16fc4b9699b1-proc\") pod \"perf-node-gather-daemonset-nmqdk\" (UID: \"cc581df9-c37c-4fb4-a471-16fc4b9699b1\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" Apr 24 22:16:35.557889 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.557448 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc581df9-c37c-4fb4-a471-16fc4b9699b1-lib-modules\") pod \"perf-node-gather-daemonset-nmqdk\" (UID: \"cc581df9-c37c-4fb4-a471-16fc4b9699b1\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" Apr 24 22:16:35.557889 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.557487 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc581df9-c37c-4fb4-a471-16fc4b9699b1-sys\") pod \"perf-node-gather-daemonset-nmqdk\" (UID: \"cc581df9-c37c-4fb4-a471-16fc4b9699b1\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" Apr 24 22:16:35.557889 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.557502 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cc581df9-c37c-4fb4-a471-16fc4b9699b1-podres\") pod \"perf-node-gather-daemonset-nmqdk\" (UID: \"cc581df9-c37c-4fb4-a471-16fc4b9699b1\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" Apr 24 22:16:35.559514 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.559497 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57dc596fcd-ql9q5_74d79a05-452e-4127-b595-ba3ced488668/console/0.log" Apr 24 22:16:35.574806 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.574783 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx2cs\" (UniqueName: \"kubernetes.io/projected/cc581df9-c37c-4fb4-a471-16fc4b9699b1-kube-api-access-sx2cs\") pod 
\"perf-node-gather-daemonset-nmqdk\" (UID: \"cc581df9-c37c-4fb4-a471-16fc4b9699b1\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" Apr 24 22:16:35.601015 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.600988 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" Apr 24 22:16:35.619956 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.619933 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-c6p8l_00358327-c969-49bb-a961-6bbb496280a7/download-server/0.log" Apr 24 22:16:35.732811 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.732788 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk"] Apr 24 22:16:35.734177 ip-10-0-142-242 kubenswrapper[2567]: W0424 22:16:35.734148 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcc581df9_c37c_4fb4_a471_16fc4b9699b1.slice/crio-a6cda5f78cf072fd9afb1dcf692c486e4f3ae8ee05f2bc9b8596d8ae91202c2a WatchSource:0}: Error finding container a6cda5f78cf072fd9afb1dcf692c486e4f3ae8ee05f2bc9b8596d8ae91202c2a: Status 404 returned error can't find the container with id a6cda5f78cf072fd9afb1dcf692c486e4f3ae8ee05f2bc9b8596d8ae91202c2a Apr 24 22:16:35.735832 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:35.735816 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:16:36.678882 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:36.678843 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" event={"ID":"cc581df9-c37c-4fb4-a471-16fc4b9699b1","Type":"ContainerStarted","Data":"b5eb2fabf38b1f88a4cfe5756c47b435ba7c623f77dc4349d7ac4f9e4c2dee07"} Apr 24 22:16:36.678882 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:36.678884 2567 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" event={"ID":"cc581df9-c37c-4fb4-a471-16fc4b9699b1","Type":"ContainerStarted","Data":"a6cda5f78cf072fd9afb1dcf692c486e4f3ae8ee05f2bc9b8596d8ae91202c2a"} Apr 24 22:16:36.679332 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:36.678988 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" Apr 24 22:16:36.697164 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:36.697117 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk" podStartSLOduration=1.697104336 podStartE2EDuration="1.697104336s" podCreationTimestamp="2026-04-24 22:16:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:16:36.695130398 +0000 UTC m=+2934.793542914" watchObservedRunningTime="2026-04-24 22:16:36.697104336 +0000 UTC m=+2934.795516849" Apr 24 22:16:36.890906 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:36.890875 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jhpmf_30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e/dns/0.log" Apr 24 22:16:36.915137 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:36.915109 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jhpmf_30a0f1a2-0d31-4a4e-9ed8-1d20eae0f62e/kube-rbac-proxy/0.log" Apr 24 22:16:37.019867 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:37.019791 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-nzxhn_6c3b2011-26ac-4109-9898-ad37c3d322dd/dns-node-resolver/0.log" Apr 24 22:16:37.497660 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:37.497631 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_image-registry-7584f55f55-p88k5_ad19df44-5b36-4c1c-a39f-a3779df1e10a/registry/0.log" Apr 24 22:16:37.567531 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:37.567503 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-whp9g_b888a29e-e580-4114-8441-9109c5db53fd/node-ca/0.log" Apr 24 22:16:38.411975 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:38.411949 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-nzp8n_5669e63d-6e50-438f-8114-00a0c65b7492/discovery/0.log" Apr 24 22:16:38.434461 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:38.434424 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-6b94bb86d8-q7m5d_2dd33b44-6aab-42e9-bdcf-ee18107e7f04/istio-proxy/0.log" Apr 24 22:16:39.003690 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:39.003651 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-xk55r_caa2f14d-7161-4066-9e78-ae036846d1b5/serve-healthcheck-canary/0.log" Apr 24 22:16:39.520667 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:39.520637 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sxbgk_dfef123c-0aff-4e29-9992-a01cd35408cb/kube-rbac-proxy/0.log" Apr 24 22:16:39.543965 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:39.543939 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sxbgk_dfef123c-0aff-4e29-9992-a01cd35408cb/exporter/0.log" Apr 24 22:16:39.570768 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:39.570746 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sxbgk_dfef123c-0aff-4e29-9992-a01cd35408cb/extractor/0.log" Apr 24 22:16:42.258985 ip-10-0-142-242 kubenswrapper[2567]: I0424 
22:16:42.258956 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-z7n4k_4eec04f0-c060-4069-9a58-3942ba1a27cc/openshift-lws-operator/0.log"
Apr 24 22:16:42.692676 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:42.692651 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-nmqdk"
Apr 24 22:16:42.840366 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:42.840340 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-6ff58d5594-fk9lc_2b925942-e3d0-42fc-9b55-537e7da940f4/manager/0.log"
Apr 24 22:16:42.860733 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:42.860705 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-cm4l4_d2b3a923-58c7-4ed9-8771-ad1bcc7de4fc/server/0.log"
Apr 24 22:16:43.121210 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:43.121175 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-72h9j_81162468-e2f1-4fcc-bd24-61e23bdfd1c6/manager/0.log"
Apr 24 22:16:43.170162 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:43.170120 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-6mhj8_7e10191e-1f10-4866-b32c-e0e834643954/seaweedfs/0.log"
Apr 24 22:16:48.079496 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:48.079464 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-ss8cr_274f17a0-ae47-42da-bc42-494b1f9aeed6/migrator/0.log"
Apr 24 22:16:48.102817 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:48.102787 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-ss8cr_274f17a0-ae47-42da-bc42-494b1f9aeed6/graceful-termination/0.log"
Apr 24 22:16:49.508703 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:49.508667 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2f989_7e34f57d-6789-43e3-8a4f-f5b55dd1ace1/kube-multus/0.log"
Apr 24 22:16:49.904809 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:49.904784 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x5xkp_99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:16:49.928129 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:49.928100 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x5xkp_99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae/egress-router-binary-copy/0.log"
Apr 24 22:16:49.952013 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:49.951989 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x5xkp_99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae/cni-plugins/0.log"
Apr 24 22:16:49.974277 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:49.974256 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x5xkp_99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae/bond-cni-plugin/0.log"
Apr 24 22:16:49.996205 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:49.996183 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x5xkp_99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae/routeoverride-cni/0.log"
Apr 24 22:16:50.019670 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:50.019647 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x5xkp_99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae/whereabouts-cni-bincopy/0.log"
Apr 24 22:16:50.042178 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:50.042149 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x5xkp_99a3f1ef-3d99-4a08-ab2a-bf05ceca8dae/whereabouts-cni/0.log"
Apr 24 22:16:50.173617 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:50.173525 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x6s8x_52f8223b-f29e-4bac-bf1e-475d1a24a90c/network-metrics-daemon/0.log"
Apr 24 22:16:50.199342 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:50.199314 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x6s8x_52f8223b-f29e-4bac-bf1e-475d1a24a90c/kube-rbac-proxy/0.log"
Apr 24 22:16:51.717712 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:51.717686 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxnmf_5dbf1909-aeaf-429a-9020-a9384e13a292/ovn-controller/0.log"
Apr 24 22:16:51.750073 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:51.750047 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxnmf_5dbf1909-aeaf-429a-9020-a9384e13a292/ovn-acl-logging/0.log"
Apr 24 22:16:51.770611 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:51.770573 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxnmf_5dbf1909-aeaf-429a-9020-a9384e13a292/kube-rbac-proxy-node/0.log"
Apr 24 22:16:51.795122 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:51.795096 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxnmf_5dbf1909-aeaf-429a-9020-a9384e13a292/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:16:51.818207 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:51.818163 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxnmf_5dbf1909-aeaf-429a-9020-a9384e13a292/northd/0.log"
Apr 24 22:16:51.840847 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:51.840828 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxnmf_5dbf1909-aeaf-429a-9020-a9384e13a292/nbdb/0.log"
Apr 24 22:16:51.863361 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:51.863338 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxnmf_5dbf1909-aeaf-429a-9020-a9384e13a292/sbdb/0.log"
Apr 24 22:16:51.964176 ip-10-0-142-242 kubenswrapper[2567]: I0424 22:16:51.964146 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mxnmf_5dbf1909-aeaf-429a-9020-a9384e13a292/ovnkube-controller/0.log"